How-To: Write a Kafka Producer using Twitter Stream (Twitter HBC Client)

Twitter open-sourced its Hosebird client (hbc), a robust Java HTTP library for consuming Twitter's Streaming API. In this post, I am going to present a demo of how we can use hbc to create a Kafka Twitter-stream producer, which tracks a few terms in Twitter statuses and produces a Kafka stream out of them. That stream can later be used for counting the terms, or for moving the data from Kafka to Storm (a Kafka-Storm pipeline) or to HDFS (as we will see in the next post on using the Camus API).

You can download and run the complete sample from https://github.com/saurzcode/twitter-stream/

Requirements:

  • Apache Kafka 0.8
  • Twitter Developer account (for API key, secret, etc.)
  • Apache ZooKeeper (required for Kafka)
  • Oracle JDK 1.7 (64-bit)

Build Environment:

  • Eclipse
  • Apache Maven 2/3

How to generate Twitter API keys using a Developer account?

  1. Go to https://apps.twitter.com/app/new and log in, if necessary.
  2. Enter your application name, description, and your website address. You can leave the callback URL empty.
  3. Accept the TOS.
  4. Submit the form by clicking the Create your Twitter Application button.
  5. Copy the consumer key (API key) and consumer secret from the screen into your application.
  6. After creating your Twitter application, you have to give it access to your Twitter account. To do this, click the Create my Access Token button.
  7. You will now have the Consumer Key, Consumer Secret, Access Token, and Access Token Secret to be used in the Streaming API calls.

Steps to run the sample:

  1. Start the ZooKeeper server bundled with Kafka using the following script from your Kafka installation folder –
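
Per the Kafka 0.8 quickstart [1], the bundled ZooKeeper can be started with:

    bin/zookeeper-server-start.sh config/zookeeper.properties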

and verify that it is running on the default port 2181 using –
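
One way to check (assuming netcat is available) is ZooKeeper's ruok four-letter command, which should answer imok:

    echo ruok | nc localhost 2181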

2. Start the Kafka server using the following script –
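
Again per the Kafka 0.8 quickstart [1]:

    bin/kafka-server-start.sh config/server.properties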

and verify that it is running on the default port 9092 –
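
For example, by checking for a listener on that port (netstat flags vary by platform):

    netstat -an | grep 9092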

3. Now that Kafka is running, ready to accept messages on any dynamically created topic (the default setting), we will create a Kafka producer which uses the hbc client API to get the Twitter stream for the tracked terms and puts the messages on a topic named "twitter-topic".

  • First, we need to add the Maven dependencies for the latest version of hbc-core, along with the other dependencies needed for Kafka –
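
A sketch of the relevant pom.xml entries – the version numbers here are illustrative, so pick the latest hbc-core from Maven Central and the Kafka artifact matching your installation:

    <!-- Twitter's Hosebird client, core module -->
    <dependency>
        <groupId>com.twitter</groupId>
        <artifactId>hbc-core</artifactId>
        <version>2.2.0</version>
    </dependency>

    <!-- Kafka 0.8 producer API (Scala 2.9.2 build) -->
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka_2.9.2</artifactId>
        <version>0.8.1.1</version>
    </dependency>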

  • Then, we need to set the properties that configure our Kafka producer to publish messages to the topic –
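
A minimal sketch of that configuration using the Kafka 0.8 producer API (the broker address localhost:9092 is an assumption matching the default setup above):

    import java.util.Properties;
    import kafka.javaapi.producer.Producer;
    import kafka.producer.ProducerConfig;

    // Point the producer at the local broker and serialize messages as strings
    Properties props = new Properties();
    props.put("metadata.broker.list", "localhost:9092");
    props.put("serializer.class", "kafka.serializer.StringEncoder");

    ProducerConfig config = new ProducerConfig(props);
    Producer<String, String> producer = new Producer<String, String>(config);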
  • Set up a StatusesFilterEndpoint, which configures the track terms to be matched against recent status messages; in this example, twitterapi and #AAPSweep (change these to the terms you want to track) –
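
A sketch of the endpoint setup (the term list is just the example from above):

    import java.util.Arrays;
    import com.twitter.hbc.core.endpoint.StatusesFilterEndpoint;

    // Filter the public status stream down to the tracked terms
    StatusesFilterEndpoint endpoint = new StatusesFilterEndpoint();
    endpoint.trackTerms(Arrays.asList("twitterapi", "#AAPSweep"));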
  • Provide the OAuth authentication parameters we generated earlier (this program reads them from command-line arguments) and create the client using the endpoint and the auth object –
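
A sketch of the auth and client setup, assuming the four OAuth values arrive as command-line arguments in the order consumer key, consumer secret, token, token secret:

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;
    import com.twitter.hbc.ClientBuilder;
    import com.twitter.hbc.core.Client;
    import com.twitter.hbc.core.Constants;
    import com.twitter.hbc.core.HttpHosts;
    import com.twitter.hbc.core.processor.StringDelimitedProcessor;
    import com.twitter.hbc.httpclient.auth.Authentication;
    import com.twitter.hbc.httpclient.auth.OAuth1;

    // OAuth credentials generated on apps.twitter.com, read from args
    Authentication auth = new OAuth1(args[0], args[1], args[2], args[3]);

    // hbc delivers raw JSON status messages into this queue (capacity is arbitrary)
    BlockingQueue<String> queue = new LinkedBlockingQueue<String>(10000);

    Client client = new ClientBuilder()
            .hosts(new HttpHosts(Constants.STREAM_HOST))
            .endpoint(endpoint)
            .authentication(auth)
            .processor(new StringDelimitedProcessor(queue))
            .build();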

  • As the last step, connect the client, fetch messages from the queue, and send them through the Kafka producer –
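
A sketch of the send loop, reusing the producer, queue, and topic name from the previous steps:

    import kafka.producer.KeyedMessage;

    client.connect();
    try {
        // Keep forwarding raw tweets to Kafka until the stream client shuts down
        while (!client.isDone()) {
            KeyedMessage<String, String> message =
                    new KeyedMessage<String, String>("twitter-topic", queue.take());
            producer.send(message);
        }
    } catch (InterruptedException e) {
        e.printStackTrace();
    } finally {
        producer.close();
        client.stop();
    }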

To run the complete example, run the TwitterKafkaProducer class as a Java application in your favourite IDE.

Verify the Topic and Messages

  • Check that the topic exists using –
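
On Kafka 0.8.1+ this is (0.8.0 ships the equivalent bin/kafka-list-topic.sh instead):

    bin/kafka-topics.sh --list --zookeeper localhost:2181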
  • Consume messages from the topic twitter-topic to verify the incoming message stream –
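
The bundled console consumer works for this:

    bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic twitter-topic --from-beginning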


According to chaos theory, you might face issues in some of the steps mentioned above, but if you have reached this far, you have done an amazing job!! 🙂

Happy Learning !!

References:

[1] https://kafka.apache.org/08/quickstart.html

[2] https://github.com/twitter/hbc

[3] https://themepacific.com/how-to-generate-api-key-consumer-token-access-key-for-twitter-oauth/994/


  • salgado777

    After I build the project, how do I start the app?

    • Patrick

      Same question, the author did not mention that. It's a nightmare for newbies.

      • Hi Patrick,

        I have updated the post to mention how to run it – to run the complete example, run the TwitterKafkaProducer class as a Java application in your favourite IDE.

  • salgado777

    I used the same configuration but I have this error:
    Exception in thread "main" java.lang.NoClassDefFoundError: com/twitter/hbc/core/endpoint/StreamingEndpoint

    • Hi, did you try downloading the project from here – https://github.com/saurzcode/twitter-stream/ – and running the same? Also, can you make sure that the HBC client Maven dependency is available when you run it?

      • salgado777

        Yes, I have downloaded it from GitHub and I ran the same, and the HBC client is available in pom.xml as you mentioned.
        I used another machine and it works well.

        • That’s great !!

        • Patrick

          I got the same error: cannot find StreamingEndpoint. How did you solve it? Just by changing machines? Thanks.

          • Hi Patrick, it's unfortunate you couldn't run it. Changing machines is not the solution. It looks like the hbc dependency didn't resolve through your Maven configuration. Do you see any error in your pom.xml file? Also, please let me know what commands you use to run the class. Do you run it from the IDE or from the command line? If you are running it from the command line, the most probable issue is that the hbc jars are not on the classpath.

  • lafouinez

    Hi,

    I tried your source code and I have some problems with it; I can't catch messages.

    The last step doesn't work for me, especially this:

    message = new KeyedMessage(topic, queue.take());

    This returns nothing…

    Maybe you have some idea about it?

    Thanks in advance
