Tutorial on extracting data from Twitter, performing sentiment analysis, feeding the results to Postgres, and packaging it all into Docker

Description

This tutorial is intended for self-learning. It makes use of Twitter client authentication and consumer tokens (compatible with API v2 and v1.1 respectively).
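For reference, here is a minimal sketch of what those two authentication flows look like, assuming the scripts use the tweepy library; the placeholder token values are assumptions, not taken from this repository:

```python
import tweepy

# API v2: client authentication with a bearer token (placeholder value)
client_v2 = tweepy.Client(bearer_token="YOUR_BEARER_TOKEN")

# API v1.1: consumer key/secret plus access token (placeholder values)
auth = tweepy.OAuth1UserHandler(
    "YOUR_CONSUMER_KEY",
    "YOUR_CONSUMER_SECRET",
    "YOUR_ACCESS_TOKEN",
    "YOUR_ACCESS_TOKEN_SECRET",
)
api_v1 = tweepy.API(auth)
```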

Pre-requisites

  1. Docker Desktop installed (Windows).
  2. VS Code or any other Python IDE.
  3. A Docker Hub account.
  4. A Twitter Developer account with Elevated Access and the necessary tokens generated.

Modules

  1. collect_tweet_stream.py extracts the tweets.
  2. The app/db and app/writers folders contain Python scripts that interact with the Postgres tables.
  3. Adminer is used to view the data as it is ingested.
  4. VADER analyses the sentiment of each tweet (see the sketch after this list).
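As an illustration of points 2 and 4, here is a minimal sketch of scoring a tweet with VADER and writing the result to Postgres. The table and column names (tweets, text, compound) and the connection settings are placeholders for illustration, not the ones used by the repository's scripts:

```python
import psycopg2
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

def score_and_store(tweet_text: str) -> None:
    # VADER returns neg/neu/pos scores plus a normalized compound score in [-1, 1]
    scores = analyzer.polarity_scores(tweet_text)

    # Connection settings are placeholders; the real ones come from docker-compose / app/db
    conn = psycopg2.connect(
        host="db", dbname="postgres", user="postgres", password="postgres"
    )
    with conn, conn.cursor() as cur:
        # Hypothetical table layout, for illustration only
        cur.execute(
            "INSERT INTO tweets (text, compound) VALUES (%s, %s)",
            (tweet_text, scores["compound"]),
        )
    conn.close()

score_and_store("Loving this tutorial!")
```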

Configuration

  1. Edit credentials.json to add your Twitter credentials.
  2. Add the keyword to be searched for in tweets in collect_tweets_stream.py at line 25 (a sketch of both steps follows below).
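A rough sketch of what those two steps amount to, assuming the stream is built on tweepy's StreamingClient; the key name in credentials.json and the example keyword are illustrative, not the repository's actual values:

```python
import json

import tweepy

# Step 1: read the Twitter credentials added to credentials.json (assumed key name)
with open("credentials.json") as f:
    creds = json.load(f)

stream = tweepy.StreamingClient(bearer_token=creds["bearer_token"])

# Step 2: the keyword configured in collect_tweets_stream.py becomes a filter rule
stream.add_rules(tweepy.StreamRule(value="bitcoin"))  # replace "bitcoin" with your keyword
stream.filter()  # blocks and delivers matching tweets to the client's on_tweet handler
```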

Execution

docker compose up
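Assuming the compose file follows the layout described above, this starts the tweet collector, the Postgres database, and Adminer together; add --build to rebuild the images after code changes.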

Monitor

Access Adminer at http://localhost:8080/?pgsql=db to watch the rows being added.
