Commit a8e50f7

Merge pull request avinashkranjan#2387 from Kalivarapubindusree/l9
Linkedin posts scrapper added
2 parents 962a3e7 + 93b4659 commit a8e50f7

File tree

2 files changed: +43 −0 lines changed

Lines changed: 21 additions & 0 deletions
@@ -0,0 +1,21 @@
import requests

def get_linkedin_posts(access_token):
    url = 'https://api.linkedin.com/v2/shares?q=owners&owners=urn:li:person:YOUR_USER_ID&count=10'
    headers = {'Authorization': f'Bearer {access_token}'}

    response = requests.get(url, headers=headers)
    data = response.json()

    if response.status_code == 200:
        posts = data.get('elements', [])
        for post in posts:
            post_text = post.get('text', '')
            print(post_text)
    else:
        print(f"Error: {response.status_code} - {data.get('message', 'Unknown error')}")

if __name__ == "__main__":
    # Replace YOUR_ACCESS_TOKEN and YOUR_USER_ID with appropriate values
    access_token = "YOUR_ACCESS_TOKEN"
    get_linkedin_posts(access_token)
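One caveat worth noting: depending on the API version, the Shares endpoint may return the post body nested inside a text object rather than as a plain string, in which case post.get('text', '') would print a dict. A defensive helper, assuming that nested shape (an assumption, not something this commit confirms), could look like:

def extract_share_text(post):
    # Handle both a plain string and a nested {"text": {"text": "..."}} shape
    # (the nested shape is an assumption about the Shares API response).
    text_field = post.get('text', '')
    if isinstance(text_field, dict):
        return text_field.get('text', '')
    return text_field or ''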

Linkedin_Posts_Scrapper/README.md

Lines changed: 22 additions & 0 deletions
@@ -0,0 +1,22 @@
# LinkedIn Posts Scrapper

An automated script to scrape LinkedIn posts, together with their number of reactions and comments, from the `` /detail/recent-activity/shares/ `` endpoint.

# Installation

* Make sure you have the following Python libraries installed:
> pip3 install selenium pandas

* Place `` chromedriver.exe `` in the same directory as the script. You can download it from [here](https://sites.google.com/a/chromium.org/chromedriver/downloads) <br>
(Note: Download the version that matches your Chrome browser. A minimal driver-setup sketch follows this list.)
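The README lists selenium and pandas but does not show how the driver is wired up. A minimal sketch, assuming Selenium 4 and a chromedriver.exe placed next to the script as described above (the real script may do this differently):

from selenium import webdriver
from selenium.webdriver.chrome.service import Service

# Point Selenium at the chromedriver.exe sitting next to the script
# (assumption: Selenium 4-style Service setup, not confirmed by this commit).
service = Service(executable_path="chromedriver.exe")
driver = webdriver.Chrome(service=service)
driver.get("https://www.linkedin.com/login")
# ... log in and scrape posts here ...
driver.quit()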
# Usage

> python scrapper.py -e \<email\> -p \<password\> -n \<number-of-posts-to-scrap\>
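For illustration, this is one way the flags documented above could be parsed; the actual argument handling inside scrapper.py is not part of this commit, so treat it as a sketch:

import argparse

# Hypothetical parser mirroring the documented flags (-e, -p, -n);
# not taken from the real scrapper.py.
parser = argparse.ArgumentParser(description="Scrape LinkedIn posts")
parser.add_argument("-e", "--email", required=True, help="LinkedIn login email")
parser.add_argument("-p", "--password", required=True, help="LinkedIn password")
parser.add_argument("-n", "--number", type=int, default=10,
                    help="Number of posts to scrape")
args = parser.parse_args()
print(f"Scraping {args.number} posts for {args.email}")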
# Output

The script outputs a `` Scrap.csv `` file containing the scraped posts, with columns in the following order: Heading, Reactions, Comments.

Note: If a post has 0 reactions or comments, the corresponding value is substituted with None.
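As an illustration of the output format described above, a CSV with those columns could be produced with pandas like this (the rows are placeholders, not real scraped data):

import pandas as pd

# Placeholder rows illustrating the documented Scrap.csv layout:
# Heading, Reactions, Comments, with None standing in for zero counts.
rows = [
    {"Heading": "Example post heading", "Reactions": 12, "Comments": 3},
    {"Heading": "Post with no engagement", "Reactions": None, "Comments": None},
]
pd.DataFrame(rows, columns=["Heading", "Reactions", "Comments"]).to_csv(
    "Scrap.csv", index=False
)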
