
Commit 8fafc50

Merge pull request avinashkranjan#1794 from Mahitej28/quote-generator
[FEATURE] Added Random quote generator script
2 parents e79d444 + 92b27f5 commit 8fafc50

File tree

5 files changed: +76 -31 lines changed
Lines changed: 16 additions & 16 deletions
@@ -1,17 +1,17 @@
# LINK SCRAPPER

- It is used to scrape links from any website and display them.

## Setup instructions

Any PC with Python 3 installed can run this code; the script also needs the `requests` and `beautifulsoup4` packages.

## Output

image.png

## Author(s)

Tanya Mohanka
Lines changed: 15 additions & 15 deletions
@@ -1,15 +1,15 @@
import requests
from bs4 import BeautifulSoup

def scrape_links(url):
    # Fetch the page and parse its HTML
    response = requests.get(url)
    soup = BeautifulSoup(response.text, 'html.parser')
    # Print every absolute HTTP(S) link found in an <a> tag
    links = soup.find_all('a')
    for link in links:
        href = link.get('href')
        if href and href.startswith('http'):  # Filter out non-HTTP links
            print(href)

# Example usage:
url = 'https://www.linkedin.com/feed/'  # Replace with the URL of the website you want to scrape
scrape_links(url)
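
The committed script prints links for one hard-coded URL. As a rough editor-added sketch (not part of this commit), the same logic can be wrapped so the URL comes from the command line and the links are returned as a list; the filename `link_scraper.py` and the function name `collect_links` are illustrative assumptions.

```python
# Editor's sketch, not part of the commit: reuse the same scraping logic
# with a URL from the command line and collect links instead of printing
# them inside the function.
import sys

import requests
from bs4 import BeautifulSoup

def collect_links(url):
    """Return all absolute HTTP(S) links found in <a> tags on the page."""
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, 'html.parser')
    return [a.get('href') for a in soup.find_all('a')
            if a.get('href') and a.get('href').startswith('http')]

if __name__ == '__main__':
    # Usage: python link_scraper.py https://example.com
    for link in collect_links(sys.argv[1]):
        print(link)
```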

‎Random-Quote-generator/README.md

Lines changed: 29 additions & 0 deletions
@@ -0,0 +1,29 @@
# Random Quote Generator

- The `requests` library is used to make HTTP requests in Python.
- The `quote.py` script generates random quotes by fetching them from a public API endpoint.

## Setup instructions

- Install Python from [here](https://www.python.org/).
- Install the requests library with: `pip install requests`

## Detailed explanation of script, if needed

The function `generate_quote()` makes a GET request to the public API endpoint, extracts the JSON data, and displays a random quote to the user; if the request fails, the `else` branch returns an error message.
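
For orientation, `quote.py` (later in this diff) reads the quote text from the `'q'` key and the author from the `'a'` key of the first element of the JSON list returned by the endpoint. The sketch below is editor-added and uses a hard-coded sample payload in that shape instead of a live request; the sample values are illustrative only.

```python
# Illustrative sample shaped like the payload quote.py expects:
# a JSON list whose first element has 'q' (quote) and 'a' (author).
sample = [{"q": "Sample quote text", "a": "Sample Author"}]

quote = sample[0]['q']
author = sample[0]['a']
print(f'{author} - {quote}')  # e.g. "Sample Author - Sample quote text"
```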
## Output

![Output](image.png)

In text format, the output looks like this: <br>
Arnold Schwarzenegger - If you want to turn a vision into reality, you have to give 100% and never stop believing in your dream.

## Author(s)

[Mahima Churi](https://github.com/Mahitej28)

## Disclaimers, if any

Any public API can be used for generating quotes; datasets from Kaggle can also be imported and used.
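
As a hedged illustration of the dataset option mentioned above (not part of this repository), quotes could be read from a local CSV file, for example one downloaded from Kaggle, and one picked at random; the file name `quotes.csv` and the `author`/`quote` column names are assumptions.

```python
# Hypothetical offline alternative to the API call: draw a random quote
# from a local CSV dataset. File name and column names are assumed.
import csv
import random

def quote_from_csv(path='quotes.csv'):
    with open(path, newline='', encoding='utf-8') as f:
        rows = list(csv.DictReader(f))
    row = random.choice(rows)
    return f"{row['author']} - {row['quote']}"

print(quote_from_csv())
```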

‎Random-Quote-generator/image.png

25.6 KB

‎Random-Quote-generator/quote.py

Lines changed: 16 additions & 0 deletions
@@ -0,0 +1,16 @@
import requests

def generate_quote():
    # Request a random quote from the ZenQuotes public API
    response = requests.get("https://zenquotes.io/api/random")
    if response.status_code == 200:
        # The API returns a JSON list; 'q' is the quote text, 'a' the author
        data = response.json()
        quote = data[0]['q']
        author = data[0]['a']

        return f'{author} - {quote}'

    else:
        return "Failed to fetch a quote"

# Generate and print a random quote
print(generate_quote())
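
The committed script works when the request succeeds and the API responds with HTTP 200. A slightly more defensive variant, shown here only as an editor's sketch rather than part of the PR, might add a timeout and catch network errors:

```python
# Sketch of a more defensive wrapper around the same ZenQuotes endpoint;
# the timeout and exception handling are suggested additions, not part of the commit.
import requests

def generate_quote_safe():
    try:
        response = requests.get("https://zenquotes.io/api/random", timeout=5)
        response.raise_for_status()  # raise for 4xx/5xx responses
    except requests.RequestException as exc:
        return f"Failed to fetch a quote: {exc}"
    data = response.json()
    return f"{data[0]['a']} - {data[0]['q']}"

print(generate_quote_safe())
```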

0 commit comments
