
Commit 41c401f

"First reads from db then request" method added
1 parent d50a216 · commit 41c401f

File tree

2 files changed: +36 −21 lines changed


.gitignore

Lines changed: 2 additions & 1 deletion

@@ -679,4 +679,5 @@ test.py
 Test/
 reddit_tokens.json
 scriptcopy.py
-.vscode
+.vscode
+db.json

Link-Preview/linkPreview.py

Lines changed: 34 additions & 20 deletions

@@ -67,6 +67,14 @@ def getImage(soup, url):
     return res
 
 
+# print dictionary
+def printData(data):
+    print("\nTitle : ", data["title"])
+    print("Description : ", data["description"])
+    print("URL : ", data["url"])
+    print("Image link : ", data["image"])
+
+
 # start
 print("\n======================")
 print("- Link Preview -")
@@ -91,25 +99,31 @@ def getImage(soup, url):
     f.write("{}")
     f.close()
 
+# read db
 with open('Link-Preview/db.json', 'r') as file:
     db = json.loads(file.read())
-db["mj"] = {
-    "name": "madhav"
-}
-print(db)
-
-# parse file
-with open('Link-Preview/db.json', 'w') as file:
-    json.dump(db, file)
-
-# if not in db get via request
-
-# getting the html
-# r = requests.get(url)
-# soup = BeautifulSoup(r.text, "html.parser")
-
-# print("\nTitle : ", getTitle(soup))
-# print("Description : ", getDesc(soup))
-# print("URL : ", url)
-# print("Image link : ", getImage(soup, url))
-# print("\n--END--\n")
+
+# check if it exists
+if (url in db):
+    print(db[url])
+else:
+    # if not in db get via request
+
+    # getting the html
+    r = requests.get(url)
+    soup = BeautifulSoup(r.text, "html.parser")
+
+    # printing data
+    newData = {
+        "title": getTitle(soup),
+        "description": getDesc(soup),
+        "url": url,
+        "image": getImage(soup, url)
+    }
+    printData(newData)
+    # parse file
+    db[url] = newData
+    with open('Link-Preview/db.json', 'w') as file:
+        json.dump(db, file)
+
+print("\n--END--\n")
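
For reference, the change amounts to a cache-first lookup: the script reads Link-Preview/db.json, returns the stored preview if the URL is already a key, and otherwise fetches the page, extracts the preview fields, prints them, and persists them back to the JSON file. Below is a minimal, self-contained sketch of that pattern under stated assumptions: DB_PATH, extract() and preview() are illustrative names for this sketch only, not helpers from linkPreview.py, which uses getTitle/getDesc/getImage and Link-Preview/db.json.

# Minimal sketch of the "read from db first, then request" pattern.
# DB_PATH, extract() and preview() are hypothetical names for illustration.
import json
import os

import requests
from bs4 import BeautifulSoup

DB_PATH = "db.json"  # hypothetical path; the commit uses Link-Preview/db.json


def load_db(path):
    # Create an empty JSON object on first run, then read it back.
    if not os.path.exists(path):
        with open(path, "w") as f:
            f.write("{}")
    with open(path, "r") as f:
        return json.loads(f.read())


def extract(url):
    # Fetch the page and pull out basic preview fields.
    r = requests.get(url)
    soup = BeautifulSoup(r.text, "html.parser")
    desc = soup.find("meta", attrs={"name": "description"})
    image = soup.find("meta", attrs={"property": "og:image"})
    return {
        "title": soup.title.string if soup.title else "",
        "description": desc.get("content", "") if desc else "",
        "url": url,
        "image": image.get("content", "") if image else "",
    }


def preview(url):
    db = load_db(DB_PATH)
    if url in db:
        return db[url]       # cache hit: reuse the stored preview
    data = extract(url)      # cache miss: request, parse, then persist
    db[url] = data
    with open(DB_PATH, "w") as file:
        json.dump(db, file)
    return data


if __name__ == "__main__":
    print(preview("https://example.com"))

Rewriting the whole JSON file on every cache miss keeps the code simple; for a larger cache, a per-key store such as sqlite would avoid rewriting the file each time.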

0 commit comments
