
Commit 950e106

Create Get1Countries.py
0 parents commit 950e106

File tree

1 file changed: +50 -0 lines changed


Code/Get1Countries.py

Lines changed: 50 additions & 0 deletions
@@ -0,0 +1,50 @@
#################################################
### 1. GET THE NAMES OF THE COUNTRIES ###
### FROM THE WEBSITE ###
#################################################

# The result of this code does not exceed 100 pages.
# Therefore a boolean parameter to stop the scraping
# is not set.

# Author of Code: Lashari Gochiashvili
# Load main packages and libraries
from selenium import webdriver
from selenium.webdriver.common.by import By
import time
import csv
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Webdriver settings
gecko_path = 'C:/Users/Lasha/anaconda3/geckodriver.exe'

options = webdriver.firefox.options.Options()
options.headless = False
driver = webdriver.Firefox(options=options, executable_path=gecko_path)

url = 'https://openaq.org/#/countries?_k=gv4bjc'
driver.get(url)
wait = WebDriverWait(driver, 5)
driver.implicitly_wait(5)

# Scraping country names (e.g. Afghanistan, Austria, etc.) using the class name.
# We will need the country names to generate links to each country page;
# we will generate the links at the second stage.
wait.until(EC.presence_of_element_located((By.CLASS_NAME, 'card__title')))
countries = driver.find_elements_by_class_name('card__title')

# This block builds a list of countries from the scraped country names
# and saves the list to a .csv file.
# We will need the .csv file with the country names for the second stage.
list_of_countries = []
for country in countries:
    list_of_countries.append(country.text)

f = open('1Countries.csv', 'w', newline='')
with f:
    writer = csv.writer(f)
    writer.writerow(list_of_countries)

# Closing the web browser
time.sleep(2)
driver.quit()
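Note: the committed script targets the Selenium 3 API; executable_path and find_elements_by_class_name were removed in Selenium 4. A minimal sketch of the same steps under Selenium 4, assuming geckodriver is resolvable by Selenium Manager or already on the PATH, could look like this:

# Sketch only: same scraping steps, written against the Selenium 4 API.
import csv

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

options = webdriver.FirefoxOptions()
driver = webdriver.Firefox(options=options)  # executable_path is no longer accepted

try:
    driver.get('https://openaq.org/#/countries?_k=gv4bjc')
    wait = WebDriverWait(driver, 5)
    wait.until(EC.presence_of_element_located((By.CLASS_NAME, 'card__title')))

    # find_elements(By.CLASS_NAME, ...) replaces find_elements_by_class_name
    countries = driver.find_elements(By.CLASS_NAME, 'card__title')
    list_of_countries = [country.text for country in countries]

    # Save the country names for the second stage, as in the committed script
    with open('1Countries.csv', 'w', newline='') as f:
        csv.writer(f).writerow(list_of_countries)
finally:
    driver.quit()

The try/finally is only a safeguard so the browser is closed even if the wait times out; the saved 1Countries.csv is the input for the second stage, as the script's comments note.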

0 commit comments
