Initial commit: google_jobs.py has the main functions that pull jobs from Google search.
job_desc_pydantic.py holds all of the Pydantic objects that define what's extracted from the job descriptions.
The google_job_rwtest notebook walks through writing scraped jobs to the database and reading them back.
README is updated to reflect the file changes.
- README.md +63 -2
- google_jobs.py +93 -0
- job_desc_pydantic.py +155 -0
- notebooks/google_job_rwtest.ipynb +1376 -0
README.md
CHANGED
@@ -1,2 +1,63 @@
# 🚀 Data Job Insights: Unlock Your Career Potential! 📊

Are you ready to take your data career to the next level? Look no further! This project is here to revolutionize the way you approach your job search and resume optimization. By leveraging the power of an advanced Language Model (LLM), we've created a game-changing tool that will give you the inside scoop on what employers are really looking for in data-related roles. 💡

## 🎯 What's the Goal?

Our mission is to build a comprehensive database of job descriptions for various data-related positions, including:

- Data Scientist 🔬
- Data Engineer ⚙️
- Data Analyst 📈
- Machine Learning Engineer 🤖
- And more! 💼

But we're not just collecting job postings – we're taking it to the next level! Our cutting-edge LLM will parse each job description and extract key information, giving you unparalleled insights into what skills, experiences, and qualities are most sought-after in the industry. 🔍

## 🛠️ How Does It Work?

We've designed a robust system using Pydantic models to capture and structure the most important aspects of each job description. Here's a quick overview of what we're extracting:

### 🏢 Company Overview

- About the company, its industry, products, and services
- Mission, values, and culture
- Company size and locations

### 📝 Role Summary

- Job title and team/department
- Role type (full-time, part-time, contract, etc.)
- Remote work options

### 📋 Responsibilities and Qualifications

- Core duties and expectations
- Required educational background and experience
- Preferred skills and characteristics

### 💰 Compensation and Benefits

- Salary or pay range
- Bonus and equity compensation
- Benefits and perks

With this wealth of information at your fingertips, you'll be able to tailor your resume and interview responses to perfectly match what employers are looking for! 🎯
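
Curious what this looks like in code? Here's an abridged version of one of the Pydantic models from `job_desc_pydantic.py` (the Role Summary piece; some fields are omitted here for brevity, and the full set of models lives in that file):

```python
from typing import Optional
from langchain_core.pydantic_v1 import BaseModel, Field

class RoleSummary(BaseModel):
    """Key summary points about the job role, extracted from a posting."""

    title: str = Field(..., description="Title of the job role")
    role_type: Optional[str] = Field(
        None, description="Type of role (full-time, part-time, contract, etc.)"
    )
    remote: Optional[str] = Field(
        None, description="Remote work options for the role (full, hybrid, none)"
    )
```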

## 🔮 What's Next?

We're just getting started! In the near future, we'll be adding even more powerful features to help you land your dream data job:

- [ ] Automatically categorize job descriptions based on key characteristics
- [ ] Provide personalized resume suggestions based on your target roles
- [ ] Offer insider tips and strategies for acing data job interviews

Stay tuned for these exciting updates! 🚀

## 🤝 Get Involved!

We believe in the power of community and collaboration. If you're passionate about data careers and want to contribute to this project, we'd love to have you on board! Whether you're a data professional, a machine learning enthusiast, or just excited about the potential of this tool, there's a place for you here.

Feel free to submit pull requests, suggest new features, or simply share your experiences and insights. Together, we can help data professionals everywhere unlock their full career potential! 💪

So what are you waiting for? Dive in, explore the codebase, and let's revolutionize the world of data jobs together! 🌟
google_jobs.py
ADDED
@@ -0,0 +1,93 @@
import pandas as pd
# from serpapi import GoogleSearch
import sqlite3
import datetime as dt
import http.client
import json
import config
import urllib.parse
import os
from sqlalchemy import create_engine
from concurrent.futures import ThreadPoolExecutor, as_completed
from dotenv import load_dotenv
load_dotenv()


def google_job_search(job_title, city_state, start=0):
    '''
    job_title(str): "Data Scientist", "Data Analyst"
    city_state(str): "Denver, CO"
    post_age,(str)(optional): "3day", "week", "month"
    '''
    query = f"{job_title} {city_state}"
    params = {
        "api_key": os.getenv('SerpAPIkey'),
        "engine": "google_jobs",
        "q": query,
        "hl": "en",
        "start": start,
        # "chips": f"date_posted:{post_age}",
    }

    query_string = urllib.parse.urlencode(params, quote_via=urllib.parse.quote)

    conn = http.client.HTTPSConnection("serpapi.webscrapingapi.com")
    try:
        conn.request("GET", f"/v1?{query_string}")
        res = conn.getresponse()
        try:
            data = res.read()
        finally:
            res.close()
    finally:
        conn.close()

    try:
        json_data = json.loads(data.decode("utf-8"))
        jobs_results = json_data['google_jobs_results']
        job_columns = ['title', 'company_name', 'location', 'description', 'extensions', 'job_id']
        df = pd.DataFrame(jobs_results, columns=job_columns)
        return df
    except (KeyError, json.JSONDecodeError) as e:
        print(f"Error occurred for search: {job_title} in {city_state}")
        print(f"Error message: {str(e)}")
        return None

def sql_dump(df, table):
    engine = create_engine(f"postgresql://{os.getenv('MasterName')}:{os.getenv('MasterPass')}@{os.getenv('RDS_EndPoint')}:5432/postgres")
    with engine.connect() as conn:
        df.to_sql(table, conn, if_exists='append', chunksize=20, method='multi', index=False)
    print(f"Dumped {df.shape} to SQL table {table}")

def process_batch(job, city_state, start):
    df_10jobs = google_job_search(job, city_state, start)
    if df_10jobs is not None:
        print(f'City: {city_state} Job: {job} Start: {start}')
        print(df_10jobs.shape)
        date = dt.datetime.today().strftime('%Y-%m-%d')
        df_10jobs['retrieve_date'] = date
        df_10jobs.drop_duplicates(subset=['job_id', 'company_name'], inplace=True)
        rows_affected = sql_dump(df_10jobs, 'usajobstest')
        print(f"Rows affected: {rows_affected}")

def main(job_list, city_state_list):
    with ThreadPoolExecutor() as executor:
        futures = []
        for job in job_list:
            for city_state in city_state_list:
                for start in range(0, 1):
                    future = executor.submit(process_batch, job, city_state, start)
                    futures.append(future)

        for future in as_completed(futures):
            future.result()

if __name__ == "__main__":
    job_list = ["Data Scientist", "Machine Learning Engineer", "AI Gen Engineer",
                "Data Analyst", "Data Engineer", "Business Intelligence Analyst"]
    city_state_list = ["Atlanta, GA", "Austin, TX", "Boston, MA", "Chicago, IL",
                       "Denver CO", "Dallas-Ft. Worth, TX", "Los Angeles, CA",
                       "New York City NY", "San Francisco, CA", "Seattle, WA",
                       "Palo Alto CA", "Mountain View CA"]
    simple_city_state_list: list[str] = ["Palo Alto CA", "San Francisco CA", ]
    main(job_list, city_state_list)
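
To try the scraper on a single query before pointing it at the database, something like the following works. It assumes a SerpAPIkey is set in the environment and that google_jobs.py's own imports resolve (the module imports a local config module that is not part of this commit); the snippet itself is illustrative and not part of the file above.

    # Illustrative smoke test: pull one page of results for one query and inspect it.
    from google_jobs import google_job_search

    df = google_job_search("Data Scientist", "Denver, CO", start=0)
    if df is not None:
        print(df.shape)                                   # typically (10, 6) per the notebook runs below
        print(df[["title", "company_name", "location"]].head())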
job_desc_pydantic.py
ADDED
@@ -0,0 +1,155 @@
from typing import List, Optional
from langchain_core.pydantic_v1 import BaseModel, Field

class CompanyOverview(BaseModel):
    """
    A model for capturing key information about the company offering the job.

    Extract relevant details about the company from the job description,
    including a brief overview of its industry and products, its mission and
    values, size, and location(s).

    Focus on capturing the most salient points that give a well-rounded picture
    of the company and its culture.
    """

    about: Optional[str] = Field(
        None,
        description="""Brief description of the company, its industry, products, services,
                       and any notable achievements or differentiators"""
    )

    mission_and_values: Optional[str] = Field(
        None,
        description="""Company mission, vision, values, and culture, including commitments
                       to diversity, inclusion, social responsibility, and work-life balance"""
    )

    size: Optional[str] = Field(
        None,
        description="Details about company size, such as number of employees")

    locations: Optional[str] = Field(
        None,
        description="""Geographic presence of the company, including headquarters,
                       offices, and any remote work options"""
    )

    city: Optional[str] = Field(None, description="City where the company is located")

    state: Optional[str] = Field(None, description="State where the company is located")


class RoleSummary(BaseModel):
    """
    A model for capturing the key summary points about the job role.

    Extract the essential high-level details about the role from the job description,
    such as the job title, the team or department the role belongs to, the role type,
    and any remote work options.

    Prioritize information that helps understand the overall scope and positioning
    of the role within the company.
    """

    title: str = Field(..., description="Title of the job role")

    team_or_department: Optional[str] = Field(
        None,
        description="""Team, department, or business unit the role belongs to,
                       including any collaborations with other teams"""
    )

    role_type: Optional[str] = Field(
        None,
        description="Type of role (full-time, part-time, contract, etc.)"
    )

    remote: Optional[str] = Field(
        None,
        description="Remote work options for the role (full, hybrid, none)"
    )

class ResponsibilitiesAndQualifications(BaseModel):
    """
    A model for capturing the key responsibilities, requirements, and preferred
    qualifications for the job role.

    Extract the essential duties and expectations of the role, the mandatory
    educational background and experience required, and any additional skills
    or characteristics that are desirable but not strictly necessary.

    The goal is to provide a clear and comprehensive picture of what the role
    entails and what qualifications the ideal candidate should possess.
    """

    responsibilities: List[str] = Field(
        description="""The core duties, tasks, and expectations of the role, encompassing
                       areas such as metrics, theories, business understanding, product
                       direction, systems, leadership, decision making, strategy, and
                       collaboration, as described in the job description"""
    )

    required_qualifications: List[str] = Field(
        description="""The essential educational qualifications (e.g., Doctorate,
                       Master's, Bachelor's degrees in specific fields) and years of
                       relevant professional experience that are mandatory for the role,
                       including any alternative acceptable combinations of education
                       and experience, as specified in the job description"""
    )

    preferred_qualifications: List[str] = Field(
        description="""Any additional skills, experiences, characteristics, or domain
                       expertise that are valuable for the role but not absolute
                       requirements, such as proficiency with specific tools/technologies,
                       relevant soft skills, problem solving abilities, and industry
                       knowledge, as mentioned in the job description as preferred or
                       nice-to-have qualifications"""
    )

class CompensationAndBenefits(BaseModel):
    """
    A model for capturing the compensation and benefits package for the job role.

    Extract details about the salary or pay range, bonus and equity compensation,
    benefits, and perks from the job description.

    Aim to provide a comprehensive view of the total rewards offered for the role,
    including both monetary compensation and non-monetary benefits and perks.
    """

    salary_or_pay_range: Optional[str] = Field(
        None,
        description="""The salary range or hourly pay range for the role, including
                       any specific numbers or bands mentioned in the job description"""
    )

    bonus_and_equity: Optional[str] = Field(
        None,
        description="""Any information about bonus compensation, such as signing bonuses,
                       annual performance bonuses, or other incentives, as well as details
                       about equity compensation like stock options or RSUs"""
    )

    benefits: Optional[List[str]] = Field(
        None,
        description="""A list of benefits offered for the role, such as health insurance,
                       dental and vision coverage, retirement plans (401k, pension), paid
                       time off (vacation, sick days, holidays), parental leave, and any
                       other standard benefits mentioned in the job description"""
    )

    perks: Optional[List[str]] = Field(
        None,
        description="""A list of additional perks and amenities offered, such as free food
                       or snacks, commuter benefits, wellness programs, learning and development
                       stipends, employee discounts, or any other unique perks the company
                       provides to its employees, as mentioned in the job description"""
    )

class JobDescription(BaseModel):
    """Extracted information from a job description."""
    company_overview: CompanyOverview
    role_summary: RoleSummary
    responsibilities_and_qualifications: ResponsibilitiesAndQualifications
    compensation_and_benefits: CompensationAndBenefits
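
These models are written against langchain_core.pydantic_v1, so they can be handed to a LangChain chat model as an extraction schema. A minimal sketch of that wiring, assuming the langchain-openai package, an OPENAI_API_KEY, and a gpt-3.5-turbo model (none of which are part of this commit):

    # Hedged sketch: feed a scraped description through an LLM and get a JobDescription back.
    from langchain_openai import ChatOpenAI
    from job_desc_pydantic import JobDescription

    llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
    extractor = llm.with_structured_output(JobDescription)   # returns JobDescription instances

    job_text = "..."  # e.g. one value from the 'description' column written by google_jobs.py
    parsed = extractor.invoke(
        "Extract the structured job information from this posting:\n\n" + job_text
    )
    print(parsed.role_summary.title)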
notebooks/google_job_rwtest.ipynb
ADDED
@@ -0,0 +1,1376 @@
Cell 1 (markdown):
    ## Job description from google jobs

Cell 2 (code, execution_count 1): an earlier draft of google_job_search / sql_dump / main, left fully
commented out. It passed "serpapi.webscrapingapi.com/v1" to http.client.HTTPSConnection, kept only
['title', 'company_name', 'location', 'description'], looped over jobs and cities sequentially, and
appended to a 'datajobs24' table with chunksize=1000.

Cell 3 (code, execution_count 2): a commented-out sample run of that draft over
["Machine Learning Engineer", "Data Scientist", "Generative AI Engineer", "Solutions Engineer", "LLM Engineer"]
and ["Menlo Park CA", "Palo Alto CA", "San Francisco CA", "Mountain View CA"].

Cell 4 (code, execution_count 1): the current definitions of google_job_search, sql_dump, process_batch,
and main, the same code as google_jobs.py above, except that there is no dotenv setup or __main__ guard,
process_batch appends to the 'usajobs24' table, and main pages through start in range(0, 2).

Cell 5 (code, execution_count 2):
    job_list = ["Data Analyst", "Data Engineer", "Big Data Engineer"]
    simple_city_state_list = ["Menlo Park CA", "Palo Alto CA", "San Francisco CA", "Mountain View CA"]
    main(job_list, simple_city_state_list)
Output (stream, abridged): interleaved progress lines such as
    City: Menlo Park CA Job: Data Engineer Start: 1
    (10, 6)
    Dumped (10, 7) to SQL table leoTestTable
    Rows affected: None
plus several failed searches, e.g.
    Error occurred for search: Data Engineer in Palo Alto CA
    Error message: 'google_jobs_results'

Cell 6 (markdown):
    ### Great now that we have written some data lets read it.

Cell 7 (code, execution_count 31):
    import pandas as pd
    from sqlalchemy import create_engine

    def read_data_from_db(table_name):
        engine = create_engine(f"postgresql://{os.getenv('MasterName')}:{os.getenv('MasterPass')}@{os.getenv('RDS_EndPoint')}:5432/postgres")

        try:
            with engine.connect() as conn:
                query = f'SELECT * FROM "{table_name}"'
                df = pd.read_sql(query, conn)
                return df
        except Exception as e:
            print(f"Error occurred while reading data from the database: {str(e)}")
            return None

    data24_df = read_data_from_db('usajobstest')

Cell 8 (code, execution_count 32):
    data24_df.shape
Output: (417, 7)

Cell 9 (code, execution_count 33):
    data24_df
Output: the full 417 x 7 dataframe of scraped postings (title, company_name, location, description,
extensions, job_id, retrieve_date), e.g. "Business Intelligence Analyst" at Nuvolum in San Francisco, CA,
retrieved 2024-05-04.

Cell 10 (code, execution_count 34):
    # get the list of unique title, company_name pairs
    title_company = data24_df[['title', 'company_name', 'description']].drop_duplicates()

Cell 11 (code, execution_count 35):
    data24_df
Output: the same 417 x 7 dataframe displayed again.

Cell 12 (code, execution_count 36): redefines the extraction models from job_desc_pydantic.py inline in the
notebook (CompanyOverview, RoleSummary, ResponsibilitiesAndQualifications, ...).
|
898 |
+
" )\n",
|
899 |
+
"\n",
|
900 |
+
" required_qualifications: List[str] = Field(\n",
|
901 |
+
" description=\"\"\"The essential educational qualifications (e.g., Doctorate, \n",
|
902 |
+
" Master's, Bachelor's degrees in specific fields) and years of \n",
|
903 |
+
" relevant professional experience that are mandatory for the role, \n",
|
904 |
+
" including any alternative acceptable combinations of education \n",
|
905 |
+
" and experience, as specified in the job description\"\"\"\n",
|
906 |
+
" )\n",
|
907 |
+
" \n",
|
908 |
+
" preferred_qualifications: List[str] = Field(\n",
|
909 |
+
" description=\"\"\"Any additional skills, experiences, characteristics, or domain \n",
|
910 |
+
" expertise that are valuable for the role but not absolute \n",
|
911 |
+
" requirements, such as proficiency with specific tools/technologies, \n",
|
912 |
+
" relevant soft skills, problem solving abilities, and industry \n",
|
913 |
+
" knowledge, as mentioned in the job description as preferred or \n",
|
914 |
+
" nice-to-have qualifications\"\"\"\n",
|
915 |
+
" )\n",
|
916 |
+
" \n",
|
917 |
+
"class CompensationAndBenefits(BaseModel):\n",
|
918 |
+
" \"\"\"\n",
|
919 |
+
" A model for capturing the compensation and benefits package for the job role.\n",
|
920 |
+
" \n",
|
921 |
+
" Extract details about the salary or pay range, bonus and equity compensation, \n",
|
922 |
+
" benefits, and perks from the job description.\n",
|
923 |
+
" \n",
|
924 |
+
" Aim to provide a comprehensive view of the total rewards offered for the role,\n",
|
925 |
+
" including both monetary compensation and non-monetary benefits and perks.\n",
|
926 |
+
" \"\"\"\n",
|
927 |
+
" \n",
|
928 |
+
" salary_or_pay_range: Optional[str] = Field(\n",
|
929 |
+
" None,\n",
|
930 |
+
" description=\"\"\"The salary range or hourly pay range for the role, including \n",
|
931 |
+
" any specific numbers or bands mentioned in the job description\"\"\"\n",
|
932 |
+
" )\n",
|
933 |
+
" \n",
|
934 |
+
" bonus_and_equity: Optional[str] = Field(\n",
|
935 |
+
" None,\n",
|
936 |
+
" description=\"\"\"Any information about bonus compensation, such as signing bonuses, \n",
|
937 |
+
" annual performance bonuses, or other incentives, as well as details \n",
|
938 |
+
" about equity compensation like stock options or RSUs\"\"\"\n",
|
939 |
+
" )\n",
|
940 |
+
" \n",
|
941 |
+
" benefits: Optional[List[str]] = Field(\n",
|
942 |
+
" None,\n",
|
943 |
+
" description=\"\"\"A list of benefits offered for the role, such as health insurance, \n",
|
944 |
+
" dental and vision coverage, retirement plans (401k, pension), paid \n",
|
945 |
+
" time off (vacation, sick days, holidays), parental leave, and any \n",
|
946 |
+
" other standard benefits mentioned in the job description\"\"\"\n",
|
947 |
+
" )\n",
|
948 |
+
" \n",
|
949 |
+
" perks: Optional[List[str]] = Field(\n",
|
950 |
+
" None,\n",
|
951 |
+
" description=\"\"\"A list of additional perks and amenities offered, such as free food \n",
|
952 |
+
" or snacks, commuter benefits, wellness programs, learning and development \n",
|
953 |
+
" stipends, employee discounts, or any other unique perks the company \n",
|
954 |
+
" provides to its employees, as mentioned in the job description\"\"\"\n",
|
955 |
+
" )\n",
|
956 |
+
"\n",
|
957 |
+
"class JobDescription(BaseModel):\n",
|
958 |
+
" \"\"\"Extracted information from a job description.\"\"\"\n",
|
959 |
+
" company_overview: CompanyOverview\n",
|
960 |
+
" role_summary: RoleSummary\n",
|
961 |
+
" responsibilities_and_qualifications: ResponsibilitiesAndQualifications\n",
|
962 |
+
" compensation_and_benefits: CompensationAndBenefits"
|
963 |
+
]
|
964 |
+
},
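The nested schema defined in the cell above is easiest to sanity-check before any model calls. Below is a minimal sketch (not part of the notebook) that validates a hand-built payload against the `JobDescription` model; every field value is a made-up placeholder.

```python
# Sketch: validate a hand-built payload against the JobDescription schema above.
# All values are placeholders, not real job data.
sample = {
    "company_overview": {"about": "Example Co. builds analytics tooling.", "city": "Denver", "state": "CO"},
    "role_summary": {"title": "Data Analyst", "role_type": "Full-time", "remote": "hybrid"},
    "responsibilities_and_qualifications": {
        "responsibilities": ["Build dashboards", "Partner with stakeholders on metrics"],
        "required_qualifications": ["Bachelor's degree AND 2+ years of analytics experience"],
        "preferred_qualifications": ["SQL and Python proficiency"],
    },
    "compensation_and_benefits": {"salary_or_pay_range": "$80,000 - $100,000 per year"},
}

parsed = JobDescription.parse_obj(sample)  # pydantic v1 API, matching langchain_core.pydantic_v1
print(parsed.role_summary.title)           # -> "Data Analyst"
```

Optional fields (size, bonus_and_equity, perks, and so on) simply default to None when a posting never mentions them, which keeps downstream analysis code simple.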
|
965 |
+
{
|
966 |
+
"cell_type": "code",
|
967 |
+
"execution_count": 37,
|
968 |
+
"metadata": {},
|
969 |
+
"outputs": [
|
970 |
+
{
|
971 |
+
"data": {
|
972 |
+
"text/plain": [
|
973 |
+
"True"
|
974 |
+
]
|
975 |
+
},
|
976 |
+
"execution_count": 37,
|
977 |
+
"metadata": {},
|
978 |
+
"output_type": "execute_result"
|
979 |
+
}
|
980 |
+
],
|
981 |
+
"source": [
|
982 |
+
"from typing import List, Optional\n",
|
983 |
+
"\n",
|
984 |
+
"from langchain.chains import create_structured_output_runnable\n",
|
985 |
+
"from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder\n",
|
986 |
+
"from langchain_core.pydantic_v1 import BaseModel, Field\n",
|
987 |
+
"\n",
|
988 |
+
"from langchain_groq import ChatGroq\n",
|
989 |
+
"from dotenv import load_dotenv\n",
|
990 |
+
"import os\n",
|
991 |
+
"\n",
|
992 |
+
"load_dotenv()"
|
993 |
+
]
|
994 |
+
},
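Note that `load_dotenv()` returning True only means a `.env` file was found; it does not confirm the key the Groq client needs is present. A small check like the sketch below (an addition, not in the notebook) fails fast with a clearer message; `GROQ_API_KEY` is the environment variable langchain_groq reads.

```python
import os
from dotenv import load_dotenv

load_dotenv()

# Fail early with a readable message instead of an authentication error mid-run.
if not os.getenv("GROQ_API_KEY"):
    raise RuntimeError("GROQ_API_KEY is not set - add it to your .env file")
```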
|
995 |
+
{
|
996 |
+
"cell_type": "code",
|
997 |
+
"execution_count": 38,
|
998 |
+
"metadata": {},
|
999 |
+
"outputs": [],
|
1000 |
+
"source": [
|
1001 |
+
"prompt = ChatPromptTemplate.from_messages(\n",
|
1002 |
+
" [\n",
|
1003 |
+
" (\n",
|
1004 |
+
" \"system\",\n",
|
1005 |
+
" \"\"\"You are an expert at identifying key aspects of job descriptions. Your task is to extract important information from a raw job description and organize it into a structured format using the ResponsibilitiesAndQualifications class.\n",
|
1006 |
+
"\n",
|
1007 |
+
" When parsing the job description, your goal is to capture as much relevant information as possible in the appropriate fields of the class. This includes:\n",
|
1008 |
+
"\n",
|
1009 |
+
" 1. All key responsibilities and duties of the role, covering the full range of tasks and expectations.\n",
|
1010 |
+
" 2. The required educational qualifications and years of experience, including different acceptable combinations.\n",
|
1011 |
+
" 3. Any additional preferred skills, experiences, and characteristics that are desirable for the role.\n",
|
1012 |
+
"\n",
|
1013 |
+
" Avoid summarizing or paraphrasing the information. Instead, extract the details as closely as possible to how they appear in the original job description. The aim is to organize and structure the raw data, not to condense or interpret it.\n",
|
1014 |
+
"\n",
|
1015 |
+
" Some specific things to look out for:\n",
|
1016 |
+
" - Responsibilities related to metrics, theories, business understanding, product direction, systems, leadership, decision making, strategy, and collaboration\n",
|
1017 |
+
" - Required degrees (Doctorate, Master's, Bachelor's) in relevant fields, along with the corresponding years of experience\n",
|
1018 |
+
" - Preferred qualifications like years of coding experience, soft skills, problem solving abilities, and domain expertise\n",
|
1019 |
+
"\n",
|
1020 |
+
" If any of these details are missing from the job description, simply omit them from the output rather than trying to infer or fill in the gaps.\n",
|
1021 |
+
"\n",
|
1022 |
+
" The structured data you extract will be used for further analysis and insights downstream, so err on the side of including more information rather than less. The key is to make the unstructured job description data more organized and manageable while still retaining all the important details.\n",
|
1023 |
+
" \"\"\",\n",
|
1024 |
+
" ),\n",
|
1025 |
+
" # MessagesPlaceholder('examples'), # Keep on reading through this use case to see how to use examples to improve performance\n",
|
1026 |
+
" (\"human\", \"{text}\"),\n",
|
1027 |
+
" ]\n",
|
1028 |
+
")"
|
1029 |
+
]
|
1030 |
+
},
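Before wiring the prompt to a model, it can help to render it once and confirm the system instructions and the `{text}` slot land where expected. A quick sketch, assuming the `prompt` object defined above:

```python
# Render the template with a tiny placeholder description and inspect the messages.
messages = prompt.format_messages(text="Data Analyst at Example Co. SQL required. $90k-$110k/yr.")
for message in messages:
    print(f"[{message.type}] {message.content[:100]}...")
```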
|
1031 |
+
{
|
1032 |
+
"cell_type": "code",
|
1033 |
+
"execution_count": 39,
|
1034 |
+
"metadata": {},
|
1035 |
+
"outputs": [],
|
1036 |
+
"source": [
|
1037 |
+
"from langchain_community.chat_models import ChatPerplexity"
|
1038 |
+
]
|
1039 |
+
},
|
1040 |
+
{
|
1041 |
+
"cell_type": "code",
|
1042 |
+
"execution_count": null,
|
1043 |
+
"metadata": {},
|
1044 |
+
"outputs": [],
|
1045 |
+
"source": []
|
1046 |
+
},
|
1047 |
+
{
|
1048 |
+
"cell_type": "code",
|
1049 |
+
"execution_count": 43,
|
1050 |
+
"metadata": {},
|
1051 |
+
"outputs": [],
|
1052 |
+
"source": [
|
1053 |
+
"llm = ChatGroq(model_name=\"llama3-70b-8192\")\n",
|
1054 |
+
"# llm = ChatPerplexity(pplx_api_key=os.getenv('PPLX_API_KEY'), model='llama-3-70b-instruct')"
|
1055 |
+
]
|
1056 |
+
},
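One tweak worth considering for extraction work (an assumption, not something this cell sets): pinning the sampling temperature so repeated runs over the same description return the same structure.

```python
# Sketch: same models as above, with temperature pinned for more repeatable extraction.
llm = ChatGroq(model_name="llama3-70b-8192", temperature=0)
# llm = ChatPerplexity(pplx_api_key=os.getenv("PPLX_API_KEY"), model="llama-3-70b-instruct", temperature=0)
```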
|
1057 |
+
{
|
1058 |
+
"cell_type": "code",
|
1059 |
+
"execution_count": 44,
|
1060 |
+
"metadata": {},
|
1061 |
+
"outputs": [
|
1062 |
+
{
|
1063 |
+
"name": "stderr",
|
1064 |
+
"output_type": "stream",
|
1065 |
+
"text": [
|
1066 |
+
"/Users/leowalker/anaconda3/envs/datajobs/lib/python3.11/site-packages/langchain_core/_api/beta_decorator.py:87: LangChainBetaWarning: The method `ChatGroq.with_structured_output` is in beta. It is actively being worked on, so the API may change.\n",
|
1067 |
+
" warn_beta(\n"
|
1068 |
+
]
|
1069 |
+
}
|
1070 |
+
],
|
1071 |
+
"source": [
|
1072 |
+
"extractor = prompt | llm.with_structured_output(\n",
|
1073 |
+
" schema=JobDescription,\n",
|
1074 |
+
" method=\"function_calling\",\n",
|
1075 |
+
" include_raw=False,\n",
|
1076 |
+
")"
|
1077 |
+
]
|
1078 |
+
},
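The cells below exercise the extractor on a single posting; scaling it to the full dataframe is mostly a loop with error handling. The sketch below is one way it might look: it assumes `data24_df` has the `job_id` column shown earlier plus a `description` column, and the one-second pause is a guessed-at polite pace, not a documented rate limit.

```python
import time

# Map the structured-output extractor over every scraped description,
# keeping failures (rate limits, schema validation errors) from killing the run.
results = {}
for job_id, description in zip(data24_df["job_id"], data24_df["description"]):
    try:
        results[job_id] = extractor.invoke(description)
    except Exception as exc:
        results[job_id] = None
        print(f"extraction failed for {job_id[:16]}...: {exc}")
    time.sleep(1)  # assumed pacing to stay under provider rate limits
```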
|
1079 |
+
{
|
1080 |
+
"cell_type": "code",
|
1081 |
+
"execution_count": 42,
|
1082 |
+
"metadata": {},
|
1083 |
+
"outputs": [],
|
1084 |
+
"source": [
|
1085 |
+
"test_description = title_company['description'][2]"
|
1086 |
+
]
|
1087 |
+
},
|
1088 |
+
{
|
1089 |
+
"cell_type": "code",
|
1090 |
+
"execution_count": 156,
|
1091 |
+
"metadata": {},
|
1092 |
+
"outputs": [
|
1093 |
+
{
|
1094 |
+
"data": {
|
1095 |
+
"text/plain": [
|
1096 |
+
"\"The Search + Distribution (S+D) team is the leading applied artificial intelligence team at Microsoft responsible for delivering the quality experience to over 500M+ monthly active users around the world in Microsoft’s search engine, Bing. Our responsibilities include delivering competitive search results, differentiated experiences, and product and business growth. We are constantly applying the... latest state of the art AI technologies to our product and also transferring this technology to other groups across the company.\\n\\nWe sre seeking experienced data scientist to solve cutting-edge metrics and measurement problems in the space of Search, and lead cross-team initiatives. We believe metrics play a key role in executing on the strategy for building the final product.\\n\\nA critical part of the role is to advance our A/B experimentation capabilities for Bing and Microsoft Copilot by introducing advanced, powerful functionality at very large scale to eventually increase experimenter agility and depth of insights, and reduce infrastructure cost through smart design of data structures and computation methods. The role requires not only skills in data science, but also knowledge in data engineering and systems.\\n\\nYou will work closely with multiple teams across S+D and beyond to build a measurement strategy and roadmap towards measuring how relevant, fresh, and authoritative our results are while being strategically differentiated from our biggest competitors. We expect you to work with Microsoft Research and the rest of academia to unravel complex problems in our products and push the limits of what AI can do for our customers. The world needs credible alternatives to find authoritative information on the web, so there is social responsibility.\\n\\nThis Principal Data Scientist position is a very strategic position part of the S+D Bing Metrics and Analytics team. S+D itself is part of the broader Windows and Web Experiences Team (WWE) and this position will collaborate with and influence other data science and metrics groups in WWE such as Edge, MS Start, Maps, Bing Ads and more. If you are passionate about working on the latest and hottest areas that will help you develop skills in Artificial Intelligence, Machine Learning, data science, scale systems, UX, and product growth, this is the team you’re looking for!\\n\\nMicrosoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. 
In alignment with our Microsoft values, we are committed to cultivating an inclusive work environment for all employees to positively impact our culture every day.\\n\\nResponsibilities\\n• Define, invent, and deliver online and offline behavioral and human labeled metrics which accurately measure the satisfaction and success of our customers interacting with Search.\\n• Apply behavioral game theory and social science understanding to get the quality work out of crowd workers from around the world\\n• Develop deep understanding of business metrics such as daily active users, query share, click share and query volume across all the relevant entry points\\n• Influence the product and business direction through metrics analyses\\n• Define and build systems and policies to ensure quality, stable, and performant code\\n• Lead a team through analysis, design and code review that guarantee analysis and code quality and allow more junior members to learn and grow their expertise while helping the team build an inclusive interdisciplinary culture where everyone can do their best work\\n• Make independent decisions for the team and handle difficult tradeoffs\\n• Translate strategy into plans that are clear and measurable, with progress shared out monthly to stakeholders\\n• Partner effectively with program management, engineers, finance, marketing, exec management, and other areas of the business\\n\\nQualifications\\n\\nRequired Qualifications:\\n• Doctorate in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or related field AND 5+ year(s) data-science experience (e.g., managing structured and unstructured data, applying statistical techniques and reporting results)\\n• OR Master's Degree in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or related field AND 7+ years data-science experience (e.g., managing structured and unstructured data, applying statistical techniques and reporting results)\\n• OR Bachelor's Degree in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or related field AND 10+ years data-science experience (e.g., managing structured and unstructured data, applying statistical techniques and reporting results)\\n• OR equivalent experience.\\n\\nPreferred Qualifications:\\n• 6+ years of experience coding in Python, C++, C#, C or Java.\\n• Customer focused, strategic, drives for results, is self-motivated, and has a propensity for action.\\n• Organizational, analytical, data science skills and intuition.\\n• Problem solver: ability to solve problems that the world has not solved before\\n• Interpersonal skills: cross-group and cross-culture collaboration.\\n• Experience with real world system building and data collection, including design, coding and evaluation.\\n\\nData Science IC5 - The typical base pay range for this role across the U.S. is USD $133,600 - $256,800 per year. There is a different range applicable to specific work locations, within the San Francisco Bay area and New York City metropolitan area, and the base pay range for this role in those locations is USD $173,200 - $282,200 per year.\\n\\nCertain roles may be eligible for benefits and other compensation. Find additional benefits and pay information here: https://careers.microsoft.com/us/en/us-corporate-pay\\n\\n#WWE# #SearchDistribution# #Bing#\\n\\nMicrosoft is an equal opportunity employer. 
Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations\""
|
1097 |
+
]
|
1098 |
+
},
|
1099 |
+
"execution_count": 156,
|
1100 |
+
"metadata": {},
|
1101 |
+
"output_type": "execute_result"
|
1102 |
+
}
|
1103 |
+
],
|
1104 |
+
"source": [
|
1105 |
+
"test_description"
|
1106 |
+
]
|
1107 |
+
},
|
1108 |
+
{
|
1109 |
+
"cell_type": "code",
|
1110 |
+
"execution_count": 157,
|
1111 |
+
"metadata": {},
|
1112 |
+
"outputs": [],
|
1113 |
+
"source": [
|
1114 |
+
"jobdesc = extractor.invoke(test_description)"
|
1115 |
+
]
|
1116 |
+
},
|
1117 |
+
{
|
1118 |
+
"cell_type": "code",
|
1119 |
+
"execution_count": 158,
|
1120 |
+
"metadata": {},
|
1121 |
+
"outputs": [
|
1122 |
+
{
|
1123 |
+
"name": "stdout",
|
1124 |
+
"output_type": "stream",
|
1125 |
+
"text": [
|
1126 |
+
"{\n",
|
1127 |
+
" \"company_overview\": {\n",
|
1128 |
+
" \"about\": \"Microsoft is a leading technology company responsible for delivering the quality experience to over 500M+ monthly active users around the world in Microsoft\\u2019s search engine, Bing.\",\n",
|
1129 |
+
" \"mission_and_values\": \"Empower every person and every organization on the planet to achieve more.\",\n",
|
1130 |
+
" \"size\": \"500M+ users\",\n",
|
1131 |
+
" \"locations\": \"Global\",\n",
|
1132 |
+
" \"city\": \"Redmond\",\n",
|
1133 |
+
" \"state\": null\n",
|
1134 |
+
" },\n",
|
1135 |
+
" \"role_summary\": {\n",
|
1136 |
+
" \"title\": \"Principal Data Scientist\",\n",
|
1137 |
+
" \"team_or_department\": \"Search + Distribution (S+D) team\",\n",
|
1138 |
+
" \"role_type\": \"Full-time\",\n",
|
1139 |
+
" \"remote\": \"N/A\"\n",
|
1140 |
+
" },\n",
|
1141 |
+
" \"responsibilities_and_qualifications\": {\n",
|
1142 |
+
" \"responsibilities\": [\n",
|
1143 |
+
" \"Define, invent, and deliver online and offline behavioral and human labeled metrics which accurately measure the satisfaction and success of our customers interacting with Search.\",\n",
|
1144 |
+
" \"Apply behavioral game theory and social science understanding to get the quality work out of crowd workers from around the world\",\n",
|
1145 |
+
" \"Develop deep understanding of business metrics such as daily active users, query share, click share and query volume across all the relevant entry points\",\n",
|
1146 |
+
" \"Influence the product and business direction through metrics analyses\",\n",
|
1147 |
+
" \"Define and build systems and policies to ensure quality, stable, and performant code\"\n",
|
1148 |
+
" ],\n",
|
1149 |
+
" \"required_qualifications\": [\n",
|
1150 |
+
" \"Doctorate in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or related field AND 5+ year(s) data-science experience.\",\n",
|
1151 |
+
" \"OR Master's Degree in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or related field AND 7+ years data-science experience.\",\n",
|
1152 |
+
" \"OR Bachelor's Degree in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or related field AND 10+ years data-science experience.\",\n",
|
1153 |
+
" \"OR equivalent experience.\"\n",
|
1154 |
+
" ],\n",
|
1155 |
+
" \"preferred_qualifications\": [\n",
|
1156 |
+
" \"6+ years of experience coding in Python, C++, C#, C or Java.\",\n",
|
1157 |
+
" \"Customer focused, strategic, drives for results, is self-motivated, and has a propensity for action.\",\n",
|
1158 |
+
" \"Organizational, analytical, data science skills and intuition.\",\n",
|
1159 |
+
" \"Problem solver: ability to solve problems that the world has not solved before\",\n",
|
1160 |
+
" \"Interpersonal skills: cross-group and cross-culture collaboration.\",\n",
|
1161 |
+
" \"Experience with real world system building and data collection, including design, coding and evaluation.\"\n",
|
1162 |
+
" ]\n",
|
1163 |
+
" },\n",
|
1164 |
+
" \"compensation_and_benefits\": {\n",
|
1165 |
+
" \"salary_or_pay_range\": \"USD $133,600 - $256,800 per year\",\n",
|
1166 |
+
" \"bonus_and_equity\": \"Competitive compensation package\",\n",
|
1167 |
+
" \"benefits\": [\n",
|
1168 |
+
" \"health insurance\",\n",
|
1169 |
+
" \"dental and vision coverage\",\n",
|
1170 |
+
" \"retirement plans (401k, pension)\",\n",
|
1171 |
+
" \"paid time off (vacation, sick days, holidays)\",\n",
|
1172 |
+
" \"parental leave\"\n",
|
1173 |
+
" ],\n",
|
1174 |
+
" \"perks\": [\n",
|
1175 |
+
" \"free food or snacks\",\n",
|
1176 |
+
" \"commuter benefits\",\n",
|
1177 |
+
" \"wellness programs\",\n",
|
1178 |
+
" \"learning and development stipends\",\n",
|
1179 |
+
" \"employee discounts\"\n",
|
1180 |
+
" ]\n",
|
1181 |
+
" }\n",
|
1182 |
+
"}\n"
|
1183 |
+
]
|
1184 |
+
},
|
1185 |
+
{
|
1186 |
+
"data": {
|
1187 |
+
"text/plain": [
|
1188 |
+
"JobDescription(company_overview=CompanyOverview(about='Microsoft is a leading technology company responsible for delivering the quality experience to over 500M+ monthly active users around the world in Microsoft’s search engine, Bing.', mission_and_values='Empower every person and every organization on the planet to achieve more.', size='500M+ users', locations='Global', city='Redmond', state=None), role_summary=RoleSummary(title='Principal Data Scientist', team_or_department='Search + Distribution (S+D) team', role_type='Full-time', remote='N/A'), responsibilities_and_qualifications=ResponsibilitiesAndQualifications(responsibilities=['Define, invent, and deliver online and offline behavioral and human labeled metrics which accurately measure the satisfaction and success of our customers interacting with Search.', 'Apply behavioral game theory and social science understanding to get the quality work out of crowd workers from around the world', 'Develop deep understanding of business metrics such as daily active users, query share, click share and query volume across all the relevant entry points', 'Influence the product and business direction through metrics analyses', 'Define and build systems and policies to ensure quality, stable, and performant code'], required_qualifications=['Doctorate in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or related field AND 5+ year(s) data-science experience.', \"OR Master's Degree in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or related field AND 7+ years data-science experience.\", \"OR Bachelor's Degree in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or related field AND 10+ years data-science experience.\", 'OR equivalent experience.'], preferred_qualifications=['6+ years of experience coding in Python, C++, C#, C or Java.', 'Customer focused, strategic, drives for results, is self-motivated, and has a propensity for action.', 'Organizational, analytical, data science skills and intuition.', 'Problem solver: ability to solve problems that the world has not solved before', 'Interpersonal skills: cross-group and cross-culture collaboration.', 'Experience with real world system building and data collection, including design, coding and evaluation.']), compensation_and_benefits=CompensationAndBenefits(salary_or_pay_range='USD $133,600 - $256,800 per year', bonus_and_equity='Competitive compensation package', benefits=['health insurance', 'dental and vision coverage', 'retirement plans (401k, pension)', 'paid time off (vacation, sick days, holidays)', 'parental leave'], perks=['free food or snacks', 'commuter benefits', 'wellness programs', 'learning and development stipends', 'employee discounts']))"
|
1189 |
+
]
|
1190 |
+
},
|
1191 |
+
"execution_count": 158,
|
1192 |
+
"metadata": {},
|
1193 |
+
"output_type": "execute_result"
|
1194 |
+
}
|
1195 |
+
],
|
1196 |
+
"source": [
|
1197 |
+
"import json\n",
|
1198 |
+
"\n",
|
1199 |
+
"def pretty_print_pydantic(obj):\n",
|
1200 |
+
" print(json.dumps(obj.dict(), indent=4))\n",
|
1201 |
+
"\n",
|
1202 |
+
"# Example usage\n",
|
1203 |
+
"pretty_print_pydantic(jobdesc)\n",
|
1204 |
+
"jobdesc"
|
1205 |
+
]
|
1206 |
+
},
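Once a posting parses cleanly, the obvious next step for the job-description database is persisting the structured result. A minimal sketch (the file name and extra key are placeholders, not part of the notebook):

```python
import json

record = jobdesc.dict()                 # pydantic v1 serialization, same call pretty_print_pydantic uses
record["retrieved_job_id"] = "example"  # placeholder identifier, not a real job_id
with open("extracted_jobs.jsonl", "a") as f:
    f.write(json.dumps(record) + "\n")
```

Appending one JSON object per line keeps the output easy to reload later, for example with `pd.read_json("extracted_jobs.jsonl", lines=True)`.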
|
1207 |
+
{
|
1208 |
+
"cell_type": "code",
|
1209 |
+
"execution_count": 95,
|
1210 |
+
"metadata": {},
|
1211 |
+
"outputs": [
|
1212 |
+
{
|
1213 |
+
"name": "stdout",
|
1214 |
+
"output_type": "stream",
|
1215 |
+
"text": [
|
1216 |
+
"('The Search + Distribution (S+D) team is the leading applied artificial '\n",
|
1217 |
+
" 'intelligence team at Microsoft responsible for delivering the quality '\n",
|
1218 |
+
" 'experience to over 500M+ monthly active users around the world in '\n",
|
1219 |
+
" 'Microsoft’s search engine, Bing. Our responsibilities include delivering '\n",
|
1220 |
+
" 'competitive search results, differentiated experiences, and product and '\n",
|
1221 |
+
" 'business growth. We are constantly applying the... latest state of the art '\n",
|
1222 |
+
" 'AI technologies to our product and also transferring this technology to '\n",
|
1223 |
+
" 'other groups across the company.\\n'\n",
|
1224 |
+
" '\\n'\n",
|
1225 |
+
" 'We sre seeking experienced data scientist to solve cutting-edge metrics and '\n",
|
1226 |
+
" 'measurement problems in the space of Search, and lead cross-team '\n",
|
1227 |
+
" 'initiatives. We believe metrics play a key role in executing on the strategy '\n",
|
1228 |
+
" 'for building the final product.\\n'\n",
|
1229 |
+
" '\\n'\n",
|
1230 |
+
" 'A critical part of the role is to advance our A/B experimentation '\n",
|
1231 |
+
" 'capabilities for Bing and Microsoft Copilot by introducing advanced, '\n",
|
1232 |
+
" 'powerful functionality at very large scale to eventually increase '\n",
|
1233 |
+
" 'experimenter agility and depth of insights, and reduce infrastructure cost '\n",
|
1234 |
+
" 'through smart design of data structures and computation methods. The role '\n",
|
1235 |
+
" 'requires not only skills in data science, but also knowledge in data '\n",
|
1236 |
+
" 'engineering and systems.\\n'\n",
|
1237 |
+
" '\\n'\n",
|
1238 |
+
" 'You will work closely with multiple teams across S+D and beyond to build a '\n",
|
1239 |
+
" 'measurement strategy and roadmap towards measuring how relevant, fresh, and '\n",
|
1240 |
+
" 'authoritative our results are while being strategically differentiated from '\n",
|
1241 |
+
" 'our biggest competitors. We expect you to work with Microsoft Research and '\n",
|
1242 |
+
" 'the rest of academia to unravel complex problems in our products and push '\n",
|
1243 |
+
" 'the limits of what AI can do for our customers. The world needs credible '\n",
|
1244 |
+
" 'alternatives to find authoritative information on the web, so there is '\n",
|
1245 |
+
" 'social responsibility.\\n'\n",
|
1246 |
+
" '\\n'\n",
|
1247 |
+
" 'This Principal Data Scientist position is a very strategic position part of '\n",
|
1248 |
+
" 'the S+D Bing Metrics and Analytics team. S+D itself is part of the broader '\n",
|
1249 |
+
" 'Windows and Web Experiences Team (WWE) and this position will collaborate '\n",
|
1250 |
+
" 'with and influence other data science and metrics groups in WWE such as '\n",
|
1251 |
+
" 'Edge, MS Start, Maps, Bing Ads and more. If you are passionate about working '\n",
|
1252 |
+
" 'on the latest and hottest areas that will help you develop skills in '\n",
|
1253 |
+
" 'Artificial Intelligence, Machine Learning, data science, scale systems, UX, '\n",
|
1254 |
+
" 'and product growth, this is the team you’re looking for!\\n'\n",
|
1255 |
+
" '\\n'\n",
|
1256 |
+
" 'Microsoft’s mission is to empower every person and every organization on the '\n",
|
1257 |
+
" 'planet to achieve more. As employees we come together with a growth mindset, '\n",
|
1258 |
+
" 'innovate to empower others, and collaborate to realize our shared goals. '\n",
|
1259 |
+
" 'Each day we build on our values of respect, integrity, and accountability to '\n",
|
1260 |
+
" 'create a culture of inclusion where everyone can thrive at work and beyond. '\n",
|
1261 |
+
" 'In alignment with our Microsoft values, we are committed to cultivating an '\n",
|
1262 |
+
" 'inclusive work environment for all employees to positively impact our '\n",
|
1263 |
+
" 'culture every day.\\n'\n",
|
1264 |
+
" '\\n'\n",
|
1265 |
+
" 'Responsibilities\\n'\n",
|
1266 |
+
" '• Define, invent, and deliver online and offline behavioral and human '\n",
|
1267 |
+
" 'labeled metrics which accurately measure the satisfaction and success of our '\n",
|
1268 |
+
" 'customers interacting with Search.\\n'\n",
|
1269 |
+
" '• Apply behavioral game theory and social science understanding to get the '\n",
|
1270 |
+
" 'quality work out of crowd workers from around the world\\n'\n",
|
1271 |
+
" '• Develop deep understanding of business metrics such as daily active users, '\n",
|
1272 |
+
" 'query share, click share and query volume across all the relevant entry '\n",
|
1273 |
+
" 'points\\n'\n",
|
1274 |
+
" '• Influence the product and business direction through metrics analyses\\n'\n",
|
1275 |
+
" '• Define and build systems and policies to ensure quality, stable, and '\n",
|
1276 |
+
" 'performant code\\n'\n",
|
1277 |
+
" '• Lead a team through analysis, design and code review that guarantee '\n",
|
1278 |
+
" 'analysis and code quality and allow more junior members to learn and grow '\n",
|
1279 |
+
" 'their expertise while helping the team build an inclusive interdisciplinary '\n",
|
1280 |
+
" 'culture where everyone can do their best work\\n'\n",
|
1281 |
+
" '• Make independent decisions for the team and handle difficult tradeoffs\\n'\n",
|
1282 |
+
" '• Translate strategy into plans that are clear and measurable, with progress '\n",
|
1283 |
+
" 'shared out monthly to stakeholders\\n'\n",
|
1284 |
+
" '• Partner effectively with program management, engineers, finance, '\n",
|
1285 |
+
" 'marketing, exec management, and other areas of the business\\n'\n",
|
1286 |
+
" '\\n'\n",
|
1287 |
+
" 'Qualifications\\n'\n",
|
1288 |
+
" '\\n'\n",
|
1289 |
+
" 'Required Qualifications:\\n'\n",
|
1290 |
+
" '• Doctorate in Data Science, Mathematics, Statistics, Econometrics, '\n",
|
1291 |
+
" 'Economics, Operations Research, Computer Science, or related field AND 5+ '\n",
|
1292 |
+
" 'year(s) data-science experience (e.g., managing structured and unstructured '\n",
|
1293 |
+
" 'data, applying statistical techniques and reporting results)\\n'\n",
|
1294 |
+
" \"• OR Master's Degree in Data Science, Mathematics, Statistics, Econometrics, \"\n",
|
1295 |
+
" 'Economics, Operations Research, Computer Science, or related field AND 7+ '\n",
|
1296 |
+
" 'years data-science experience (e.g., managing structured and unstructured '\n",
|
1297 |
+
" 'data, applying statistical techniques and reporting results)\\n'\n",
|
1298 |
+
" \"• OR Bachelor's Degree in Data Science, Mathematics, Statistics, \"\n",
|
1299 |
+
" 'Econometrics, Economics, Operations Research, Computer Science, or related '\n",
|
1300 |
+
" 'field AND 10+ years data-science experience (e.g., managing structured and '\n",
|
1301 |
+
" 'unstructured data, applying statistical techniques and reporting results)\\n'\n",
|
1302 |
+
" '• OR equivalent experience.\\n'\n",
|
1303 |
+
" '\\n'\n",
|
1304 |
+
" 'Preferred Qualifications:\\n'\n",
|
1305 |
+
" '• 6+ years of experience coding in Python, C++, C#, C or Java.\\n'\n",
|
1306 |
+
" '• Customer focused, strategic, drives for results, is self-motivated, and '\n",
|
1307 |
+
" 'has a propensity for action.\\n'\n",
|
1308 |
+
" '• Organizational, analytical, data science skills and intuition.\\n'\n",
|
1309 |
+
" '• Problem solver: ability to solve problems that the world has not solved '\n",
|
1310 |
+
" 'before\\n'\n",
|
1311 |
+
" '• Interpersonal skills: cross-group and cross-culture collaboration.\\n'\n",
|
1312 |
+
" '• Experience with real world system building and data collection, including '\n",
|
1313 |
+
" 'design, coding and evaluation.\\n'\n",
|
1314 |
+
" '\\n'\n",
|
1315 |
+
" 'Data Science IC5 - The typical base pay range for this role across the U.S. '\n",
|
1316 |
+
" 'is USD $133,600 - $256,800 per year. There is a different range applicable '\n",
|
1317 |
+
" 'to specific work locations, within the San Francisco Bay area and New York '\n",
|
1318 |
+
" 'City metropolitan area, and the base pay range for this role in those '\n",
|
1319 |
+
" 'locations is USD $173,200 - $282,200 per year.\\n'\n",
|
1320 |
+
" '\\n'\n",
|
1321 |
+
" 'Certain roles may be eligible for benefits and other compensation. Find '\n",
|
1322 |
+
" 'additional benefits and pay information here: '\n",
|
1323 |
+
" 'https://careers.microsoft.com/us/en/us-corporate-pay\\n'\n",
|
1324 |
+
" '\\n'\n",
|
1325 |
+
" '#WWE# #SearchDistribution# #Bing#\\n'\n",
|
1326 |
+
" '\\n'\n",
|
1327 |
+
" 'Microsoft is an equal opportunity employer. Consistent with applicable law, '\n",
|
1328 |
+
" 'all qualified applicants will receive consideration for employment without '\n",
|
1329 |
+
" 'regard to age, ancestry, citizenship, color, family or medical care leave, '\n",
|
1330 |
+
" 'gender identity or expression, genetic information, immigration status, '\n",
|
1331 |
+
" 'marital status, medical condition, national origin, physical or mental '\n",
|
1332 |
+
" 'disability, political affiliation, protected veteran or military status, '\n",
|
1333 |
+
" 'race, ethnicity, religion, sex (including pregnancy), sexual orientation, or '\n",
|
1334 |
+
" 'any other characteristic protected by applicable local laws, regulations and '\n",
|
1335 |
+
" 'ordinances. If you need assistance and/or a reasonable accommodation due to '\n",
|
1336 |
+
" 'a disability during the application process, read more about requesting '\n",
|
1337 |
+
" 'accommodations')\n"
|
1338 |
+
]
|
1339 |
+
}
|
1340 |
+
],
|
1341 |
+
"source": [
|
1342 |
+
"import pprint\n",
|
1343 |
+
"pp = pprint.PrettyPrinter(width=80)\n",
|
1344 |
+
"pp.pprint(test_description)\n"
|
1345 |
+
]
|
1346 |
+
},
|
1347 |
+
{
|
1348 |
+
"cell_type": "code",
|
1349 |
+
"execution_count": null,
|
1350 |
+
"metadata": {},
|
1351 |
+
"outputs": [],
|
1352 |
+
"source": []
|
1353 |
+
}
|
1354 |
+
],
|
1355 |
+
"metadata": {
|
1356 |
+
"kernelspec": {
|
1357 |
+
"display_name": "du_ds_tools",
|
1358 |
+
"language": "python",
|
1359 |
+
"name": "python3"
|
1360 |
+
},
|
1361 |
+
"language_info": {
|
1362 |
+
"codemirror_mode": {
|
1363 |
+
"name": "ipython",
|
1364 |
+
"version": 3
|
1365 |
+
},
|
1366 |
+
"file_extension": ".py",
|
1367 |
+
"mimetype": "text/x-python",
|
1368 |
+
"name": "python",
|
1369 |
+
"nbconvert_exporter": "python",
|
1370 |
+
"pygments_lexer": "ipython3",
|
1371 |
+
"version": "3.11.9"
|
1372 |
+
}
|
1373 |
+
},
|
1374 |
+
"nbformat": 4,
|
1375 |
+
"nbformat_minor": 2
|
1376 |
+
}
|