Python Script Alteration & OpenAI API

Closed · Posted about a year ago · Paid on delivery

I am looking for a Python developer to make alterations to my existing script and integrate it with the OpenAI ChatGPT API. The specific requirements for the project are as follows:

- Alteration of the input/output format in the Python script

- Integration of new functionalities as per the provided requirements and examples

- Optimization of the existing code for improved performance
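
As one example of the kind of optimization in scope (a sketch only, based on the script further down): _scrape_article currently rebuilds the stopword set for every scraped link, which could instead be cached once per language, for instance:

    from functools import lru_cache
    from nltk.corpus import stopwords

    @lru_cache(maxsize=None)
    def _stop_words(lang: str) -> frozenset:
        # Build the stopword set once per language instead of once per scraped link.
        return frozenset(stopwords.words(lang))

    # Inside _scrape_article, set(stopwords.words(_lang)) would then become:
    # stop_words = _stop_words(_lang)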

Ideal skills and experience for this job include:

- Proficiency in Python programming

- Experience with OpenAI APIs, specifically ChatGPT

- Strong understanding of input/output formatting in Python scripts

- Ability to optimize code for improved performance
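
For the ChatGPT integration mentioned above, here is a minimal sketch of the kind of call it could make with the official openai client (the model name, prompt wording, helper name, and OPENAI_API_KEY variable are assumptions, not part of the existing script):

    import os
    from openai import OpenAI

    # Assumes OPENAI_API_KEY is set in the environment (e.g. via the existing .env file).
    client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

    def suggest_headings(keywords):
        # Hypothetical helper: ask ChatGPT for article headings based on scraped keywords.
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",  # assumed model; use whichever model is agreed on
            messages=[
                {"role": "system", "content": "You are an SEO content assistant."},
                {"role": "user", "content": "Suggest article headings for these keywords: " + ", ".join(keywords)},
            ],
        )
        return response.choices[0].message.content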

Here is the script from [login to view URL] - [login to view URL]

I want to write the results from this into a text file instead of sending them to the front end.
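
As a rough illustration of that change (the file name and JSON layout here are assumptions, not part of the script), a small helper like the one below could be called where the routes currently return the result to the front end; either the /search/ or the /scrape/ response dict could be passed to it:

    import json
    from datetime import datetime

    def save_result_to_text(result: dict, prefix: str = "search") -> str:
        # Hypothetical helper: dump the result dict to a timestamped text file
        # instead of returning it in the API response.
        filename = f"{prefix}_result_{datetime.now():%Y%m%d_%H%M%S}.txt"
        with open(filename, "w", encoding="utf-8") as f:
            json.dump(result, f, ensure_ascii=False, indent=2)
        return filename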

from serpapi import GoogleSearch
import os
import requests
# import random
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware  # needed for the CORS middleware set up below
from dotenv import load_dotenv
from concurrent.futures import ThreadPoolExecutor
import asyncio
import newspaper
from bs4 import BeautifulSoup
from nltk.corpus import stopwords
from nltk.tokenize import word_tokenize
from nltk import bigrams, trigrams
import nltk
import operator  # needed for itemgetter() in _getTopKeywords
from .[login to view URL] import add_post, save_links, get_post, get_all_post, remove_post, update_post

# Load env
load_dotenv()

SERPAPI_KEY = os.getenv('c0797abfd638908bf4cadfb3c2522a2865d2d39f33d0d627411dc858332f6329')

app = FastAPI()

# Setup CORS
origins = [
    "http://localhost",
    "http://localhost:3000",
]

app.add_middleware(
    CORSMiddleware,
    allow_origins=origins,
    # allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

nltk.download('punkt')
nltk.download('stopwords')

# API Routes
@app.get("/")
def read_root():
    return {"Hello": "World"}


@app.post('/search/')
def search(search: dict):
    # 1. Get Google search results via SerpApi
    googleSearch = GoogleSearch({
        "api_key": SERPAPI_KEY,
        "engine": "google",
        "num": 20,
        "q": search['keyword'],                   # e.g. "botox melbourne"
        "hl": search['searchLang'],               # e.g. "english"
        "gl": search['searchLocation'],           # e.g. "melbourne"
        "google_domain": search['googleDomain'],  # e.g. "[login to view URL]"
        # "location": location,
    })
    result = googleSearch.get_dict()

    # 2. Get Google autocomplete suggestions
    autocompleteSearch = GoogleSearch({
        "api_key": SERPAPI_KEY,
        "engine": "google_autocomplete",
        "q": search['keyword']
    })
    autocompleteResult = autocompleteSearch.get_dict()
    result['autocomplete'] = autocompleteResult['suggestions']

    post = add_post(
        title=search['keyword'],
        search_query={
            "q": search['keyword'],
            "hl": search['searchLang'],
            "gl": search['searchLocation'],
            "googleDomain": search['googleDomain'],
        },
        search_result=result
    )

    return {
        "post_id": post.id,
        "search_result": result
    }

@app.post('/links/')
def saveLinks(linkData: dict):
    post = save_links(
        post_id=linkData['postId'],
        choosen_links=linkData['choosenLinks']
    )
    return post


@app.get('/posts')
def getPosts():
    posts = get_all_post()
    return posts


@app.get('/posts/{post_id}')
def getPost(post_id: str):
    post = get_post(post_id)
    return post


@app.put('/posts/{post_id}')
def updatePost(post_id: str, data: dict):
    post = update_post(post_id, data)
    return post


@app.delete('/posts/{post_id}')
def deletePost(post_id: str):
    post = remove_post(post_id)
    return post

@app.get('/scrape/{post_id}')
async def scrape(post_id: str):
    post = get_post(post_id)

    # Return early if a scrape result already exists
    if post.links_scrape_result:
        return {
            "status": "success",
            "data": post.links_scrape_result
        }

    # Scrape links if new
    links = post.choosen_links
    lang = post.search_query['hl']
    contentInfo = []
    allKeywords = []

    with ThreadPoolExecutor() as executor:
        tasks = [asyncio.get_running_loop().run_in_executor(executor, _scrape_article, link, lang) for link in links]
        results = await asyncio.gather(*tasks)
        for result in results:
            if result:
                contentInfo.append(result)
                allKeywords.extend(result['keywords'])

    topKeywords = _getTopKeywords(allKeywords)
    averageWords = sum([content['totalWords'] for content in contentInfo]) / len(contentInfo)

    links_scrape_result = {
        "contentInfo": contentInfo,
        "topKeywords": topKeywords,
        "averageWords": averageWords
    }

    update_post(post_id, {
        "links_scrape_result": links_scrape_result
    })

    return {
        "status": "success",
        "data": links_scrape_result
    }

# Currently lang is taken from the hl query parameter.
# There is no guarantee it matches an available nltk language.
# [login to view URL]

# Newspaper config
# When using a proxy:
# config = newspaper.Config()
# config.proxies = {
#     'http': '[login to view URL]'  # sample only
# }
# config.request_timeout = 20

def _scrape_article(link, lang):
    # Add config=config to the parameters if adding a proxy/user agents
    # article = newspaper.Article(link, keep_article_html=True, config=config)
    article = newspaper.Article(link, keep_article_html=True)
    content = None
    try:
        article.download()
        article.parse()
        content = article.text
        content_html = article.article_html
        content_title = article.title

        # Current logic to detect whether the request was blocked:
        # check if the extracted text is very short
        if len(content) < 100:
            print("Scraping might be blocked! In case the exception block is not catching it")
            print('---- link ---- \n' + link)
            # Scrape from the Golang endpoint instead
            endpoint = "http://api-golang:8080/scrape?link=" + link
            response = requests.get(endpoint)
            if response.status_code == 200:
                data = response.json()
                content = data['content']
                content_html = data['content_html']
            else:
                print(f"Error {response.status_code}: {response.text}")

        soup = BeautifulSoup(content_html, 'html.parser')
        headings = []
        for heading in soup.find_all(['h1', 'h2', 'h3']):
            headings.append({
                "text": heading.text,
                "tag": heading.name
            })
    except:
        print("-----------")
        print("Error scrape. Todo: run manual fetch ", link)
        print("-----------")
        return None

    if content is None:
        return None

    _lang = 'english'
    if lang == 'id':
        _lang = 'english'
    # Later: add more languages, mapping the hl query -> nltk language
    # [login to view URL]

    text = content.lower()
    word_tokens = word_tokenize(text)
    stop_words = set(stopwords.words(_lang))
    filtered_text = [word for word in word_tokens if word.isalnum() and word not in stop_words]

    single_freq_dist = nltk.FreqDist(filtered_text)
    bigram_freq_dist = nltk.FreqDist(bigrams(filtered_text))
    trigram_freq_dist = nltk.FreqDist(trigrams(filtered_text))
    # Combine all frequency distributions
    freq_dist = single_freq_dist + bigram_freq_dist + trigram_freq_dist

    keywords = []
    for word, frequency in freq_dist.most_common(10):
        keywords.append({
            "word": word,
            "frequency": frequency
        })

    totalWords = len(text.split())

    return {
        "link": link,
        "totalWords": totalWords,
        "keywords": keywords,
        "title": content_title,
        "headings": headings,
    }

def _getTopKeywords(allKeywords):
    MAX_KEYWORD = 15
    topKeywords = {}
    for word_dict in allKeywords:
        # If word_dict['word'] is a tuple (bigram/trigram), convert it to a string
        _word = word_dict['word']
        if isinstance(_word, tuple):
            _word = ' '.join(word_dict['word'])
        word = _word.lower()
        if word in topKeywords:
            topKeywords[word] += word_dict['frequency']
        else:
            topKeywords[word] = word_dict['frequency']
    # Sort the dictionary by frequency in descending order and keep the top MAX_KEYWORD items
    sorted_topKeywords = sorted(topKeywords.items(), key=operator.itemgetter(1), reverse=True)[:MAX_KEYWORD]
    return sorted_topKeywords

______

We need to use the [login to view URL] app on GitHub, take the results from two aspects of it, and store them in a text file. We then need to be able to use those results in another Python script.

Can the team help? Thanks.
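
A sketch of how another Python script might pick those stored results back up, assuming they are written out as JSON text files by a helper like the one sketched earlier (the file names below are placeholders):

    import json

    def load_result(path: str) -> dict:
        # Hypothetical consumer: read a result previously written as JSON text.
        with open(path, "r", encoding="utf-8") as f:
            return json.load(f)

    if __name__ == "__main__":
        search_result = load_result("search_result_20240101_120000.txt")  # placeholder file name
        scrape_result = load_result("scrape_result_20240101_120000.txt")  # placeholder file name
        print(search_result.get("autocomplete"))
        print(scrape_result.get("topKeywords"))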

Python GitHub SQL

Project ID: #37631624

About the project

38 proposals · Remote project · Active 11 months ago

38 freelancers are bidding an average of $168 for this job

divumanocha

Hello Greetings, After going through your project description, I feel confident and excited to work on this project for you. But I have some crucial things and queries to clear out. Can you please leave a message on More

$250 AUD in 4 days
(7 reviews)
5.4
tangramua

Hello, We are 25 years in this business and our technical specialists have strong experience in Full Stack Development, Python, SQL, GitHub and other technologies related to your project. We carefully studied the des More

$301 AUD in 6 days
(40 reviews)
7.1
mobimubasir

Hello. I read your requirement and I will do that. Please come on chat and we will discuss more about this. I will be waiting for your reply.

$100 AUD in 1 day
(25 reviews)
5.1
rashidamjad

Hi there, I'm thrilled to apply for your Python Script Alteration & OpenAI API project. With 4-5 years of experience in GitHub, Python and SQL, I'm confident in my ability to bring valuable insights and expertise to More

$250 AUD in 8 days
(7 reviews)
4.8
Sidrairfan078

Hello sir, I hope you are good. I have read your job description; it's a doable job as per my experience and knowledge. I want to ask you a few questions about the job description. I am a full stack developer having a good experi More

$155 AUD in 8 days
(1 review)
4.8
arbu1499

Hello! After carefully reviewing your project and the provided code, I am offering my assistance as a Python developer to integrate your existing script with the OpenAI ChatGPT API. I understand the specific requirements. More

$150 AUD in 2 days
(10 reviews)
4.4
ITMed

Hi there, I am excited to share my expertise and skills in Python and OpenAI, which I have acquired over the past 3 years. I am confident that I can meet your requirements. I would be delighted to work with you and I l More

$140 AUD in 1 day
(9 reviews)
3.8
narukaconsultan5

Hi, I can help you make alterations to your existing script and integrate it with the OpenAI ChatGPT API. Message me for further discussion.

$60 AUD in 3 days
(7 reviews)
3.6
malkesh3m

⭐ Hi, My availability is immediate. I read your project post on Python Developer. We are experienced full-stack Python developers with skill sets in - Python, Django, Flask, FastAPI, Jupyter Notebook, Selenium, Data V More

$190 AUD in 2 days
(13 reviews)
3.7
MohmedAbdelwahab

As a Python developer well-versed in SQL, I'm ready to level up your Python script to meet your requirements in integrating with the OpenAI ChatGPT API and altering the input/output format. My expertise in Python, Pand More

$140 AUD in 7 days
(7 reviews)
3.2
shasanulgoni

===== AVAILABLE FOR IMMEDIATE WORK ======= Hi! I can make alterations to your existing script and integrate it with the OpenAI ChatGPT API. I'm working (part-time) as a full-stack developer for E-Vision Software Ltd, More

$100 AUD in 1 day
(3 reviews)
2.6
IcsfIT789

Hi dear client, I am a Data Scientist with great knowledge and enthusiasm in Python program development. I can write clean, validated Python code and make a device-supported .py file. I have over 10 years of experience w More

$50 AUD in 3 days
(2 reviews)
2.2
vipuls22

I am a data scientist with 2.5+ years of relevant experience and extensive knowledge of Python, NumPy, Pandas, PyTorch, Keras, scikit-learn, and TensorFlow. I have successfully implemented object detection algorithms More

$80 AUD in 7 days
(1 review)
2.2
DEVCIR

Hey, I can set up the script and alter it to store the result in a text file instead of sending it to the frontend. I will update the format of the file so that the data can be populated/fed to another script, and can integrate the More

$250 AUD in 7 days
(1 review)
1.8
paulbasht

Hi, hoping you are doing well. I'm glad to submit my proposal for your project. I have 5 years of experience developing Python projects. I have developed data entry programs and web scraping programs. And also I have h More

$140 AUD in 7 days
(1 review)
1.2
dukicsmiljana

Hello there Vihang S., Good morning! I’ve carefully checked your requirements and am really interested in this job. I’m a full stack node.js developer working on large-scale apps as a lead developer with U.S. and European More

$30 AUD in 4 days
(0 reviews)
0.0
VadymLes

❤️Hi Vihang S.❤️ ~101% satisfaction will be here~ I have carefully read your requirements and understand what you are looking for. This opportunity interests me because I have extensive experience and deep knowledge More

$120 AUD in 2 days
(0 reviews)
0.0
zardecka1984

Hello Vihang S., I’ve carefully checked your requirements and am really interested in this job. I can complete your project on time and you will experience great satisfaction with me. I have rich experience in SQL, Pyt More

$150 AUD in 2 days
(0 reviews)
0.0
toku3906

Hello, I am lucky to meet you today! Welcome. As a highly skilled Python/OpenAI API developer, I can help you perfectly. I am very confident in my skills and I'd like to help your business by doing my best. My clients More

$500 AUD in 1 day
(0 reviews)
0.0
abdullahmaqsoo58

Hello Vihang S., I went through your project description and it seems like I am a great fit for this job. As an expert with many years of experience in Python, SQL and GitHub, please come over to chat and discuss your requ More

$140 AUD in 7 days
(0 reviews)
0.0