Your question is quite broad, but I'll try to help.
You can use a combination of Python + Selenium to browse pages and populate forms. If you need to collect information from the opened pages, you can also use the BeautifulSoup library to extract it.
To start the browser:

from selenium import webdriver

driver = webdriver.Chrome()
To open a URL (it must include the scheme, e.g. https://, or Selenium raises an error):

driver.get("https://www.url.com")
To find a field and fill it with the value "MeuNome" (in Selenium 4 the find_element_by_* helpers were removed, so use find_element with a By locator):

from selenium.webdriver.common.by import By

text = driver.find_element(By.XPATH, '//input[@id="primeiro_nome"]')
text.send_keys('MeuNome')
To press the continue button:

button = driver.find_element(By.XPATH, '//input[@id="btn_continuar"]')
button.click()
To collect data from a page, you can create a soup with the content of the page opened in the Selenium webdriver:

from bs4 import BeautifulSoup

html_source = driver.page_source
soup = BeautifulSoup(html_source, 'lxml')
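For example, once the soup is built you can pull values out of the parsed HTML with find. A minimal sketch — the static HTML below stands in for driver.page_source just for illustration, and only the primeiro_nome id comes from the form above; the rest is invented:

```python
from bs4 import BeautifulSoup

# Static HTML standing in for driver.page_source, for illustration only
html_source = """
<html><body>
  <input id="primeiro_nome" value="MeuNome">
  <span class="status">Cadastro concluido</span>
</body></html>
"""

# 'html.parser' is built in; 'lxml' also works if it is installed
soup = BeautifulSoup(html_source, 'html.parser')

campo = soup.find('input', id='primeiro_nome')   # locate by id
status = soup.find('span', class_='status')      # locate by CSS class

print(campo['value'])     # attribute of the tag -> MeuNome
print(status.get_text())  # text inside the tag -> Cadastro concluido
```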
After building a dataset with the information to fill in, you can use a loop to fill out the pages, one record per iteration:

for reg in registros:
    nome = reg[0]         # first name
    sobrenome = reg[1]    # last name
    aniversario = reg[2]  # birthday
    nickname = reg[3]
    abre_pagina()         # opens the form page
    preenche_dados()      # fills in the fields
    finaliza_cadastro()   # submits the registration
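The registros list can come from any source; one simple option is a CSV file read with the standard library. A sketch under assumptions — the column order (nome, sobrenome, aniversario, nickname), the sample rows, and the in-memory file are all invented for illustration; in practice you would open a real file and call the page-filling helpers inside the loop:

```python
import csv
import io

# In practice: with open('dados.csv') as arquivo: ...
# Here an in-memory file with made-up rows, for illustration
arquivo = io.StringIO(
    "Maria,Silva,01/02/1990,mari\n"
    "Joao,Souza,03/04/1985,jo\n"
)

# Each row becomes one tuple: (nome, sobrenome, aniversario, nickname)
registros = [tuple(linha) for linha in csv.reader(arquivo)]

for reg in registros:
    nome = reg[0]
    sobrenome = reg[1]
    # ...same unpacking as above, then call the helpers that
    # open the page, fill the fields, and submit the form
    print(nome, sobrenome)
```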