Convert a scraped list to a pandas DataFrame using columns and index

The scraping process (looping over all the given links) looks like:

for url in urls:
    # fetch and process the page here, acquiring one car's info per page

and the output:


At the end, how can I get a DataFrame like the one below, given that I don't know the number of car fields (columns) or the number of cars (rows) in advance, but still want them as the DataFrame's columns and index?


BMW   | red   50   120   200
HONDA | blue  60         160   OMEGA

any help appreciated 🙂


This approach creates a list of dicts as we iterate through the urls, and after the loop converts that list to a DataFrame. I'm assuming that car_table always alternates column name and value, over and over again.

import pandas as pd
import numpy as np

# Creating lists from your output instead of requesting from the url since you didn't share that
car_names = ['BMW', 'FORD', 'HONDA']
car_tables = [  # alternating column/value pairs, reconstructed from the output shown below
    ['color', 'red', 'weight', '50kg', 'height', '120cm', 'width', '200cm', 'serial', '', 'owner', ''],
    ['color', '', 'weight', '', 'height', '', 'width', '', 'serial', '', 'owner', ''],
    ['color', 'blue', 'weight', '60kg', 'height', '', 'width', '160cm', 'serial', 'OMEGA', 'owner', ''],
]
urls = range(len(car_names))

all_car_data = []
for url in urls:
    car_name = car_names[url]   # using the prepared list instead of scraping, for this example
    car_table = car_tables[url] # again, in your case you'd get this value from the page
    car_data = {'name':car_name}
    columns = car_table[::2] #starting from 0, skip every other entry to just get the columns
    values = car_table[1::2] #starting from 1, skip every other entry to just get the values
    #Zip the columns together with the values, then iterate and update the dict
    for col,val in zip(columns,values):
        car_data[col] = val
    # Add the dict to a list to keep track of all the cars
    all_car_data.append(car_data)
# Convert to a dataframe
df = pd.DataFrame(all_car_data)
#df = df.replace({'': np.nan})  # use this if you want to replace the '' with NaNs


    name color weight height  width serial owner
0    BMW   red   50kg  120cm  200cm
1   FORD
2  HONDA  blue   60kg         160cm  OMEGA
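If you want the car names as the index (as in the table in the question) rather than the default 0, 1, 2, ..., you can set the name column as the index after building the frame. A minimal sketch, using hypothetical rows standing in for the scraped data:

```python
import pandas as pd

# Hypothetical rows; pd.DataFrame aligns dicts with different keys
# automatically, filling missing fields with NaN.
all_car_data = [
    {'name': 'BMW', 'color': 'red', 'weight': '50kg'},
    {'name': 'HONDA', 'color': 'blue', 'owner': 'OMEGA'},
]

df = pd.DataFrame(all_car_data).set_index('name')
print(df)
```

This way neither the columns nor the index need to be known up front; pandas infers both from whatever keys the scraped dicts happen to contain.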