Saving links to a list to avoid a Stale Element Error in Selenium

I click a button on a page, which loads the next page. There, I click another button, which opens a new window. After getting the data from the child window, I close it, return to the parent window, and then go back to the page with the links. The trouble is, I cannot save the links to a list, because the links look like this: <a href="javascript:jsDetalleDUA(&quot;40&quot;,&quot;410000009526&quot;,&quot;235&quot;,&quot;5947&quot;,&quot;4&quot;,&quot;2009&quot;,&quot;US&quot;);">LISTAR</a>

Of course, by then the elements have gone stale, so clicking the saved references fails, and driver.get(url) does not work either, since the href is a javascript: call rather than a navigable URL. Is there any way around this?

import pandas as pd
from bs4 import BeautifulSoup

parent_window = driver.current_window_handle  # handle of the original (links) window

link_l = driver.find_elements_by_xpath('//table[3]//a')
links = []
for li in link_l:
    link = li.get_attribute('href')  # This will not work: the href is a javascript: call, not a URL
    links.append(link)
for i in range(len(links)):
    driver.get(links[i])
    driver.find_element_by_xpath('/html/body/form[1]/table[2]/tbody/tr/td/table/tbody/tr/td/table[3]/tbody/tr[2]/td[3]/a').click()
    all_windows = driver.window_handles
    child_window = [window for window in all_windows if window != parent_window][0]
    driver.switch_to.window(child_window)
    page = driver.page_source
    soup = BeautifulSoup(page, 'html.parser')
    tables = soup.findAll('table')
    dfs = []
    for table in tables:
        df_x = pd.read_html(str(table))  # read_html returns a list of DataFrames
        print(df_x)
        df = pd.concat(df_x)
        dfs.append(df)
    df_f = pd.concat(dfs)
    df_f.to_csv(f'{i}result.csv')
    driver.close()
    driver.switch_to.window(parent_window)
    driver.back()

Answer

link_l = driver.find_elements_by_xpath('//table[3]//a')
for i in range(1, len(link_l)):
    # re-locate the elements on every iteration instead of reusing a stale reference
    driver.find_elements_by_xpath('//table[3]//a')[i].click()

Or this

driver.find_element_by_xpath('(//table[3]//a)[{}]'.format(i)).click()
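Note that XPath positions are 1-based, so with this second form i should run from 1 to len(link_l) inclusive.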

You could use an XPath-based approach: get the number of matching elements once, then on each pass re-locate the element by its position, click it, and navigate back. Because the element is looked up fresh after every driver.back(), the reference is never stale.
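
Putting that together with the window handling from your own code, a minimal sketch could look like the one below. It assumes the same XPaths as in the question, that clicking a link loads the detail page in the parent window, and that clicking the button there opens the child window; the output filenames are illustrative.

import pandas as pd
from bs4 import BeautifulSoup

parent_window = driver.current_window_handle
# count the matching links once, then re-locate each one by position
n_links = len(driver.find_elements_by_xpath('//table[3]//a'))

for i in range(1, n_links + 1):  # XPath positions are 1-based
    # fresh lookup on every iteration, so the reference is never stale
    driver.find_element_by_xpath('(//table[3]//a)[{}]'.format(i)).click()
    driver.find_element_by_xpath('/html/body/form[1]/table[2]/tbody/tr/td/table/tbody/tr/td/table[3]/tbody/tr[2]/td[3]/a').click()

    child_window = [w for w in driver.window_handles if w != parent_window][0]
    driver.switch_to.window(child_window)

    # scrape every table in the child window into one CSV per link
    soup = BeautifulSoup(driver.page_source, 'html.parser')
    dfs = [pd.concat(pd.read_html(str(t))) for t in soup.findAll('table')]
    pd.concat(dfs).to_csv(f'{i}result.csv')

    driver.close()
    driver.switch_to.window(parent_window)
    driver.back()  # back to the page with the links before the next iteration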