
I am trying to capture the text of an element that can appear under one of two XPaths, but I can't make it happen.

I have tried all of the methods below with no luck, and I get this error whenever one of the XPaths is not found:

Exception has occurred: NoSuchElementException Message: no such element: Unable to locate element:

The target element's XPath can be either of the following; the only difference is the index of the middle div (7 or 8).

  1. /div/div[2]/div[1]/div[7]/div[2]/span/div/span
  2. /div/div[2]/div[1]/div[8]/div[2]/span/div/span

Appreciate your help and suggestions. Here is what I have tried:

# Attempt 1: if/else -- find_element raises NoSuchElementException before the
# else branch can run when div[7] is absent
if driver.find_element("xpath", '//*[@id="system"]/div/div[2]/div[1]/div[7]/div[2]/span/div/span'):
    antivirus = driver.find_element("xpath", '//*[@id="system"]/div/div[2]/div[1]/div[7]/div[2]/span/div/span').text
else:
    antivirus = driver.find_element("xpath", '//*[@id="system"]/div/div[2]/div[1]/div[8]/div[2]/span/div/span').text

# Attempt 2: try/except with a bare except -- falls back to the div[8] XPath
# when the div[7] lookup fails
try:
    antivirus = driver.find_element("xpath", '//*[@id="system"]/div/div[2]/div[1]/div[7]/div[2]/span/div/span').text
except:
    antivirus = driver.find_element("xpath", '//*[@id="system"]/div/div[2]/div[1]/div[8]/div[2]/span/div/span').text

# Attempt 3: len() check -- find_element (singular) returns one WebElement
# (or raises), not a list
if len(driver.find_element("xpath", '//*[@id="system"]/div/div[2]/div[1]/div[7]/div[2]/span/div/span')) > 0:
    antivirus = driver.find_element("xpath", '//*[@id="system"]/div/div[2]/div[1]/div[7]/div[2]/span/div/span').text
else:
    antivirus = driver.find_element("xpath", '//*[@id="system"]/div/div[2]/div[1]/div[8]/div[2]/span/div/span').text

The part of the HTML I'm working with:

<div id="system" class="CardStyled">
<div><div> and more <div>
<div class="ant-row" style="margin-left: -10px; margin-right: -10px; margin-bottom: 2px; row-gap: 0px;"> 

    <div class="ant-col ant-col-12" style="padding-left: 10px; padding-right: 10px;"> 
        <span class="Title--13takrg">Antivirus Product</span> 

    </div> 
    <div class="ant-col ant-col-12" style="padding-left: 10px; padding-right: 10px;"> 

        <span>Windows ATP</span> 
    </div> 
</div> 
  • Is the URL of the site public? If yes, share it in the question. Otherwise you need to share the HTML of the targeted element and the area around it, as text. Please read How to create a Minimal, Reproducible Example
    – Shawn
    Commented Jul 2 at 13:11
  • The plural form should work: if len(driver.find_elements("xpath",'//*[@id="system"]/div/div[2]/div[1]/div[7]/div[2]/span/div/span')) > 0:
    – LMC
    Commented Jul 2 at 14:07
  • @LMC Many thanks, it worked by using the plural form of find_elements. Can you please explain how it makes a difference?
    – Arash
    Commented Jul 3 at 12:13
  • The singular form returns a single WebElement or throws an error. The plural form returns a list of WebElements, which is empty when nothing is found, but no error is raised. (A sketch of this approach follows these comments.)
    – LMC
    Commented Jul 3 at 12:36
  • Is it possible to replace div[number] with a variable and put the lookup inside a loop that checks 0 to 10?
    – Arash
    Commented Jul 4 at 2:29
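
A minimal sketch of the plural-form approach discussed above, combined with the loop idea from the last comment (driver and the XPath come from the question; the loop bounds and variable names are assumptions):

antivirus = None
# XPath positions are 1-based, so checking "0 to 10" effectively means div[1] .. div[10]
for i in range(1, 11):
    xpath = f'//*[@id="system"]/div/div[2]/div[1]/div[{i}]/div[2]/span/div/span'
    # find_elements (plural) returns an empty list instead of raising NoSuchElementException
    matches = driver.find_elements("xpath", xpath)
    if matches:
        antivirus = matches[0].text
        break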

1 Answer


You could try using the following single XPath expression to return the span element (broken into multiple lines for clarity):

(
   //*[@id="system"]/div/div[2]/div[1]/div[position()=7 or position()=8]
      /div[2]/span/div/span
)[1]

or equivalently:

(
   //*[@id="system"]/div/div[2]/div[1]/div[7]/div[2]/span/div/span |
   //*[@id="system"]/div/div[2]/div[1]/div[8]/div[2]/span/div/span
)[1]

In both examples, the expression inside the parentheses returns span elements whose ancestor div was either the 7th or 8th div child of its parent div. Potentially you might get two such span elements, but the final [1] at the end of the expression, outside the ( ... ), filters the list to return only the first of them.
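
Plugged into the Selenium call from the question, that could look roughly like the sketch below (driver and the "xpath" locator string are taken from the question; this is only one possible way to wire it up):

antivirus = driver.find_element(
    "xpath",
    '(//*[@id="system"]/div/div[2]/div[1]'
    '/div[position()=7 or position()=8]/div[2]/span/div/span)[1]'
).text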

  • But actually I suspect that you may be able to specify a different XPath query to return the span you want; a query which doesn't involve traversing the entire hierarchy of div containers. There are an infinite number of XPath expressions which will return the span you want, and it seems likely to me that some of those expressions will be more concise and clear than the expressions you've used so far. Commented Jul 13 at 12:42
  • Many thanks for the insights. The fact is that the target span sits inside heaps of div containers on a fully dynamic website. Although using the plural form of find_elements has solved the issue, can you please point me in the right direction for a better way of locating a span buried under tons of containers?
    – Arash
    Commented Jul 14 at 11:34
  • You're welcome @Arash ! Regarding the more general point about how to write more focused XPath expressions, it's hard to be specific without knowing the content you are working with, but you might want to check out this comment I made on an ancient question: stackoverflow.com/questions/3030487/… - in short, an XPath that identifies a particular element (such as your span) by virtue of what it does or what it means, rather than where it appears in the document is likely to be more robust and also simpler. Commented Jul 14 at 12:31
  • A couple of quick hints: 1) Use the descendant axis (or the short-hand //) to skip over many layers of container elements, rather than stepping through every one of those layers using the default child axis. 2) Query for a meaningful pattern of container elements, attributes, and content, e.g. //div[contains(@class, 'foo')][1]//span[preceding-sibling::label[1]='bar'] to find a particular div somewhere in the document, and within it, a particular span which follows a particular label. (A sketch along these lines follows these comments.) Commented Jul 14 at 13:18
  • Hi Conal, with your insights, I have learned much more about XPath. I am also trying to use your suggested approach. Can you please guide me on how to use the above expression within driver.find_element if I want to extract the text of the target span element? variable = driver.find_element("xpath",'(//*[@id="system"]/div/div[2]/div[1]/div[position()=7 or position()=8]/div[2]/span/div/span)[1]').text
    – Arash
    Commented Jul 16 at 6:26
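
Following the hints in the comments above, a meaning-based locator for the HTML shown in the question might look like the sketch below. It anchors on the visible "Antivirus Product" label rather than on div positions; the exact nesting around the value span is an assumption, since the question's HTML snippet is abbreviated:

value_xpath = (
    '//div[@id="system"]'
    '//span[normalize-space()="Antivirus Product"]'  # the label cell
    '/parent::div/following-sibling::div[1]//span'   # the value cell next to it
)
antivirus = driver.find_element("xpath", value_xpath).text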
