Macro Express Forums

Sequential Google searches from a CSV file



I have a CSV file with a set of Google searches I want to make, one search per line. I'm trying to write a macro that runs each search, waits for a confirmation from me (preferably an always-on-top pop-up box in the lower right corner, but ALT, SHIFT, or CTRL as a trigger key would also be fine), closes the window (if necessary), and continues to the next line. I found this post on the forums, but it's far too specific to the original poster's needs and I can't figure out what it's doing.

 

Alternatively, if there's another freeware program that is built to do the same thing, I'm fine with that as well.


  • 2 weeks later...

Sorry it's taken me so long to respond. I've been very busy catching up on hours for clients after vacation.

 

When I started writing macros for hire, my first clients wanted web scrapers to port data from one system to another. Before that I had successfully automated many web activities that only involved a few page hits, but when I got into processing large batches I quickly learned how difficult it would be. If you can blindly fire away at a web page and are tolerant of a few misses every once in a while, MEP is a really good product. For example, I have a macro for one client that opens the client's web page and navigates to a certain page. But when you try to run large numbers of iterations, it quickly goes off the rails. It's not really MEP's fault: you don't have effective feedback from the web browser, and eventually the page will lag, something will cause the window to lose focus, or any number of other things will happen. I spent years working on methods to handle these problems: detecting whether the page is truly loaded, whether we actually made it to the right page, checking that the fields are all populated properly, and so forth. In most cases, what was 50 lines of core code became thousands to make it work reliably many times over.

 

But it all depends on exactly what you are doing and how tolerant you are of misfires. If you are OK with longer run times, open the web page and close it for each line of your CSV file; starting fresh each time helps a lot. I would also try to pass the search terms directly in the URL. I don't know if Google still supports this, as it's been years since I've done it that way. You can also do a "View Source" after the page opens to get the raw HTML, which you can then parse. But give it a try and post back with your results and problems, and we can go from there.
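If you want to experiment with the "terms in the URL" route outside of MEP first, here is a minimal sketch as a standalone VBScript. It is only an illustration of the idea: the file path, the q= parameter, and the system-modal prompt standing in for the confirmation step are my assumptions, not something from this thread.

Option Explicit
Dim fso, shell, stream, term, url
Set fso = CreateObject("Scripting.FileSystemObject")
Set shell = CreateObject("WScript.Shell")
Set stream = fso.OpenTextFile("C:\searches.csv", 1)   ' 1 = ForReading

Do Until stream.AtEndOfStream
    term = Trim(stream.ReadLine)
    If Len(term) > 0 Then
        ' Google accepts the query in the q= parameter;
        ' Escape() percent-encodes spaces and most punctuation.
        url = "https://www.google.com/search?q=" & Escape(term)
        shell.Run url        ' opens in the default browser
        ' System-modal so the prompt stays on top, per the original request
        MsgBox "Close the search window, then click OK for the next term.", _
               vbSystemModal, "Next search"
    End If
Loop

stream.Close

Run it with wscript.exe so the MsgBox prompts appear; each OK click moves on to the next line of the file. The same loop could be rebuilt inside MEP by processing the file line by line with a prompt between searches.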

 

There is also a dedicated scraping program out there; I can't remember the name, but I checked it out once and it wasn't quite up to my needs. What I eventually did was start using VBScripts to gather the raw HTML and parse it, and eventually I went all in and learned VB.NET so I could build custom applications for my scrapers. They all use the HTTPRequest/Response classes to invisibly grab the page contents, and then I use RegEx to parse the data. It's all very cool and very fast. Depending on the web server on the other end and the type of requests, I can often process 3 records a second at 2 page hits each, i.e. 6 page hits a second. This is important because in some cases we have over half a million records to do. It took me some time to learn, but it's well worth it: no timing errors, no browser load times, just pure data. If you are interested in something like this, please feel free to visit my web page and contact me, or send me a PM.
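For anyone curious what the "grab the raw HTML and parse it with RegEx" step looks like at its simplest, here is a VBScript sketch using MSXML2.XMLHTTP (the poster's VB.NET applications use the .NET request/response classes instead; the URL and the title pattern below are just placeholders):

Option Explicit
Dim http, re, matches, m

Set http = CreateObject("MSXML2.XMLHTTP")
http.Open "GET", "https://www.example.com/", False   ' False = synchronous request
http.Send

Set re = New RegExp
re.Pattern = "<title>([^<]*)</title>"   ' placeholder pattern: grab the page title
re.IgnoreCase = True
re.Global = True

Set matches = re.Execute(http.responseText)
For Each m In matches
    WScript.Echo m.SubMatches(0)        ' first capture group
Next

Because nothing is rendered in a browser, there are no load times or focus problems to work around; you only pay for the HTTP round trip and the parsing.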

