Hey Thomas, without being specific (in regards to your sister), what type of things were you automating for her? Does she have a website and you're backing it up, uploading her files, scraping some data and importing it in? I'm just wondering since you're saying that it's faster and you don't have to slow it down, so it'd be interesting to see what that means. With the scraping, say, from Google results that you did a few years ago, I wonder how that compares to now? But Derrinallum sounds great, and glad that you're able to incorporate it into automation... this builder is becoming a monster lol
Scraping court schedules for a list of judges on a local government site. My sister is a lawyer/supervisor, so she needed to know what cases she and her team were scheduled for during the week.
It took my sister a very long time to do it by hand.
Had to grab links to all the judges' schedules. Something like 8 or so judges for her area. Then loop through the links, one by one, to get each judge's schedule. The page has a date field at the top, so I set that field to Monday (of the week) and press Enter. The cases show up in a table. I scrape the table looking for specific letters within the case number, then click the Next button to load the next day. There can be multiple rows for the same case number, so I combine the data into one row when the case numbers match. Keep doing that until the script reaches Saturday. When that happens, I move to the next judge and repeat.
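The merge step above (multiple table rows collapsing into one per case number) could be sketched roughly like this in Python. This is just an illustration of the idea, not the actual ESB script; the `(case_number, detail)` row shape and the function name are my own assumptions.

```python
from collections import OrderedDict

def combine_rows(rows):
    """Merge scraped table rows that share a case number into one record.

    `rows` is a list of (case_number, detail) pairs as they appear in the
    table. Details for the same case number are joined so the final grid
    shows a single line per case. Order of first appearance is preserved.
    """
    merged = OrderedDict()
    for case_no, detail in rows:
        merged.setdefault(case_no, []).append(detail)
    return [(case_no, "; ".join(details)) for case_no, details in merged.items()]

# Hypothetical rows scraped from one day's table:
rows = [
    ("24-CV-101", "Smith v. Jones"),
    ("24-CR-202", "State v. Doe"),
    ("24-CV-101", "Motion hearing"),
]
# combine_rows(rows) -> [("24-CV-101", "Smith v. Jones; Motion hearing"),
#                        ("24-CR-202", "State v. Doe")]
```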
I save all cases and the names of the people/organizations to a database. It all shows up in the grid we typically use so she can search for specific cases throughout the week.
I also included a way for her to save everything to a spreadsheet so she can share it with her coworkers. It creates a spreadsheet with a tab for each judge, so my sister can click on the tab with the judge's name and get that judge's list of cases.
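The grouping behind that export (bucketing cases by judge before writing them out) might look something like the sketch below. The real export writes one spreadsheet with a tab per judge; for a self-contained standard-library example I write one CSV file per judge instead, which is the same grouping idea. The tuple layout and names here are assumptions, not the actual program.

```python
import csv
import os
from collections import defaultdict

def export_by_judge(cases, out_dir="."):
    """Group cases by judge and write one CSV per judge.

    `cases` is a list of (judge, case_number, parties) tuples. Returns
    the list of file paths written. A spreadsheet library would instead
    write each judge's rows to its own tab in a single workbook.
    """
    by_judge = defaultdict(list)
    for judge, case_no, parties in cases:
        by_judge[judge].append((case_no, parties))

    paths = []
    for judge, rows in by_judge.items():
        path = os.path.join(out_dir, judge.replace(" ", "_") + ".csv")
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["Case Number", "Parties"])
            writer.writerows(rows)
        paths.append(path)
    return paths
```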
The website is slow when loading the cases. It also doesn't refresh the whole page, only the table, which made it harder to know when the data was ready for me to scrape. So clicking the Next button required me to wait for the table to load with new data. I had to add specific waits (which tell the script to pause for a certain amount of time) throughout my script to get it to work. The new framework was much easier: I think I only had to add one wait action to ensure the cases were loaded, compared to 3 or 4 in the older framework. The old version took close to 10 minutes to run; the new version, around 3.
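The wait problem described above (a partial page refresh with no clear "done" signal) is usually handled by polling a condition rather than sleeping a fixed amount. Here's a minimal sketch of that pattern; it's a generic helper I wrote for illustration, not code from the ESB framework, and in a real browser script the condition would check the table element for fresh data.

```python
import time

def wait_for(condition, timeout=10.0, poll=0.25):
    """Poll `condition` until it returns a truthy value or the timeout expires.

    Instead of sleeping a fixed amount and hoping the table has loaded,
    check repeatedly. Returns whatever truthy value `condition` produced,
    or raises TimeoutError if the deadline passes first.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(poll)
    raise TimeoutError("condition not met within %.1f seconds" % timeout)
```

The advantage over a fixed sleep is that the script resumes as soon as the data is ready on a fast day, while still surviving a slow one, which is roughly where the 10-minute to 3-minute improvement comes from.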
Yeah, I like where we are headed. I hated the original web automation and needed to do something about it. Feels good. lol
I see some major money to be made from creating automation programs like this for small businesses. I was just telling a friend that I may build a consultancy business around automation with ESB. It should be steady cash flow, as businesses will always need you to tweak the program when a site changes or their requirements change (which seems to happen a lot).
Most people will not want to build a program with ESB, but they would definitely buy one made for them.
Thomas