Saturday, 11 July 2015

SFTW: Scraping data with Google Refine

For the first Something For The Weekend of 2012 I want to tackle a common problem when you're trying to scrape a collection of webpages: they have some sort of structure in their URL, where part of the URL refers to the name or code of an entity:

http://www.ltscotland.org.uk/scottishschoolsonline/schools/freemealentitlement.asp?iSchoolID=5237521

http://www.ltscotland.org.uk/scottishschoolsonline/schools/freemealentitlement.asp?iSchoolID=5237629

http://www.ltscotland.org.uk/scottishschoolsonline/schools/freemealentitlement.asp?iSchoolID=5237823

In this instance, you can see that the URL is identical apart from a 7 digit code at the end: the ID of the school the data refers to.

There are a number of ways you could scrape this data. You could use Google Docs and the =importXML formula, but Google Docs will only let you use this 50 times on any one spreadsheet (you could copy the results and select Edit > Paste Special > Values Only and then use the formula a further 50 times if it’s not too many – here’s one I prepared earlier).

And you could use Scraperwiki to write a powerful scraper – but you need to understand enough coding to do so quickly (here’s a demo I prepared earlier).

A middle option is to use Google Refine, and here’s how you do it.

Assembling the ingredients

With the basic URL structure identified, we already have half of our ingredients. What we need next is a list of the ID codes that we're going to use to complete each URL.

An advanced search for “list seed number scottish schools filetype:xls” brings up a link to this spreadsheet (XLS) which gives us just that.

The spreadsheet will need editing: remove any rows you don't need, which will reduce the time the scraper takes to work through them. For example, if you're only interested in one local authority or one type of school, sort your spreadsheet on that column so the rows you want sit together, then delete everything above and below them.

Now to combine the ID codes with the base URL.

Bringing your data into Google Refine

Open Google Refine and create a new project with the edited spreadsheet containing the school IDs.

At the top of the school ID column click on the drop-down menu and select Edit column > Add column based on this column…

In the New column name box at the top call this ‘URL’.

In the Expression box type the following piece of GREL (Google Refine Expression Language):

"http://www.ltscotland.org.uk/scottishschoolsonline/schools/freemealentitlement.asp?iSchoolID="+value

(Type in the quotation marks yourself – if you’re copying them from a webpage you may have problems)

The ‘value’ bit means the value of each cell in the column you just selected. The plus sign adds it to the end of the URL in quotes.

In the Preview window you should see the results – you can even copy one of the resulting URLs and paste it into a browser to check that it works. (On one occasion Google Refine added .0 to the end of the ID number, ruining the URL. You can solve this by changing value to value.substring(0,7) – this extracts the first 7 characters of the ID number, omitting the '.0'.) UPDATE: in the comments Thad suggests "perhaps, upon import of your spreadsheet of IDs, you forgot to uncheck the importer option to Parse as numbers?"

Click OK if you’re happy, and you should have a new column with a URL for each school ID.

Grabbing the HTML for each page

Now click on the top of this new URL column and select Edit column > Add column by fetching URLs…

In the New column name box at the top call this ‘HTML’.

All you need in the Expression window is ‘value’, so leave that as it is.

Click OK.

Google Refine will now go to each of those URLs and fetch the HTML contents. As we have a couple of thousand rows here, this will take a long time – hours, depending on the speed of your computer and internet connection (it may not work at all if either isn't very fast). So leave it running and come back to it later.
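If you'd rather script this step outside Refine, here is a minimal Python sketch of the same idea – build each URL from an ID and fetch the page. It assumes the third-party requests library; the example IDs and the one-second pause are illustrative, not part of the original recipe:

    import time

    import requests

    BASE = ("http://www.ltscotland.org.uk/scottishschoolsonline"
            "/schools/freemealentitlement.asp?iSchoolID=")

    school_ids = ["5237521", "5237629", "5237823"]  # IDs from the spreadsheet

    pages = {}
    for school_id in school_ids:
        url = BASE + school_id                     # same concatenation as the GREL expression
        pages[school_id] = requests.get(url).text  # raw HTML for this school
        time.sleep(1)                              # be polite between requests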

Extracting data from the raw HTML with parseHTML

When it’s finished you’ll have another column where each cell is a bunch of HTML. You’ll need to create a new column to extract what you need from that, and you’ll also need some GREL expressions explained here.

First you need to identify what data you want, and where it is in the HTML. To find it, right-click on one of the webpages containing the data, view the page source, and search for a key phrase or figure that you want to extract. Around that data you want to find an HTML tag like <table class="destinations"> or <div id="statistics">. Keep that open in another window while you tweak the expression we come onto below…

Back in Google Refine, at the top of the HTML column click on the drop-down menu and select Edit column > Add column based on this column…

In the New column name box at the top give it a name describing the data you’re going to pull out.

In the Expression box type the following piece of GREL (Google Refine Expression Language):

value.parseHtml().select("table.destinations")[0].select("tr").toString()

(Again, type the quotation marks yourself rather than copying them from here or you may have problems)

I’ll break down what this is doing:

value.parseHtml()

parse the HTML in each cell (value)

.select(“table.destinations”)

find a table with a class (.) of "destinations" (in the source HTML this reads <table class="destinations">). If it were <div id="statistics"> then you would write .select("div#statistics") – the hash sign representing an 'id' and the full stop representing a 'class'.

[0]

This zero in square brackets tells Refine to only grab the first table – a number 1 would indicate the second, and so on. This is because numbering (“indexing”) generally begins with zero in programming.

.select(“tr”)

Now, within that table, find anything within the tag <tr>

.toString()

And convert the results into a string of text.

The results of that expression in the Preview window should look something like this:

<tr> <th></th> <th>Abbotswell School</th> <th>Aberdeen City</th> <th>Scotland</th> </tr> <tr> <th>Percentage of pupils</th> <td>25.5%</td> <td>16.3%</td> <td>22.6%</td> </tr>

This is still HTML, but a much smaller and more manageable chunk. You could, if you chose, now export it as a spreadsheet file and use various techniques to get rid of the tags (Find and Replace, for example) and split the data into separate columns (the =SPLIT formula, for example).
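For comparison, the same selection logic can be sketched outside Refine in Python with the third-party BeautifulSoup library; the html variable is assumed to hold one fetched page, as in the earlier sketch:

    from bs4 import BeautifulSoup  # pip install beautifulsoup4

    soup = BeautifulSoup(html, "html.parser")

    # Equivalent of value.parseHtml().select("table.destinations")[0].select("tr")
    table = soup.select("table.destinations")[0]   # the first matching table
    rows = table.select("tr")                      # every row inside it
    print("".join(str(row) for row in rows))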

Or you could further tweak your GREL code in Refine to drill further into your data, like so:

value.parseHtml().select("table.destinations")[0].select("td")[0].toString()

Which would give you this:

<td>25.5%</td>

Or you can add the .substring function to strip out the HTML like so (GREL counts characters from zero and excludes the end position; the leading <td> tag takes up positions 0 to 3, so the 5-character value 25.5% sits at positions 4 to 8 – this only works if the data is always the same length):

value.parseHtml().select("table.destinations")[0].select("td")[0].toString().substring(4,9)
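If you're scripting in Python as sketched above, a more robust alternative – my suggestion, not part of the original recipe – is to take the tag's text rather than counting characters, so it keeps working when the value's length changes:

    # Continuing the earlier BeautifulSoup sketch: read the cell's text directly
    cell = soup.select("table.destinations")[0].select("td")[0]
    print(cell.get_text())  # prints: 25.5%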

When you’re happy, click OK and you should have a new column for that data. You can repeat this for every piece of data you want to extract into a new column.

Then click Export in the upper right corner and save as a CSV or Excel file.

Source: http://onlinejournalismblog.com/2012/01/13/sftw-scraping-data-with-google-refine/

Friday, 26 June 2015

Data Scraping - Hand Scraped Hardwood Flooring Gives Your Home That Exclusive Look

Today hand scraped hardwood flooring is becoming extremely popular in the more opulent homes as well as in some commercial properties. Although this type of flooring has only recently become fashionable, it has been around for many centuries.

Certainly, before the invention of modern sanding techniques, all floors were hand scraped at the location where they were to be installed to ensure that the floor would be flat and even. Today, however, this method is used instead to provide texture, richness and a unique look and feel to the flooring.

Although manufacturers have produced machines which can provide a scraped look to their flooring, it looks cheap compared to the real thing. The main problem with using a machine to scrape the flooring is that it provides a uniform look to the pattern of the wood. Because of this it lacks the natural feel that you would see in a floor which has been scraped by hand.

When done by hand, scraping creates a truly unique look for the floor. However, the actual look and feel of each floor will vary, as it depends on the skills of the person carrying out the work. If there is no control in place whilst the work is being carried out, this can result in a disastrous look to the finished product.

Many manufacturers who actually provide hand scraped hardwood flooring will either just dent, scoop or rough the floor up. But others will use sanding techniques in order to create a worn and uneven look to the flooring. The more professional teams will scrape the entire surface of the wood in order to create the unique hand made look for their customers.

Many companies will allow their customers to choose what type of scraping takes place on their wood. They can choose between light, medium and heavy. The companies who are really good at hand scraping will be able to give the hardwood floor a reclaimed look by including wormholes, splits and other naturally-occurring features within the wood.

If you do decide to choose hand scraped hardwood flooring you will need to factor the costs that are associated with it into your budget. Unfortunately this type of flooring does not come cheap and you can find yourself paying upwards of $15 per sq ft. But once it is installed it will give a room a unique, warm and rich feel, and it is certainly going to wow your friends and family when they see it for the first time.

Source: http://ezinearticles.com/?Hand-Scraped-Hardwood-Flooring-Gives-Your-Home-That-Exclusive-Look&id=572577

Saturday, 20 June 2015

Making data on the web useful: scraping

Introduction

Many times data is not easily accessible – although it does exist. As much as we wish everything was available in CSV or the format of our choice, most data is published in different forms on the web. What if you want to use the data to combine it with other datasets and explore it independently?

Scraping to the rescue!

Scraping describes the method of extracting data hidden in documents – such as web pages and PDFs – and making it usable for further processing. It is among the most useful skills if you set out to investigate data – and most of the time it's not especially challenging. For the simplest kinds of scraping you don't even need to know how to write code.

This example relies heavily on Google Chrome for the first part. Some things work well with other browsers; however, we will be using one specific browser extension only available on Chrome. If you can't install Chrome, don't worry: the principles remain similar.

Code-free Scraping in 5 minutes using Google Spreadsheets & Google Chrome

Knowing the structure of a website is the first step towards extracting and using the data. Let’s get our data into a spreadsheet – so we can use it further. An easy way to do this is provided by a special formula in Google Spreadsheets.

Save yourselves hours of time in copy-paste agony with the ImportHTML command in Google Spreadsheets. It really is magic!
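For reference, the general shape of the formula is =IMPORTHTML(url, query, index): the second argument is "table" or "list" depending on what you're grabbing, and the third is the position of that table or list on the page, counting from 1. The URL below is a placeholder:

    =IMPORTHTML("http://example.com/page-with-a-table", "table", 1)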

Recipes

In order to complete the next challenge, take a look in the Handbook at one of the following recipes:

    Extracting data from HTML tables.

    Scraping using the Scraper Extension for Chrome

Both methods are useful for:

    Extracting individual lists or tables from single webpages

The latter can do slightly more complex tasks, such as extracting nested information. Take a look at the recipe for more details.

Neither will work for:

    Extracting data spread across multiple webpages

Challenge

Task: Find a website with a table and scrape the information from it. Share your result on datahub.io (make sure to tag your dataset with schoolofdata.org)

Tip

Once you’ve got your table into the spreadsheet, you may want to move it around, or put it in another sheet. Right click the top left cell and select “paste special” – “paste values only”.

Scraping more than one webpage: Scraperwiki

Note: Before proceeding into full scraping mode, it’s helpful to understand the flesh and bones of what makes up a webpage. Read the Introduction to HTML recipe in the handbook.

Until now we’ve only scraped data from a single webpage. What if there are more? Or you want to scrape complex databases? You’ll need to learn how to program – at least a bit.

It's beyond the scope of this course to teach how to scrape; our aim here is to help you understand whether it is worth investing your time to learn, and to point you at some useful resources to help you on your way!

Structure of a scraper

Scrapers comprise three core parts (a minimal code sketch follows the list):

1.    A queue of pages to scrape
2.    An area for structured data to be stored, such as a database
3.    A downloader and parser that adds URLs to the queue and/or structured information to the database.
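As a rough illustration only – not ScraperWiki's actual API – here is a minimal Python sketch of those three parts, using the standard library plus the third-party requests package. The start URL, the link pattern and the 20-page cap are placeholders:

    import re
    import sqlite3
    from collections import deque

    import requests

    queue = deque(["http://example.com/start"])   # 1. a queue of pages to scrape
    db = sqlite3.connect("scraped.db")            # 2. a store for structured data
    db.execute("CREATE TABLE IF NOT EXISTS pages (url TEXT, title TEXT)")

    seen = set()
    while queue and len(seen) < 20:               # 3. the downloader and parser
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        html = requests.get(url).text
        title = re.search(r"<title>(.*?)</title>", html, re.S)
        db.execute("INSERT INTO pages VALUES (?, ?)",
                   (url, title.group(1) if title else ""))
        for link in re.findall(r'href="(http[^"]+)"', html):
            queue.append(link)                    # new URLs feed back into the queue
    db.commit()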

Fortunately for you there is a good website for programming scrapers: ScraperWiki.com

ScraperWiki has two main functions: you can write scrapers – which are optionally run regularly, with the data available to everyone visiting – or you can request that they write scrapers for you. The latter costs some money; however, it helps to contact the ScraperWiki community (Google Group) – someone might get excited about your project and help you!

If you are interested in writing scrapers with Scraperwiki, check out this sample scraper – scraping some data about Parliament. Click View source to see the details. Also check out the Scraperwiki documentation: https://scraperwiki.com/docs/python/

When should I make the investment to learn how to scrape?

A few reasons (non-exhaustive list!):

1.    If you regularly have to extract data where there are numerous tables in one page.

2.    If your information is spread across numerous pages.

3.    If you want to run the scraper regularly (e.g. if information is released every week or month).

4.    If you want things like email alerts if information on a particular webpage changes.

…And you don’t want to pay someone else to do it for you!

Summary:

In this course we've covered web scraping and how to extract data from websites. The main function of scraping is to convert data that is semi-structured into structured data and make it easily usable for further processing. While this is a relatively simple task with a bit of programming, for single webpages it is also feasible without any programming at all. We've introduced =importHTML and the Scraper extension for your scraping needs.

Further Reading

1.    Scraping for Journalism: A Guide for Collecting Data: ProPublica Guides

2.    Scraping for Journalists (ebook): Paul Bradshaw

3.    Scrape the Web: Strategies for programming websites that don’t expect it : Talk from PyCon

4.    An Introduction to Compassionate Screen Scraping: Will Larson

Any questions? Got stuck? Ask School of Data!

Source: http://schoolofdata.org/handbook/courses/scraping/

Sunday, 24 May 2015

Local Search and Internet Yellow Pages - A Whole New Vocabulary for Small Businesss

Buyers want both online and local information about where to buy

Most small businesses are local in nature, serving people who live nearby. Their customers found them through traditional methods like the Yellow Pages or newspaper ads. So far, the Internet hasn't figured prominently in their marketing efforts. That's about to change, as Local Search methods become more widespread.

Even for buyers expecting to spend their money close to home, more and more of them go to the Internet to locate desired products and services. They rely on search engines to find suitable vendors in the fastest, easiest way. Local Search combines the search query word or phrase with specific geographic terms, like city or zip code. That way, search results only include enterprises in that local area.

Instead of information about a small enterprise being lost among millions of pages of search results, it shows up in a small pool of local providers. That's good for them, as well as the person looking for what they provide.

Small operations can easily be located by a whole new group of buyers

Consumers don't simply go to the Yellow Pages when ready to buy - as they once did. Studies show that an astonishing 36% of online searches are conducted to find local businesses. About a quarter of all Internet users already conduct local searches. They'd do even more of it, if the desired small business data were more complete.

Local enterprises need to prepare for the impact of changing customer habits. An easy first step is to include your business in Internet Yellow Pages (IYP), along with the printed Yellow Page directory. That puts your enterprise on the radar screen.

You'll find reliable advice from experts in Yellow Pages and Local Search so you can get more mileage from your promotional dollars. Start by getting comfortable with search concepts, and improve your odds of being found when people search online for what you offer. You don't even need your own Web site to benefit from Internet Yellow Pages and Local Search.

Learn the Relevant Terms

Search Engine - method for locating the information available on the Internet; a program that searches Web pages for requested keywords, then returns a list of documents where the query terms were found. Google and Yahoo, the major general search engines, have both shifted gears to make Local Search a priority when delivering relevant results.

Spider (also called "crawler" or "bot") - goes to every page on every Web site and reads the information so it can be available to searchers; to "crawl" a site it collects and indexes information from it

Specialized Search Engines - narrow focus of information crawled and indexed, like medical, business, or shopping sites

Keywords - words or phrases used by search engines to locate relevant Web pages; words chosen to improve a site's search engine placement and ranking

Search Query - search request, which the search engine compares to the spidered entries, then returns results to the searcher

Search Results - compiled list of Web pages that a search engine delivers in response to a query; the number of items returned is usually overwhelming (in the millions), so searchers only bother to view results on the first pages

Relevant Results - the test of a good search is whether the results obtained relate to what the person wanted to find, without a lot of irrelevant links

Local Search - combining a geographic term in a search query to locate suitable providers in a specific area

Pay per Click (PPC) - method of building traffic whereby site owners bid on search terms (keywords) that link to their site

Geographic Terms - specific information about the local area that can be included in a local search: zip code, town, county, geographic region, state

Top Ranking - sites shown on the first page(s) of search results

Search Engine Optimization (SEO) - fine-tuning keywords and page content so the Web site rates high in search engine results

Tags and Titles (on Web Pages) - provide site keywords and information to search engine spiders for indexing a site

Internet Yellow Pages (IYP) - directory of business phone numbers and locations in a geographic area, organized by category; a searchable database accessed on the Internet.

Learn how your business can make the most of Local Search

Make your business easy for searchers to find

The public is embracing the convenience of searching on the Internet to find information about local businesses.

However, their searches for desired information are compromised because so many local enterprises don't show up in the databases as yet. Those that do have an edge in their local market. Climb aboard! Make sure searchers can find you. For little or no money, you can expose your enterprise to the whole world.

Whether or not your business has a Web site, you need to provide the information people are looking for in the places that they look for it. Local Search and Internet Yellow Pages open new avenues to buyers ready to spend. Best of all, they support and complement your traditional methods of finding new business. So you cover all your bases. (c) 2007, Lynella Grant

Dr. Lynella Grant Author, Yellow Page Smarts - Smarter and more effective ways to attract more YP customers in the Directory and on the internet

Source: http://ezinearticles.com/?Local-Search-and-Internet-Yellow-Pages---A-Whole-New-Vocabulary-for-Small-Businesss&id=1894

Monday, 18 May 2015

Media, the Internet, Yellow Pages, and Your Business

If you are reading this article, chances are you could use a little extra money. With the advent of the internet has come a migration of advertising dollars from print to electronic (and this time, it's the real thing, I swear! Not one of those 1999 tech busts!... Seriously!). If you own a small business today, you look at many advertising mediums. The majority of these mediums lump themselves into 2 categories, creative or direct.

Creative has always been the crapshoot for the small business owner. A sales rep walks into your business, espousing the greater good of television or radio advertising, quickly moves past the ratings, viewers etc and into the sexiness of hearing your name at 6:57am Monday, Thursday and Saturday if you are watching station X or listening to station Y. If this product didn't work, a Super Bowl commercial price tag wouldn't make headlines every December (for how much Geico paid) or late February (to hear which is most memorable). The key with creative is frequency. If you have a realistic budget for frequency, you can make the phone ring with a creative campaign. If you have that budget you probably aren't reading this article. Realistically speaking, you don't have a ton of money to risk on creative advertising effectiveness, haven't backed it up with a call to action, and you need, pound for pound, the least amount of advertising money possible, with the most phone calls...

Enter direct advertising. Classified sections in newspapers, they make your phone ring, if you're selling something people want. (For the record, advertising in the sports section of your local paper is creative advertising - people don't go to page 5 of the sports section to regularly check out the latest prices on used cars.) Classified advertising is in the process of going from the newspaper industry's cash cow to taking it on the chin from EBay (ever heard of it?) and the even more attractive small-town slugger, Craigslist (you go Craig!). If your business pumps out used cars by the pound, chances are you, or your salesmen, are using these two websites to start realizing savings from Rupert Murdoch and his yacht-owning cronies. Even the best of EBay or Craigslist, however, doesn't put much of a dent in your P&L statement if you are service based like a contractor, or general retail, like a bookstore.

Enter the yellow pages...Pound for pound, no other medium makes the phone ring at your business like the good old fashioned yellow pages. Throw down your money, and answer the phone. You already know that. So do all the TV stations, radio stations, and newspapers in the country. The best protected advertising budget in any small business is the yellow page budget. Yellow pages are the scourge of the other guys. How many radio sales reps will walk into your store after you started your advertising campaign and say, "Tom, your $1,000 invested with my station this week got you 48 phone calls?" (If you find a station like that please send me the phone number, and I retract everything I said earlier) When someone wants a plumber, a pool boy, a new pool, or a divorce attorney for getting the new pool without his wife's approval, they pick up the weathered old yellow pages, leaf through a few adverts, and call someone that sells what they need.

So, am I telling you to advertise in the yellow pages?... Not so fast, Skippy... First, let's look at the cost of the yellow pages... You want the phone to ring in Miami, and you're a plumber? Better be ready to pony up some serious cash... say $3-4k per month. In Miami, the average cost of a service call could be around $65. If you don't have a crew, that ad needs to generate 61 calls to break even, not including the employee cost, travel costs etc. Not so bad? How many calls did you need to generate those 61 service calls? Did you go see everyone that called you? I would guess, for a contractor, you might get lucky and have a 50% close rate... 122 calls... to break even. Don't forget to pay yourself... 200 calls. Depressed? Better be glad you don't sell shoes. The same ad would generate a much lower close rate, and you would need to sell a dump truck of shoes every month!

What's my point? Enter the ELECTRONIC yellow pages... No print bill, real time changes, and guess where all those print yellow pages are putting their money these days? BellSouth and SBC just paid $100M (you know, $100,000,000) for a new domain name, and combined their "competing" forces to make a better entry in the fray, thinking that you might remember yellowpages.com better than smartpages.com or realpages.com. (Makes you wonder where Google fits into the old branding and name recognition game.) Verizon seemed to get the concept a little better with superpages.com by aligning with Mr. Gates over at MSN right around the time Al Gore was inventing the internet. Getting back to the point, the internet yellow pages are going to do to print yellow pages what EBay and Craigslist have done to the newspaper companies. No paper, no ink, usage climbing (for electronic yellow pages, usage is climbing to as high as 70% of online searching, and buying) and real-time, do-it-yourself advertising. Advances such as community ranking, mini-sites, toolbars, pay per click, pay per call, and just about every way possible to pay for performance, track performance, and see what other buyers of your goods or services thought of your business. Due to the ever-changing "who's in first place" nature of the internet, it has yet to be determined whether there is a Lance Armstrong in this race. Our own company USdirectory.com, via its partnerships, and investment into technology, is looking to become a late entry, blue-ribbon bearer. At this point, it's too early to clearly point out which one, or all, or none, of these companies will do to yellow pages what Google did to global search. That being said, even Google doesn't reflect enough tenure to ensure its own top position.

Who wins?? You do, the business owner. Technology is about to reduce your advertising budget the way Southwest and JetBlue changed the airline industry. Your customer base, as they migrate to the internet as their vehicle of choice, will reach you at lower price points, and in greater volume, than ever before. Your mission, should you choose to accept it, is to invest in the right mix, at the right point, and to cater not only to your existing radius of business, but to the whole planet with new and specialized niches... but that's another story.

Source: http://ezinearticles.com/?Media,-the-Internet,-Yellow-Pages,-and-Your-Business&id=56566

Friday, 15 May 2015

Combine Your Yellow Page Ad and Web Site for Maximum Profits

A Yellow Page Ad isn't Enough Any More

An unquestioned "must" for any small business has been to run an ad in the Yellow Page Directory. Since most customers were local, that was enough to establish itself as "open for business." The annual Yellow Page ad represents the largest promotional expense for many enterprises. Yet, Yellow Page directory use is declining, while expanding segments of the public don't rely on them at all. Yellow Page advertising costs keep going up, and the complicated pricing structure is difficult to figure out. Worse yet, having a Yellow Page ad doesn't deliver like it used to.

People can find most of the information they want without ever opening a directory. Your business needs its Yellow Page strategy to be in tune with the times and your market. Like most business owners, you must squeeze maximum value from every promotional dollar spent. That requires you to move beyond treating a Yellow Page ad like it's a separate, stand-alone way to promote your business. It's not.

Your Yellow Page advertising needs to work in tandem with all the rest of the efforts you pursue.

The Internet Expands Your Arena

Every business needs to put itself in front of the people looking for what it does - and that's not just through the Yellow Pages any longer.

An increasing percentage of customers, who spend their money close to home, are Internet savvy. There's a major overlap between Yellow Page directory users and Internet users. That fact supports integrating your local and Internet promotional methods so they attract more new customers. Yellow Page users are likely to be Internet users as well. And a business that ignores online activities entirely may have a tough time getting access to or credibility with those customers. It is possible to make online and traditional (off-line) methods of attracting customers work in tandem - improving the effectiveness of each alone.

So it's no longer an either-or, all-or-none choice whether to promote the business online or off. People who subscribe to online services consult the Yellow Pages 23% more often than non-subscribers.

Frequent Yellow Page Users are:

- 18% more likely than average to be Internet subscribers

- 32% more likely to be among the heaviest Internet users

- 18% more likely to make purchases on the Internet

- 27% more likely to spend more than $1,000 on Internet purchases

Source: Simmons

Customer Behavior is Changing

More and more, people are going to the Internet to find, learn about, or select products and services. Even local ones. That doesn't mean that they will buy online, however. People still prefer to spend their money locally when they can. But, even the smallest business can do a better job of being found by those who prefer to use both the Internet and the Yellow Page directory to make their buying decisions. And, it can be done very inexpensively, too. Even a 100% local business can pull in more business by getting its low-tech and high-tech advertising to mesh.

What Else has Changed?

- Buyers are less trusting and more willing to shop around

- Customers have more options and ways to find what they want

- Availability of Internet Yellow Pages

- Aging population uses the Yellow Pages differently than young people

- Development of unique niches and specialties

- More choices for a "better deal"

- More directories competing in a geographic area

- More immigrants, or those from other cultures, unaccustomed to Yellow Page use

- Area code proliferation fragments cities

- Larger cities have multiple directories, rather than one large one

- Development of specialized directories - like ethnic, non-English, women, minority, business to business

Become Visible Online - With or Without Your Own Web Site

If your business already has a Web site, treat it as a way to expand the reach of your Yellow Page ad and traditional marketing activities. Jettison the expectation that it should make sales - few do so. But an information-packed Web site can support your traditional marketing methods very well.

Even without your own Web site, your small business can establish an online identity that helps buyers to find you.

- Get listed in a variety of Internet Yellow Page (IYP) directories

- Send emails to your "regulars" with special offers and useful information

- Position yourself for Local Search - a method whereby customers use search engines to locate local businesses by town, state, region, zip code, etc.

Source: http://ezinearticles.com/?Combine-Your-Yellow-Page-Ad-and-Web-Site-for-Maximum-Profits&id=1816

Monday, 27 April 2015

Stand Apart From the Rest - Yellow Page Advertising and the Internet For Small Businesses

A lot of people use the Yellow Pages when looking for information about local businesses. That is why one of the tried and tested small local business advertising tips is Yellow Page advertising. Why? When a consumer references the Yellow Pages, it means he is ready to buy, will most likely call, and will buy at the place stated in the Yellow Page ad.

But a new trend is growing that cannot be ignored: local searches on the Internet. Computer savvy consumers go to the Internet to look for products and services, with a high percentage of searches on specific localities. What about Yellow Page users? Studies show that frequent Yellow Page users are likely to be Internet subscribers and heavy Internet users. And part of their online activities includes purchases via the Internet that sometimes amount to $1,000. A business that disregards online activities may do well to think again.

The convenience of online searching for local products and services is gaining acceptance. However, search results are not yet as comprehensive as searchers would like for local businesses. Many small local businesses do not have an online presence. Searching for specific services in a given locale is still done by referencing the yellow pages, as proven by the 13.4 billion Yellow Page references in 2006. After referencing the Yellow Pages, some consumers then turn to the Internet to check if the company has a website.

Who doesn't want a growing customer base? You can capture the attention of both Yellow Page users and Internet users. Make online and offline methods of advertising work together. Make sure consumers find you whatever their method of searching is: online or offline. Cover all your bases. Yellow Page advertising for lawyers, doctors, and other professionals in a given locale may be effective. But having an online presence in databases will give you that edge over competitors in the local market. It increases your visibility and chances of being found by the people who need your services.

Putting together search engine marketing, local search and Yellow Page advertising, whatever your services are (e.g. marketing for personal injury lawyers), will make you stand out not only in your locality, but also worldwide.

Marketing from the heart specializes in lawyer marketing and yellow page advertising for law firms.

Source: http://ezinearticles.com/?Stand-Apart-From-the-Rest---Yellow-Page-Advertising-and-the-Internet-For-Small-Businesses&id=2177150

Saturday, 18 April 2015

How to Generate Sales Leads Using Web Scraping Services

The first stage of any selling process is what is popularly known as "lead generation". This phase is what most businesses place at the apex of their sales concerns. It is a driving force that governs decision-making at its highest levels, and influences business strategy and planning. If you are about to embark on an outbound sales campaign and are in the process of looking for leads, you will acknowledge the fact that the lead generation process is of extreme importance for any business.

Different lead generation techniques have been used over and over again by companies around the world to satiate this growing business need. Newer, more innovative methods have also emerged to help marketers in this process. One such method of lead generation that is fast catching on, and is poised to play a big role for businesses in the coming years, is web scraping. With web scraping, you can easily get access to multiple relevant and highly customized leads – a perfect starting point for any marketing, promotional or sales campaign.

The prominence of Web Scraping in overall marketing strategy

At present, levels of competition have risen sky high for most businesses. For success, lead generation and gaining insight into customer behavior and preferences is an essential business requirement. Web scraping is the process of scraping or mining the internet for information. Different tools and techniques can be used to harvest information from multiple internet sources based on relevance, which is then structured and organized in a way that makes sense to your business. Companies that provide web scraping services essentially use web scrapers to generate a targeted lead database that your company can then integrate into its marketing and sales strategies and plans.

The actual process of web scraping involves creating scraping scripts or algorithms which crawl the web for information based on certain preset parameters and options. The scraping process can be customized and tuned towards finding the kind of data that your business needs. The script can extract data from websites automatically, collate and put together a meaningful collection of leads for business development.
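To make that concrete, here is a hedged Python sketch of such a script. The directory URLs, the div.listing markup and the keyword parameter are hypothetical placeholders – a real scraper would be tuned to the actual site being mined:

    import re

    import requests
    from bs4 import BeautifulSoup

    PAGES = ["http://example.com/directory?page=1",   # hypothetical source pages
             "http://example.com/directory?page=2"]
    KEYWORD = "plumbing"                              # preset relevance parameter

    leads = []
    for url in PAGES:
        soup = BeautifulSoup(requests.get(url).text, "html.parser")
        for entry in soup.select("div.listing"):      # assumed listing markup
            text = entry.get_text(" ", strip=True)
            if KEYWORD not in text.lower():
                continue                              # skip irrelevant entries
            email = re.search(r"[\w.+-]+@[\w-]+\.[\w.-]+", text)
            leads.append({"summary": text[:80],
                          "email": email.group(0) if email else None})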

Lead Generation Basics

At a very high level, any person who has the resources and the intent to purchase your product or service qualifies as a lead. In the present scenario, you need to go far deeper than that. Marketers need to observe behavior patterns and purchasing trends to ensure that a particular person qualifies as a lead. If you have a group of people you are targeting, you need to decide who the viable leads will be, acquire their contact information and store it in a database for further action.

List buying used to be a popular way to get leads, but its efficacy has dwindled over time. Web scraping is fast coming up as a feasible lead generation technique, allowing you to find highly focused and targeted leads in short amounts of time. All you need is a service provider that would carry out the data mining necessary for lead generation, and you end up with a list of actionable leads that you can try selling to.

How Web Scraping makes a substantial difference

With web scraping, you can extract valuable predictive information from websites. Web scraping facilitates high quality data collection and allows you to structure marketing and sales campaigns better. To drive sales and maximize revenue, you need strong, viable leads. To facilitate this, you need critical data which encompasses customer behavior, contact details, buying patterns and trends, willingness and ability to spend resources, and a myriad of other aspects critical to ascertaining the potential of an entity as a rewarding lead. Data mining through web scraping can be a great way to get to these factors and identify the leads that would make a difference for your business.


Crawling through many different web locales using different techniques, web scraping services pick up a wealth of information. This highly relevant and specialized information instantly provides your business with actionable leads. Furthermore, this exercise allows you to fine-tune your data management processes, make more accurate and reliable predictions and projections, arrive at more effective, strategic and marketing decisions and customize your workflow and business development to better suit the current market.

The Process and the Tools

Lead generation, being one of the most important processes for any business, can prove to be an expensive proposition if not handled strategically. Companies spend large amounts of their resources acquiring viable leads they can sell to. With web scraping, you can dramatically cut down the costs involved in lead generation and take your business forward with speed and efficiency. Here are some of the time-tested web scraping tools which can come in handy for lead generation –

•    Website download software – Used to copy entire websites to local storage. All website pages are downloaded and the hierarchy of navigation and internal links is preserved. The stored pages can then be viewed and scoured for information at any later time.

•    Web scraper – Tools that crawl through bulk information on the internet, extracting specific, relevant data using a set of pre-defined parameters.

•    Data grabber – Sifts through websites and databases fast and extracts all the information, which can be sorted and classified later.

•    Text extractor – Can be used to scrape multiple websites or locations for acquiring text content from websites and web documents. It can mine data from a variety of text file formats and platforms.

With these tools, web scraping services scrape websites for lead generation and provide your business with a set of strong, actionable leads that can make a difference.

Covering all Bases

The strength of web scraping and web crawling lies in the fact that it covers all the necessary bases when it comes to lead generation. Data is harvested, structured, categorized and organized in such a way that businesses can easily use the data provided for their sales leads. As discussed earlier, cold and detached lists no longer provide you with enough actionable leads. You need to look at various factors and consider them during your lead generation efforts –

•    Contact details of the prospect

•    Purchasing power and purchasing history of the prospect

•    Past purchasing trends, willingness to purchase and history of buying preferences of the prospect

•    Social markers that are indicative of behavioral patterns

•    Commercial and business markers that are indicative of behavioral patterns

•    Transactional details

•    Other factors including age, gender, demography, social circles, language and interests

All these factors need to be taken into account and considered in detail if you are to determine whether a lead is viable and actionable or not. With web scraping you can get enough data about every single prospect, connect all the data collected with the help of onboarding, and ascertain with conviction whether a particular prospect will be viable for your business.

Let us take a look at how web scraping addresses these different factors –

1. Scraping websites

During the scraping process, all websites where a particular prospect has some participation are crawled for data. Seemingly disjointed data can be made into a sensible unit by the use of onboarding – linking user activities to their online identities with the help of user IDs. Documents can be scanned for participation. E-commerce portals can be scanned to find comments and ratings a prospect might have given to certain products. Service providers' websites can be scraped to find if the prospect has given a testimonial to any particular service. All these details can then be accumulated into a meaningful data collection that is indicative of the purchasing power and intent of the prospect, along with important data about buying preferences and tastes.

2. Social scraping

According to a study, most internet users spend upwards of two hours every day on social networks. Therefore, scraping social networks is a great way to explore prospects in detail. Initially, you can get important identification markers like names, addresses, contact numbers and email addresses. Further, social networks can also supply information about age, gender, demography and language choices. From this basic starting point, further details can be added by scraping social activity over long periods of time and looking for activities which indicate purchasing preferences, trends and interests. This exercise provides highly relevant and targeted information about prospects that can be constructively used while designing sales campaigns.

Check out How to use Twitter data for your business

3. Transaction scraping

Through the scraping of transactions, you get a clear idea about the purchasing power of prospects. If you are looking for certain income groups or leads that invest in certain market sectors or during certain specific periods of time, transaction scraping is the best way to harvest meaningful information. This also helps you with competition analysis and provides you with pointers to fine-tune your marketing and sales strategies.


Using these varied lead generation techniques and finding the right balance and combination is key to securing the right leads for your business. Overall, signing up for web scraping services can be a make or break factor for your business going forward. With a steady supply of valuable leads, you can supercharge your sales, maximize returns and craft the perfect marketing maneuvers to take your business to an altogether new dimension.

Source: https://www.promptcloud.com/blog/how-to-generate-sales-leads-using-web-scraping-services/

Wednesday, 18 March 2015

Professional Web Scraping Process

Web scraping is usually regarded as data mining and knowledge discovery. It is the process of extracting useful data and relationships from any data source – for instance web pages, databases and search engines. It employs pattern matching and statistical techniques. It is important to note that web scraping does not borrow from other fields like machine learning, databases and data visualization, but rather supports such fields.

The web scraping process is complex and requires not only time but also people with expertise in the field. This is because the internet is such a dynamic resource that it changes all the time. For instance, the data you could extract from a certain website a month ago will not be the same data you can extract now. The change of data over short periods of time makes it difficult to rely on such data, and therefore calls for web scraping. The web scraping process should be performed regularly in order to obtain accurate data that can be relied upon.

It is important to understand that many areas of business, science and other environments use large amounts of data. This data needs to be meaningful and to yield knowledge in its application. Web scraping may sometimes be overlooked, but in essence it can provide more useful information than statistical methods can produce. Web scraping methods are vital as they give you more control over the data.

Usually the data found on the internet is noisy – think of advertisements and pop-ups. Data found on the internet can also be described as dynamic, sparse, static or heterogeneous, and so forth. Such problems occur at a very large scale, and statistical methods alone would never succeed with them; this is why the job calls for professional web scraping companies.

Process of web scraping

1. Identification of data sources and selection of target data. You should not harvest just any kind of data, but data that is deemed relevant and useful in its application. Relevance here means getting the data that will benefit your company. This is an important step in the web scraping process.

2. Pre-process. This involves cleaning and attribute selection of data before it is harvested. Web scraping is usually done on specific websites that are relevant to your business. For instance, if you have an online store and need information about your competitors' products, then you need data from other relevant websites such as e-commerce stores and so on.

3. Web scraping. This involves mining the data so as to extract patterns or models that are beneficial to your business.

4. Post-process. After web scraping is done, it is important to identify the useful data that can be used in your business in decision making and so on.

It is important to note that the patterns identified need to be novel, understandable, potentially viable and valid for the web scraping process to make sense in business data harvesting. The four steps map naturally onto a small pipeline, sketched below.
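A schematic Python sketch of that pipeline – the function bodies are placeholders standing in for real source selection, cleaning, extraction and filtering logic:

    def identify_sources():
        """Step 1: pick the sites and the target data worth harvesting."""
        return ["http://example.com/competitor-prices"]

    def preprocess(sources):
        """Step 2: clean the source list and select the attributes to keep."""
        return [{"url": u, "fields": ["product", "price"]} for u in sources]

    def scrape(targets):
        """Step 3: extract the selected fields from each target page."""
        return [{"product": "widget", "price": 9.99} for _ in targets]  # placeholder

    def postprocess(records):
        """Step 4: keep only the records useful for decision making."""
        return [r for r in records if r["price"] is not None]

    results = postprocess(scrape(preprocess(identify_sources())))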

Source: http://www.loginworks.com/blogs/web-scraping-blogs/professional-web-scraping-process/

Monday, 16 March 2015

6 Benefits Associated with Data Mining

Data has been used from time immemorial by various companies to manage their operations. Data is needed by organizations aiming strategically at expanding their business operations, reducing costs, improving their marketing force and, above all, improving profitability. Data mining is aimed at creating information assets and using them to leverage those objectives.

In this article, we discuss some of the common questions asked about the data mining technology. Some of the questions we have addressed include:

•    How can we define data mining?
•    How can data mining affect my organization?
•    How can my business get started with data mining?

Data Mining Defined

Data mining can be regarded as a new concept in the enterprise decision support system, usually abbreviated as DSS. It does more than complement and interlock with DSS capabilities, which may involve reporting and query. It can also be used in on-line analytical processing (OLAP), traditional statistical analysis and data visualization. The technology comes up with tables, graphs and reports of the past business history.

We may define data mining as the discovery and modeling of hidden patterns in large volumes of data. It is important to note that data mining is very different from other retrospective technologies because it involves the creation of models. By using this technology, the user can discover patterns and use them to build models without even knowing exactly what he or she is after. It explains why past events happened and can even predict what is likely to happen.

Some of the information technologies that can be linked to data mining include neural networks, fuzzy logic, rule induction and genetic algorithms. In this article we do not cover those technologies but focus on how data mining can be used to meet your business needs and how you can thereafter translate the solutions into dollars.

Setting Your Business Solutions and Profits


One of the common questions asked about this technology is: what role can data mining play for my organization? At the start of this article we described some of the opportunities that can be associated with the use of data. Some of those benefits include cost reduction, business expansion, sales and marketing and profitability. In the following paragraphs we look into some of the situations where companies have used data mining to their advantage.

Business Expansion

Equity Financial Limited wanted to expand their customer base and attract new customers. They used a LoanCheck offer to meet their objectives: to initiate the loan, a customer simply had to go to any Equity branch and cash the check. Equity introduced the $6000 LoanCheck by mailing the promotion to their existing customers. The Equity database was able to track about 400 characteristics of every customer, covering the customer's loan history, active credit cards, current balance on the credit cards and whether they had responded to previous offers. Equity used data mining to sift through the 400 customer features and find the significant ones. They used the data to build a model based on the response to the LoanCheck offer, then applied this model to 500,000 potential customers from a credit bureau. They then selectively mailed the most promising customers as determined by the data mining model. At the end of the process they were able to generate a total of $2.1M in extra net income from 15,000 new customers.

Reduction of Operating Costs

Empire is one of the largest insurance companies in the country. In order to compete with other insurance companies, it has to offer quality services while at the same time reducing costs. Therefore it has to attack costs that come in the form of fraud and abuse. This demands considerable investigation skills and the use of data management technology. The latter calls for a data mining application that can profile every physician in their network based on the claims records of every patient in their data warehouse. The application is able to detect subtle deviations in a physician's behavior relative to his or her peer group. The deviations are then reported to the intelligence and fraud investigators as a "suspicion index." With this effort, derived from data mining, the company was able to save $31M, $37M and $41M from fraud in the first three years respectively.

Sales Effectiveness and Profitability


In this case we look at the pharmaceutical sector. Their sales representatives have a wide assortment of tools they use in promoting various products to physicians. Some of the tools include product samples, clinical literature, dinner meetings, golf outings, teleconferences and many more. Knowing the promotion methods that are ideal for a particular physician is therefore of real importance; getting it wrong is likely to cost the company a lot of dollars in sales calls and lost revenue.

Through data mining, a drug maker was able to link eight months of promotional activity to corresponding sales found in their database. They then used this information to build a predictive model for each physician. The model revealed that, of the six promotional alternatives, only three had a significant impact. They then used the knowledge found in the data mining models to customize their promotional mix and improve their ROI.

Looking at those case studies, ask yourself: was data mining necessary?

Getting Started


All the cases presented above have revealed how data mining was used to yield results for the various businesses. Some of the results led to increased revenue and an increased customer base. Others can be regarded as bottom-line improvements that delivered cost savings and improved productivity. In the next few paragraphs we try to answer the question: how can my company get started and begin realizing the benefits of data mining?

The right time to start your data mining project is now. With the emergence of specialized data mining companies, starting the process has been simplified and the costs greatly reduced. A data mining project can offer important insights into the field and also strengthen the case for creating a data warehouse.

In this article we have addressed some of the common questions regarding data mining, the benefits associated with the process and how a company can get started. Now, with this knowledge, your company should start with a pilot project and then continue building a data mining capability, to improve profitability, market your products more effectively, expand your business and also reduce costs.

Source: http://www.loginworks.com/blogs/web-scraping-blogs/255-benefits-associated-with-data-mining/

Monday, 9 March 2015

Why Web Scraping Software Won't Help

How do you get a continuous stream of data from these websites without getting stopped? Scraping logic depends upon the HTML sent out by the web server on page requests; if anything changes in the output, it is most likely going to break your scraper setup.

If you are running a website which depends upon getting continuously updated data from some websites, it can be dangerous to rely on just a piece of software.

Some of the challenges you should think about:

1. Web masters keep changing their websites to be more user friendly and look better; in turn, this breaks the delicate scraper data extraction logic.

2. IP address blocking: if you continuously keep scraping a website from your office, your IP is going to get blocked by the "security guards" one day.

3. Websites are increasingly using better ways to send data, such as Ajax and client-side web service calls, making it increasingly harder to scrape data from them. Unless you are an expert in programming, you will not be able to get the data out.

4. Think of a situation where your newly set-up website has started flourishing and suddenly the dream data feed you depended on stops. In today's world of abundant alternatives, your users will switch to a service which is still serving them fresh data.
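To see how delicate that extraction logic is, here is a minimal Python sketch (the URL and CSS class are invented): it works only while the site keeps the exact markup the selector expects, and renaming one class makes it silently return nothing.

# Minimal sketch of fragile scraping logic: extraction depends entirely
# on the site's current markup. If the class name changes, nothing matches.
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/products"   # hypothetical target page

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Works while prices sit in <span class="price">...</span>; breaks the
# day the webmaster renames the class or restructures the page.
prices = [span.get_text(strip=True) for span in soup.select("span.price")]

if not prices:
    print("Selector matched nothing -- the site layout probably changed.")
else:
    print(prices)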

Getting over these challenges

Let experts help you, people who have been in this business for a long time and have been serving clients day in and out. They run their own servers which are there just to do one job, extract data. IP blocking is no issue for them as they can switch servers in minutes and get the scraping exercise back on track. Try this service and you will see what I mean here.

Dheeraj Juneja, Founder & CEO, Loginworks Softwares, A Virtual IT Team for your business

Loginworks Softwares Web Scraping Service


Read more about various technical stuff at our blogs: Technical Blogs

Source: http://ezinearticles.com/?Why-Web-Scraping-Software-Wont-Help&id=4550594

Wednesday, 4 March 2015

Restaurant Internet Marketing - It's Not Your Father's Yellow Pages

As a restaurant owner or manager, are you taking advantage of all the opportunities offered up by the Internet, or are you still counting on word of mouth and a traditional print marketing program to get diners filling your place? Are you even aware of what Internet marketing can do for you? If you're not, you're giving up business to your competition.

Take a moment to Google your type of restaurant (Italian, seafood, deli, etc.) and the name of your town and see what comes up on the first page. Do you see your name? How many competitors do you see? Now ask yourself, "How many potential diners are going to come to my restaurant based on these results?" If you're not there, the answer of course is nada. Can you afford to give that business to your competition?

Location, location, location

Many local small and medium-sized businesses have discovered the value of having an internet presence, but it wasn't always that way. Most local businesses thought the internet best served big companies. On top of that, most small local businesses simply didn't have a clue how to participate effectively, and they were too busy running their businesses to learn how.

The fact of the matter is that 72% of all searches are related to local content. Nearly a year ago the monster search engine Google changed its algorithms so that local results show up in a search query whenever appropriate. That means your shop can be just as competitive as a national restaurant chain when it comes to being found in search.

The difference between search and traditional advertising

So why is search such a big deal? The reason it is so much more effective than traditional marketing is that it responds to a specific need at a time when the potential diner is interested in the information. Unlike a weekly flier that's dropped in the home mailbox every Thursday, search doesn't clutter up a prospective diner's life but rather provides relevant information when the diner wants it, not when the USPS delivers it.

When that potential customer is thinking about where to go for lunch, he or she is going to search "pizza san pedro, ca" to find what's available in the area. In other words, search delivers the information when the searcher is most receptive to it.

But wait, there's more

Search by itself is reason to have a presence on the internet but the benefits certainly don't end there.

How would you like to be able to:
- Notify all of your customers what your daily specials are every day automatically.
- Invite your customers to place their order in advance so they don't have to wait for their food and do it without spending any time on the phone.
- Accept and confirm reservations electronically.
- Run promotional campaigns without spending a dime on advertising.
- Build solid customer relationships that will strengthen your word of mouth referrals.

And that's just the beginning. A professional internet presence will cost you significantly less than you are currently paying for marketing. What's more, a professionally managed internet presence keeps on giving even when you stop paying. Can you say that about your Yellow Pages listing or print ads?

Online Reputation Management

There can be a potential dark side to this whole internet thing. You live or die on your reputation and it is important that you manage that reputation online. Never has the consumer had such an easy way to post anonymous comments for all the world to see. If you have a customer who has a bad experience don't be surprised if he or she posts it to a blog or a forum or gives you a bad review in the business directories.

Speaking of business directories, you already are on the internet even if you didn't know it. These directories scrape information out of Yellow Pages and other sources and add it to a business profile. You have a right to claim those listings and they are actually a great way to promote the business. But registered visitors also have the right to post reviews and if the only review your store has is from that disgruntled customer...your rep is shot.

There are just too many reasons for you to get involved with internet marketing. To not do so is to pass up an incredible opportunity for new business and to cement loyalty with regular customers. It's effective, inexpensive and essential if you want to remain competitive.

WebFrootz is an internet marketing, website design, and web development company in Baltimore, MD, specializing in comprehensive marketing solutions for small and medium business including branding, website design and development, graphic design, copywriting, and SEO.

Source: http://ezinearticles.com/?Restaurant-Internet-Marketing---Its-Not-Your-Fathers-Yellow-Pages&id=6498349

Thursday, 26 February 2015

What Is ISL Uranium Mining

In situ leach mining (ISL), also known as in-situ mining or solution mining, was first used as a means to extract low grades of uranium from ore in underground mines. First used in Wyoming in the 1950s, originally as a low-production experiment at the Lucky June mine, it became a high-production, low-cost method of fulfilling Atomic Energy Commission uranium requirements at Utah Construction Company's Shirley Basin mining operations in the 1960s. Charles Don Snow, a uranium mining and exploration geologist employed by Utah, pioneered many of the developments that are still used today in ISL mining.

What is ISL mining? According to the Wyoming Mining Association website, ISL mining is explained in the following manner. (We choose Wyoming because it is the birthplace of "solution mining" as it was originally called.)

"In-situ mining is a noninvasive, environmentally friendly mining process involving minimal surface disturbance which extracts uranium from porous sandstone aquifers by reversing the natural processes which deposited the uranium.

To be mined in situ, the uranium deposit must occur in permeable sandstone aquifers. These sandstone aquifers provide the "plumbing system" for both the original emplacement and the recovery of the uranium. The uranium was emplaced by weakly oxidizing ground water which moved through the plumbing systems of the geologic formation. To effectively extract uranium deposited from ground water, a company must first thoroughly define this plumbing system and then design well fields that best fit the natural hydro-geological conditions.

Detailed mapping techniques, using geophysical data from standard logging tools, have been developed by uranium companies. These innovative mapping methods define the geologic controls of the original solutions, so that these same routes can be retraced for effective in situ leaching of the ore. Once the geometry of the ore bodies is known, the locations of injection and recovery wells are planned to effectively contact the uranium. This technique has been used in several thousand wells covering hundreds of acres.

Following the installation of the well field, a leaching solution (or lixiviant), consisting of native ground water containing dissolved oxygen and carbon dioxide, is delivered to the uranium-bearing strata through the injection wells. Once in contact with the mineralization, the lixiviant oxidizes the uranium minerals, which allows the uranium to dissolve in the ground water. Production wells, located between the injection wells, intercept the pregnant lixiviant and pump it to the surface. A centralized ion-exchange facility extracts the uranium from the lixiviant; the barren lixiviant, stripped of uranium, is regenerated with oxygen and carbon dioxide and recirculated for continued leaching. The ion-exchange resin, which becomes 'loaded' with uranium, is then stripped, or eluted. Once eluted, the ion-exchange resin is returned to the well field facility.

During the mining process, slightly more water is produced from the ore-bearing formation than is reinjected. This net withdrawal, or 'bleed,' produces a cone of depression in the mining area, controlling fluid flow and confining it to the mining zone. The mined aquifer is surrounded, both laterally and above and below, by monitor wells which are frequently sampled to ensure that all mining fluids are retained within the mining zone. The 'bleed' also provides a chemical bleed on the aquifer to limit the buildup of species like sulfate and chloride which are affected by the leaching process. The 'bleed' water is treated for removal of uranium and radium. This treated water is then disposed of through waste water land application, or irrigation. A very small volume of radioactive sludge results; this sludge is disposed of at an NRC licensed uranium tailings facility.

The ion exchange resin is stripped of its uranium, and the resulting rich eluate is precipitated to produce a yellow cake slurry. This slurry is dewatered and dried to a final drummed uranium concentrate.

At the conclusion of the leaching process in a well field area, the same injection and production wells and surface facilities are used for restoration of the affected ground water. Ground water restoration is accomplished in three ways. First, the water in the leach zone is removed by "ground water sweep", and native ground water flows in to replace the removed contaminated water. The water which is removed is again treated to remove radionuclides and disposed of in irrigation. Second, the water which is removed is processed to purify it, typically with reverse osmosis, and the pure water is injected into the affected aquifer. This reinjection of very pure water results in a large increment of water quality improvement in a short time period. Third, the soluble metal ions which resulted from the oxidation of the ore zone are chemically immobilized by injecting a reducing chemical into the ore zone, immobilizing these constituents in situ. Ground water restoration is continued until the affected water is suitable for its pre-mining use.

Throughout the leaching and restoration processes, a company ensures the isolation of the leach zone by careful well placement and construction. The well fields are extensively monitored to prevent the contamination of other aquifers.

Once mining is complete, the aquifer is restored by pumping fresh water through the aquifer until the ground water meets the pre-mining use.

In situ mining has several advantages over conventional mining. First, the environmental impact is minimal, as the affected water is restored at the conclusion of mining. Second, it is lower cost, allowing Wyoming's low grade deposits to compete globally with the very high grade deposits of Canada. Finally the method is safe and proven, resulting in minimal employee exposure to health risks."

ISL mining may be the wave of the future of U.S. uranium mining, or it may become an interim mining measure in areas where the geology is appropriate for ISL. Until sufficient quantities of uranium are required by U.S. utilities to fuel the country's demand for nuclear energy, ISL mining may remain the leading uranium mining method in the United States. At some point, an overwhelming need for uranium for the nuclear fuel cycle may again put ISL mining in the backseat, and uranium miners may return to conventional mining methods, such as open pit mining.

Source: http://ezinearticles.com/?What-Is-ISL-Uranium-Mining&id=183880

Sunday, 22 February 2015

Ancient Basic Tools to Green Light Laser: The Evolution of Mining

Mining is the process of extracting minerals and geological materials from the earth. Miners help recover many elements. These materials are rare in that they are not grown, agriculturally processed or artificially created. Precious metals, coal, diamonds, and gold are just some of these materials. Mining also helps man to unearth non-renewable energy sources like natural gas and petroleum, and even water. The job of miners can be difficult and risky; thanks to efficient mining equipment, the task is a lot easier now.

People of ancient times made use of the earth for many purposes. One way to make a living at the time was by mining. Equipment was not fully developed, but people managed to unearth many precious stones and different kinds of metals. They used these minerals and elements in making basic tools for hunting and warfare. High-quality flints found in masses of sedimentary rock were in demand in many parts of Europe; people used these flints as weapons during the Stone Age.

Ancient Egyptians were among the first to successfully extract minerals from the earth. Their advanced level of civilization made it possible for them to produce quality mining tools. They mined malachite and gold; malachites are green stones used for pottery and as ornaments. The Egyptians then started to quarry for minerals not found in their own soil, heading to Nubia, a part of Africa, where they used iron tools as mining equipment. That was the time when fire-setting was used to extract gold from ore: this method involved setting a fire against the rock containing the mineral, heating it, and then dousing it with water to crack it. It was the most effective mining method of the time.

The Romans also played an important part in the history of mining. They were the first to use large-scale quarrying methods, an example being the application of large volumes of water to operate simple machinery and remove debris. This was the birth of hydraulic mining.

The demand for metal increased dramatically in the 1300s. This was the time when swords, armor, and other weapons were in demand. For this reason, miners looked for more sources of iron and silver. There was also an increase in the demand for coins that caused a shortage of silver. Iron, on the other hand, was utilized in building construction. With the high value of these materials, machinery and other mining equipment came into demand in the market.

These machines and equipment were the mothers of the mining tools that we have today. Miners today use bulldozers, explosives and trucks. More advanced mining tools include green light lasers serving as saw guides and machine alignment aids. With all this modern equipment, miners now have a safer and faster way to break down rocks and even carve out mountains. All these tools are produced and applied using the principles of engineering.

As of today, there are five major mining categories: coal mining, metal ore mining, non-metallic mineral mining, and oil and gas extraction. Oil and gas extraction is among the biggest industries in the world today.

Source: http://ezinearticles.com/?Ancient-Basic-Tools-to-Green-Light-Laser:-The-Evolution-of-Mining&id=6768619

Thursday, 19 February 2015

Online Retail - Mining for Gold

Online retailers live in an ever-changing environment, and the ability to stay competitive is the difference between doing well and doing nothing. In today's fast paced internet market place, if you aren't using web scraping, you are missing a key component to growing your business.

Data Mining

Data mining your competition's prices and services, and making sure your own are similar or even lower, is what makes the difference. Why should your customer choose you if they can get the same product somewhere else for less? What data you collect, and how often you update it, is another key ingredient of success.

Extract Website Data

Web scraping allows you to gather information from your competition and use it to improve your position in the market. When you extract website data from a competitor's website, it allows you to conduct business from a position that doesn't involve guesswork. The internet is an environment that is constantly being updated and changed, so it is vital to have up-to-date information on what others in your market are doing. If you can't do this, you really can't compete.

Application of Information

When you know what your competitors are doing all the time, you can keep your business a little more competitive than they are. When you have information such as monthly and even weekly price variations in the market and what products and services are being offered, you can apply that information to your own pricing matrix and ensure a competitive edge in your market.
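As a small illustration of how such scraped data might feed a pricing matrix, here is a hypothetical Python sketch (the products and prices are invented):

# Hypothetical sketch: compare your catalogue against scraped competitor
# prices and flag items where you are being undercut.
my_prices = {"widget": 19.99, "gadget": 34.50, "gizmo": 9.75}
competitor_prices = {"widget": 18.49, "gadget": 36.00, "gizmo": 9.75}

for product, mine in my_prices.items():
    theirs = competitor_prices.get(product)
    if theirs is None:
        continue  # competitor doesn't stock this item
    if theirs < mine:
        print(f"{product}: undercut at {theirs:.2f} (ours: {mine:.2f})")
    else:
        print(f"{product}: competitive at {mine:.2f} (theirs: {theirs:.2f})")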

An Army of One

Web scraping gives you the ability to see what is going on in the market at all times. You can monitor just about anything you choose with a web scraping service. Many online retailers are very small operations and they don't have the resources to constantly monitor each competitor's website - so engaging a web scraping service is like having your own marketing and research team working for you night and day to keep tabs on them. You choose what it is you want to know, and your research team goes to work. Simple.

Staying Ahead of Trends

Having the ability to recognize trends is key to any business, especially on the internet where information is so fluid. The business that can identify a trend quickly and take advantage of it will always stay one step ahead. That's why big corporations have teams dedicated to researching market trends and predictions. If you can see where something is going, you can always get ahead of it. That's what web scraping can help you do: identify those trends in your market so you can get in ahead of the pack.

A Helping Hand

Sometimes running your own online retail business can be a daunting and lonely ordeal. Even those that have a great deal of experience with the internet can feel lost at times. A web scraping service is a tool you can use to help yourself in such times. Web scraping is automated and precise, and it gives you the ability to have vital information delivered to you in a manner you can understand and use. It's one less thing to worry about, and the information you get from data mining is what every business owner actually should worry about: what the competition is doing. With a web scraping service, you can concern yourself with other things, like making more profits.

Source: http://ezinearticles.com/?Online-Retail---Mining-for-Gold&id=6531024

Wednesday, 18 February 2015

Commercial Kitchen Ventilation and Extraction - What You Need to Know

There are a number of things to consider when installing commercial kitchen ventilation, and there are several different types of systems available, but all must comply with the "Standard for kitchen ventilation systems DW172". A commercial kitchen cannot operate effectively without a properly designed and functioning ventilation system, and getting the design of the correct system for YOUR premises can be complex. All systems are operation- and site-specific: how you move the air, where you move it to, and what you have to do with it to ensure compliance not only with the relevant legislation but also with any local building and environmental constraints.

The factors that may need to be addressed include not only physically moving the air, but also heat, humidity, smoke, fire, grease and odour. There are various filter and safety systems available that deal with any or all of these issues, and the best system for you will depend on your site, its surroundings and your budget. You may also have to deal with noise from the fan(s) and any planning issues relating to external ducting.

In basic terms, a ventilation system comprises a canopy over the production area, a filter bank within the extraction canopy, and a fan linked by ducting which draws the air out to the external exhaust point. The fan is sized in direct relation to the amount of air that has to be moved, where it has to be moved to (the exhaust point) and how quickly (depending on the type of food being cooked).

In addition, mechanical provision must be made to replace 85% of the air that is being extracted. This is called "make-up air"; the other 15% is made up by natural means such as general kitchen areas and windows.
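The split is simple arithmetic. A quick Python sketch, assuming an example extraction rate of 2.5 m³/s (the figure is invented, not taken from DW172):

# Worked example of the 85% / 15% make-up air split described above.
extraction_rate = 2.5                        # m³/s extracted by the canopy fan
mechanical_make_up = 0.85 * extraction_rate  # must be supplied mechanically
natural_make_up = 0.15 * extraction_rate     # infiltration via kitchen areas, windows

print(f"Mechanical make-up air: {mechanical_make_up:.2f} m³/s")
print(f"Natural make-up air:    {natural_make_up:.2f} m³/s")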

Within the design, careful consideration must also be given to ensure adequate access for cleaning of the duct and servicing of the fans.

If the production equipment is gas, in accordance with British Standard (BS6173) you will have to fit a Gas Interlock system. This system automatically shuts off the gas supply to the cooking equipment in the event of a failure in the ventilation system.

You may also want to consider the installation of a heat recovery unit, which reclaims the heat (and some of the fuel cost) that would normally be blasted straight out through your extraction canopy.

Source: http://ezinearticles.com/?Commercial-Kitchen-Ventilation-and-Extraction---What-You-Need-to-Know&id=6438003

Friday, 13 February 2015

Websites Can Contractually Restrict Third Party Scraping of Their Data

E-commerce service providers can contractually prevent other websites from copying factual information from their website for commercial use, such as for price comparison purposes.

On 15 January 2015, the Court of Justice of the European Union (CJEU) confirmed in a preliminary ruling that websites not protected by a database right are free to impose contractual restrictions on the use of their data. Interestingly, the CJEU acknowledged that such contractual restrictions could, if national law permits, be imposed through the website's terms and conditions.

Let's have a quick look at how this matter arose. Since the early days of online reservations, some websites discovered that they could attract a lot of visitors by comparing the online prices displayed by e-commerce websites selling competing goods and services. Originally such third party websites were called "content aggregators"; today one particular type, the so-called "price comparison" website, is widely known. To be able to aggregate such content and create added value for the consumer, these websites use automated software that visits the e-commerce websites and copies the latter's pricing information in real time. This practice is often referred to as "screen scraping" and frequently occurs in the online travel reservation business. Some of these third party websites not only show the compared prices of airline tickets but also act as an intermediary for booking travel packages, including car and hotel rental services on top of the airline ticket, often after adding a commission.

In response, low-cost airlines quickly started taking legal action against such screen scraping practices, fearing both the loss of these additional, revenue-generating services to third party websites and the reputational damage suffered when consumers were not properly informed about issues such as flight changes and cancellations. It was in these circumstances that a case between the low-cost airline Ryanair and the third party website owner PR Aviation BV led the Dutch Supreme Court to make a preliminary ruling request to the CJEU.

The CJEU, in its preliminary ruling on the scope of database protection and contractual freedom, ruled in Ryanair's favour. It concluded that, in the absence of any database-related copyright or sui generis protection on Ryanair's website, Ryanair was expressly allowed to lay down contractual limitations on the use of its website by third parties. Ryanair would not have had such contractual freedom if its database enjoyed copyright or sui generis database protection (due to the restriction laid down in Article 15 of the Database Directive 96/9/EC). Ryanair's terms and conditions, to which users had to visibly agree when searching for flights (but without needing to explicitly tick a box), indeed stated that the use of any automated system or software to extract data from its website for commercial purposes was prohibited. Ryanair even went as far as to explicitly state that other websites could not sell its flights and that price comparison websites had to enter into a written licence agreement with Ryanair to access Ryanair's price, flight and timetable information for the sole purpose of price comparison.

As a consequence of the CJEU's ruling, any website making available mere factual information not protected by any legal right can still prevent others from using that information through its terms and conditions. Clearly, the website will have to demonstrate under the applicable (national) law that the visitor is contractually bound, in particular because the visitor validly agreed to those terms and conditions. Depending on the applicable law, such agreement by the consumer could be considered as having taken place by ticking a box or merely after having been made aware of the website's terms and conditions.

The CJEU’s ruling is likely to impact upon the business model of a number of content aggregating/price comparison websites. The ruling’s concrete relevance, however, will have to be assessed on a case-by-case basis.

Source: http://www.timelex.eu/en/blog/detail/websites-can-contractually-restrict-third-party-scraping-of-their-data

Monday, 9 February 2015

Top Tips for Data Mining Success

You may have tried data mining before, but you seem to be lost in a maze of confusion, data overload, and a number of strange terms and icons. Do not fret; you are not alone. There are a number of first-timers in the same boat as you. Stop, refocus and start all over again with the following tips in mind.

It is important that the data mining procedure be handled properly. Easy as it may sound, it only brings in great results when it is placed in expert hands and done according to the right patterns and processes. This is not to say that data mining is only successful for a gifted and trained few. It means serious consideration, preparation and training must be part of the groundwork before embarking on it.

The most practical and tested tips are: know your desired outcomes; set expectations; assign the right personnel; avoid data dump; create a deployment scheme; develop a maintenance plan.

Know your desired outcomes

As the proprietor of your business, you of all people should have a clear view of what you really want for it. Thus, before trying the new strategies and techniques that are recommended to you, you must know what your desired outcomes are. For instance, if your business is in real estate, you must be able to foresee which direction your market should go. Are you going up with skyscrapers or out towards the horizon in the countryside? From that broad view, you go to the specifics and clearly spell out what you want and where it should be.

Set expectations

In connection with identifying your outcomes, you must also set realistic and attainable expectations. These are the very things that preclude possible obstacles and frustrations in the coming years. You can see where your business is going by web research or data mining. You can see the past and present of your competitors and you can also set your own future based on the experiences of others. It is often wise to set expectations that you have not attained before. It is like plowing and preparing the ground because you know rain is coming and it is the right time to plant and gain great harvest.

Assign the right personnel

When you find the right person as well as the right data mining service, you can cut short tiresome planning, devising and preparation. If you are in a small enterprise, you can spearhead the procedure but if you have enough staff at your disposal, choose one who is not only knowledgeable but also reliable and dedicated. You do not want someone who is only a good starter and one who would leave you hanging when the going gets tough.

Avoid data dump

Being sure of what you want can help you avoid unnecessary data. Data mining, like real mining, is about knowing where the gold is and getting it out in the most efficient and effective way. Being able to identify legitimate sites and reliable, well-researched information is the shortcut to finding the right and exact data. It would be a waste of time and effort to aimlessly open and click on dubious and ambiguous websites; there are a lot of links that lead only to more links and simply make money out of others' ignorance.

Create a deployment scheme

Like any other venture, you must be able to delegate the tasks as well as the information that you gather. Since you are not superhuman, learn to seek the assistance of others and be sure that you know whom to trust. In addition, you must classify and segregate the needed materials so that they are easy to locate and analyze. In other words, order and proper organization are another key to success in data mining.

Develop a maintenance plan

Finally, along with orderliness and efficiency, you must see to it that you have an effective maintenance plan. What to do with old data and where to store the vital data are concerns that need to be considered too. In addition, there is a need for a watchdog throughout the duration of your business venture. This will not only assure the security of your data but also keep you on healthy and solid ground. Such maintenance can be both a cleaning and a healing spot for your business's overall life and sustainability.

So much can be said about how to go about your business using data mining, but there is a factor that is uniquely your own. Above and beyond all these techniques and strategies, trust your instincts. You are the best judge of your desires and actions; thus, you must spend time alone in reflection, contemplation and retrospection. Being silent and alone can make you see things that are missed among all the movement and noise. Once in a while, leave the scene and look objectively at your work. Remember, there is wisdom in alienation and objectivity.

Source: http://www.loginworks.com/blogs/web-scraping-blogs/213-tips-for-data-mining-success/

Monday, 26 January 2015

Catching Online Content Scrapers

Content scrapers are all over the Internet. They steal your content and use it for their own blogs without your permission. Some scrapers merely copy the content from your blog, but many take content and present it as their own.

It is very disconcerting to see your content appear, word for word, on someone else's website when you know that you had absolutely nothing to do with it (aside from actually writing the content) and certainly did not give anyone permission to use your content without proper (or any) attribution. On the other hand, if a person doesn't change your article, gives you credit and links back to your original article, that is okay.

Catching content scrapers in the act

Most likely, you don't even know where to begin when it comes to figuring out exactly who is stealing your content. There are several websites that will help you to reveal exactly who is doing you wrong.

Copyscape: Copyscape is a search engine in which you can put the full URL of where your content lives and it will let you know if and where there are duplicates. Copyscape has a search function that won't cost you anything. If you prefer their premium service, it will allow you to check up to 10,000 pages.

WordPress trackbacks: You can see when someone includes your content in their blog. If they don't change the article, give you the credit and link to the original article, that is fine; this is not scraping. If the person puts their own name on your article, however, it can be considered plagiarism.

Webmaster Tools: If you go to Webmaster Tools, look under "Your site on the web" and then click on "Links to your site"; columns will appear with linked pages. From this, you can see that a website that isn't a social media website, a social bookmarking website or a loyal fan, yet links to a large number of your posts, is very possibly a content scraper. If you want to verify this, you should go to those particular websites: click on any of the domains to see the details of specifically which pages on your website they are linking to.

Using Google Alerts: If you don't post a high volume of content and you aren't able to pay constant attention to who mentions your business and how often, you can create a Google Alert that matches the titles of your posts verbatim. You do this by putting quotation marks around the titles. You can set it up so that the alerts come to you automatically every day.

Once you have established that your content is being scraped, you can still get credit for the posts that have been taken. If you use WordPress, you can try the RSS Footer plugin, which lets you put your text (or at least a portion of it) at the top or bottom of your RSS feed. An attribution line will appear with your title, you as the author and a list of social media channels where people can connect with you. This is an excellent way to counteract the fact that your content is being stolen and still get something for your business. That scenario is a lot better than being a sitting duck while scrapers come along and take from you whatever they wish.
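Outside WordPress you can get the same effect by post-processing your feed before serving it. A minimal Python sketch (the file names and footer wording are placeholders; this is not the RSS Footer plugin itself):

# Minimal sketch: append an attribution line to every item in an RSS 2.0
# feed so that scraped copies carry a link back to the original post.
import xml.etree.ElementTree as ET

SITE = "https://example-blog.com"          # hypothetical site URL

tree = ET.parse("feed.xml")                # your generated RSS feed
for item in tree.getroot().iter("item"):
    title = item.findtext("title", default="This post")
    link = item.findtext("link", default=SITE)
    desc = item.find("description")
    if desc is not None:
        desc.text = (desc.text or "") + (
            f'<p>{title} appeared first on <a href="{link}">{SITE}</a> '
            "and may not be copied to other sites.</p>"
        )

tree.write("feed_with_footer.xml", encoding="utf-8", xml_declaration=True)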

Putting a stop to content scrapers

If other people stealing (or scraping) your content is abhorrent to you, there are a few effective things that you can do to combat it. The first is to communicate with the website that is stealing from you and basically give them a cease-and-desist notice. You can communicate through a contact form on their website, if they have one, or you can send an email if an email address is available. If there is no contact form on the website, you can go to a Whois lookup and find out who owns that particular domain. If it isn't registered privately, you should at least be able to find an email address for the administrator. There are ways of finding out the information you need in order to make contact. Another thing you can do is visit the DMCA and click on their "takedown services," which will allow you to act against anyone who is stealing your content.

Conclusion

Content scraping is highly unethical, but it is done all of the time. Not everyone is as adept as others at producing large quantities of content. That is when the content scrapers get creative: if they aren't capable of writing the content themselves, they will just take what they want from other people. As a genuine, hardworking content writer, you have a right to protect yourself and your business's interests, and you should fight back in whatever way you feel you must. Content scraping is very easy to do, but ease isn't the point; doing the right thing is. There are many available tools to help you determine whether your content is being stolen, and it behooves you to make full use of them.

We are pleased to provide you with the insightful comments contained herein. For a free assessment of your online presence, let's have coffee.

Carolyn T. Cohn is the Chief Editor of CompuKol Communications. Mrs. Cohn has a wealth of experience in managing people and projects. She has run several editorial departments for various companies. Mrs. Cohn has 25 years of editorial experience and her expertise covers a wide range of media, such as online editing, editing books, journal articles, abstracts, and promotional and educational materials. Throughout her career, Mrs. Cohn has established and maintained strong relationships with professionals from a wide range of companies. The principle that governs her work is that all words need to be edited.

Source: http://ezinearticles.com/?Catching-Online-Content-Scrapers&id=7747976

Wednesday, 21 January 2015

How to Take Advantage of Content Scrapers

This is our approach to dealing with content scrapers, and it has turned out quite well. It helps our SEO as well as helping us make a few extra bucks. The majority of scrapers use your RSS feed to steal your content. So these are some of the things that you can do:

•    Internal Linking – You need to interlink the CRAP out of your posts. With the Internal Linking feature in WordPress 3.1, it is now easier than ever. When you have internal links in your article, it helps you increase pageviews and reduce bounce rate on your own site. Secondly, it gets you backlinks from the people who are stealing your content. Lastly, it allows you to steal their audience. If you are a talented blogger, then you understand the art of internal linking: you have to place your links on interesting keywords and make it tempting for the user to click. If you do that, then the scraper's audience will click on them too. Just like that, you took a visitor from their site and brought them back to where they should have been in the first place.

•    Auto Link Keywords with Affiliate Links – There are a few plugins, like Ninja Affiliate and SEO Smart Links, that will automatically replace assigned keywords with affiliate links (a sketch of this technique follows the list). For example: HostGator, StudioPress, MaxCDN, Gravity Forms << these will all be auto-replaced with affiliate links when this post goes live.

•    Get Creative with RSS Footer – You can use either the RSS Footer plugin or the WordPress SEO by Yoast plugin to add custom items to your RSS footer. You can add just about anything you want here. We know some people who like to promote their own products to their RSS readers, so they add banners; guess what, now those banners will appear on the scrapers' websites as well. In our case, we always add a little disclaimer at the bottom of our posts in our RSS feeds. It simply reads like “How to Put Your WordPress Site in Read Only State for Site Migrations and Maintenance is a post from: WPBeginner which is not allowed to be copied on other sites.” By doing this, we get a backlink to the original article from the scraper's site, which lets Google and other search engines know that we are the authority. It also lets their users know that the site is stealing our content. If you are good with code, then you can totally go nuts, such as adding related posts just for your RSS readers, and a bunch of other stuff. Check out our guide to completely manipulating your WordPress RSS feed.
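The keyword-to-affiliate-link replacement described in the second bullet is easy to emulate outside those plugins. A minimal Python sketch (the keyword map and URLs are invented):

# Minimal sketch of auto-linking keywords: replace the first occurrence
# of each assigned keyword in a post with an affiliate link.
import re

affiliate_links = {                        # hypothetical keyword -> URL map
    "HostGator": "https://example.com/aff/hostgator",
    "StudioPress": "https://example.com/aff/studiopress",
}

def auto_link(post_html: str) -> str:
    for keyword, url in affiliate_links.items():
        # Whole-word match; link only the first occurrence so the post
        # doesn't turn into a wall of links.
        pattern = re.compile(rf"\b{re.escape(keyword)}\b")
        post_html = pattern.sub(f'<a href="{url}">{keyword}</a>',
                                post_html, count=1)
    return post_html

print(auto_link("We host with HostGator and build on StudioPress themes."))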

Source:http://www.wpbeginner.com/beginners-guide/beginners-guide-to-preventing-blog-content-scraping-in-wordpress/