Entries Tagged 'Black Hat Seo' ↓
June 29th, 2007 — Black Hat Seo, Google, Google Bowling, Link Spamming
It's funny how there's this small circle of SEO bloggers where everyone knows each other, so whenever the mainstream media swoops in, that's something noteworthy.
The gem of the note is that Matt Cutts finally admits that Google Bowling exists, or as he prefers to call it, Search Engine Bowling.
June 27th, 2007 — Black Hat Programming, Black Hat Seo, Black Hat Seo Tools, Footprints
So I’m chatting with Alex, the guy I talked about here, about his synonymizer tool:
[12:36] alexf: Even better, if settings for the rewrite of each article could be randomly variated (between certain ranges) to avoid having all the articles rewritten with the same parameters.
[12:36] alexf: but why you need this?
[12:37] alexf: if you set good settings any text will be rewritten different
[12:37] Q2hl: i dont know, im a fan of random
[12:37] alexf: even if you trying to rewrite same article twice – second copy will be different from first
[12:38] Q2hl: yes but dont you think rewriting all articles with same amount of nouns, adje, etc. will end up sharing some common pattern
[12:38] alexf: no
[12:38] Q2hl: in the sentence structure
[12:39] alexf: no, not at all
[12:39] alexf: first of all it depends on the source of the articles, if they are from different sources, synonymizer can’t make them look more like each other
[12:40] alexf: second – percentage of “mutation” only affects the quality of mutation
[12:41] alexf: the less percent – the better quality, article looks more like original, no silly mistakes
[12:42] alexf: but to make article look different from source, you need to put high % of mutation
[12:45] Q2hl: so for example
[12:46] Q2hl: if you had settings x y and z settings all the articles would have some degree of difference with original articles
[12:47] Q2hl: if you had settings x-3 y+5 and z-6 the degree of difference with the original article would be different
[12:48] Q2hl: meaning if your settings vary from article to article, your articles would have some variation in respects to legibility
[12:49] alexf: this is not necessary to make article look different
[12:49] alexf: even with same x y z it still will look very different
[12:50] alexf: because synonyms itself will be different each time
[12:51] Q2hl: I can’t help thinking that randomizing these settings would add another layer of differentiation, but Im just stubborn
[12:51] Q2hl: do you mind if I post this conversation in the blog?
[12:52] alexf: sure, no problem
Why do I post this conversation? Not because I am trying to make Alex include a feature that he considers unnecessary. Truth be told, I trust his judgement more than mine.
I just thought the discussion brought up interesting points. Here's an example of what he means:
The quick brown fox skillfully jumps over the lazy dog
- nouns: fox, dog
- verbs: jumps
- adverbs: skillfully
- adjectives: quick, brown, lazy
Now let’s change 100% on each variable:
Output 1: The fast red wolf rapidly dances over the sleepy can
Now let’s change 100% on each variable except for the adjectives where we’ll change 33%:
Output 2: The speedy brown four legged animal rightly runs over the lazy pet
Now here's the question we were talking about. Output 1 is 100% different from the original sentence, while Output 2 keeps two of the three original adjectives. The trade-off:
- Output 1 is more unique and less readable.
- Output 2 is more readable and less unique.
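To make the mechanics concrete, here is a minimal sketch of per-POS synonym replacement in the spirit of Alex's tool. This is not his code: the synonym table is a toy stand-in I made up, and a real tool would use proper POS tagging and large dictionaries. The per-POS rates are the "x y z" settings from the chat above.

```python
import random

# Toy synonym table keyed by part of speech -- a hypothetical stand-in
# for the real dictionaries a synonymizer would ship with.
SYNONYMS = {
    "noun": {"fox": ["wolf"], "dog": ["pet", "hound"]},
    "verb": {"jumps": ["leaps", "dances"]},
    "adverb": {"skillfully": ["deftly", "rapidly"]},
    "adjective": {"quick": ["fast", "speedy"], "brown": ["red"], "lazy": ["sleepy"]},
}

# Per-POS replacement rates: 1.0 = rewrite every word of that type
# (unique but risky), 0.33 = rewrite roughly one in three (readable).
RATES = {"noun": 1.0, "verb": 1.0, "adverb": 1.0, "adjective": 0.33}

def pos_of(word):
    """Look up which POS bucket a word belongs to, if any."""
    for pos, table in SYNONYMS.items():
        if word.lower() in table:
            return pos
    return None

def spin(sentence, rates, rng=random):
    """Replace each known word with probability rates[its POS]."""
    out = []
    for word in sentence.split():
        pos = pos_of(word)
        if pos and rng.random() < rates.get(pos, 0):
            out.append(rng.choice(SYNONYMS[pos][word.lower()]))
        else:
            out.append(word)
    return " ".join(out)

print(spin("The quick brown fox skillfully jumps over the lazy dog", RATES))
```

Because each replacement is an independent random draw, two spins with identical x y z settings still come out different — which is exactly Alex's point.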
Deciding the amount of readability vs. uniqueness that you are going to give to your content is one of the basic decisions you need to make when planning your Black Hat Strategies.
At first glance I thought being able to randomize this decision for each rewritten article was a good idea. Now I can see Alex's point and agree that it's not necessary. I can't really see a benefit to having one batch of sites with some pages filled with less readable, more unique content and other pages with the opposite mix.
I just thought the whole thing was worth a post because it makes us think about this very basic element of content rewriting. So the million dollar question is:
Do you want unique, less readable content? Or less unique, more readable content?
I think that debate deserves a whole new post, or maybe even series of related posts.
June 27th, 2007 — Black Hat Programming, Black Hat Seo, Black Hat Seo Tools
Alex offers a free synonymizer. Don't get intimidated by the Russian characters; when you sign up you have the option to switch to English.
Alex is one very smart gentleman who also wears a black hat and offers a range of Black Hat SEO Tools specifically designed to generate doorway websites using computer generated content that will pass dupe filters.
His synonymizer has some features I haven't seen before: you can define the percentage of nouns, verbs, adverbs and adjectives that you want to replace. You can also define the minimum number of characters a word must have to be replaced. It's not meant for manual rewriting, but for quick and dirty automatic rewriting it works very well.
I tried it and it’s spitting out results that look about 90% different from each other on each spin.
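If you want to sanity-check that "90% different" figure yourself, a crude word-set overlap score is enough for a quick look (this is only a rough proxy; it is not how any search engine's dupe filters actually work). The two spins below are the example outputs from the post above.

```python
def difference(a, b):
    """Rough uniqueness score: 1 minus the Jaccard similarity
    of the two texts' word sets (0 = identical, 1 = no overlap)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    if not (wa or wb):
        return 0.0
    return 1 - len(wa & wb) / len(wa | wb)

spin1 = "The fast red wolf rapidly dances over the sleepy can"
spin2 = "The speedy brown four legged animal rightly runs over the lazy pet"
print(round(difference(spin1, spin2), 2))  # → 0.89
```

Run it across every pair in a batch and you get a quick read on how much footprint your spun articles share.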
- It would be nice to be able to import a bunch of articles and have them all automatically rewritten.
- Even better, if settings for the rewrite of each article could be randomly variated (between certain ranges) to avoid having all the articles rewritten with the same parameters.
Well done again, buddy.
June 27th, 2007 — Black Hat Seo, Doorways, Google
Straight from Threadwatch, noticed by one of the most insightful SEO guys in the industry:
Wow Google what a great algo you’ve got and what a really terrific user experience you’re providing when you occupy 9 out of the top 10 results for a query with your own sites.
It’s funny how the Big G says one thing and does the opposite.
I tend to agree with Graywolf when he claims, in the thread discussion, the following:
[...] but it's the tip of the iceberg used to make a point. Google engineers do site review clinics at conferences and scold people who have lots of domains with very similar if not identical content. However, in practice they do exactly what they tell everyone else not to do. They publish guidelines about how not providing original content creates a bad user experience, but again do exactly what they tell everyone else not to do. They keep preaching it's all about the user experience and providing great results, yet for a search term that they describe as "volcanic" in volume they provide almost no value.
Have you ever seen what happens in the serps when you create a piece of linkbait on a fresh domain? All those pages from Netscape, Digg, Delicious and Reddit are likely to rank higher than your own domain. Even though they are using duplicate content.
Congratulations assholes, you managed to consolidate an algorithm that disfavours the small guys who play fair. And then you whine that the number of Black Hat SEOs is on the rise.
June 26th, 2007 — Black Hat Programming, Black Hat Seo, Footprints, From 0 to 100 a day - Case Study 1
It took longer than I wanted, but it is totally worth it to take your time and create your own version of a publicly available script. I now have my own content sources and I'm pulling them into YACG pages through a hook that a kick ass programmer I know put together. It only took him a couple of hours, but it is going to make a huge difference setting my sites apart from the rest of the YACG bunch. (BTW dude, get a blog up so I can link to you.)
If you want something similar done to your version of YACG, RSSGM or MyGen, or any generator you are using, drop a comment or get in touch with me. I can hook you up with him and you might be able to convince him to code something for you.
Anyways, I’m almost set for action. I just have to look at what domains and hosting I have available and go build some shit.
June 20th, 2007 — Black Hat Seo, Deindexing, Footprints, From 0 to 100 a day - Case Study 1, Indexing, Link Spamming, Templates, The Black Hat Monkeys Community, YACG
Here’s the short version of the post below: remove your fucking footprints.
Glad we got that out of the way. Now, for those with nothing better to do, keep reading:
So you get off your wheels and you grab a copy of YACG (after signing up for their private forum). You look so cool. You grab a list of a few hundred keyword phrases and you're done. Sit back, plug in your link spammer of choice and watch the money roll in.
Actually, wrong. But thank you for playing. You are now part of the Black Hat Monkeys Community.
You need to remove footprints.
Ah yes, I hear your smart brains going: I need to create my own template. So off you go, you get off your wheels again, still looking cool. Create your own template, plug it into RSSGM, MyGen or YACG, chuck in a list of a few hundred keywords, etc., etc.
Now you DO sit back and watch the money roll in.
For crying out loud: are you freaking insane? How can you expect your site to last more than 24 hours in Google when you are using non-Markoved, wiki-scraped content, just like the other 99% of the Black Hat Monkeys Community? Go read a book, get drunk, run a marathon or two, and see if you come up with any ideas. Learn all about RSS, Content Translation, Synonym Replacement, Markov Filters, Syntax Rules and other Language Mutation Techniques.
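For the curious, the Markov idea boils down to this: learn which words follow which in the source text, then walk those transitions randomly to emit text that shares vocabulary but not sentence structure with the original. A minimal word-level sketch (the source string is just a toy; real use would feed in scraped articles):

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each word (or word tuple) to the words observed after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=12, rng=random):
    """Random-walk the chain, starting from a random key."""
    key = rng.choice(list(chain))
    out = list(key)
    for _ in range(length - len(key)):
        nxt = chain.get(tuple(out[-len(key):]))
        if not nxt:  # dead end: word never followed by anything
            break
        out.append(rng.choice(nxt))
    return " ".join(out)

source = ("the quick brown fox jumps over the lazy dog and the quick "
          "red fox runs past the sleepy dog in the quiet yard")
chain = build_chain(source)
print(generate(chain, rng=random.Random(7)))
```

Higher `order` values copy longer runs from the source (more readable, less unique) — the same readability-vs-uniqueness dial as synonym replacement.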
Think for yourself. Read this post. Think about it. Now read this one. Now you go and read this blog for 6 months and then come back. You may be a couple of steps closer to becoming a successful Black Hat SEO.
If you are not a coder, it's OK to LOPC (leverage other people's code). It is NOT OK, however, to do the same as everyone else, unless you don't have a problem with getting your sites banned for leaving evident footprints.
June 16th, 2007 — Black Hat Seo, Deindexing, Footprints, Templates
As we mentioned, creating your own templates is a must when it comes to generating Black Hat Sites. It is a basic step to remove footprints that could otherwise get you banned from the engines.
Today is Saturday, and I don't really like to work Saturdays. But I think I'll manage to make some time to create a basic template and start cranking out sites.
Soon I'll talk about how to save yourself some time and create a gazillion templates automatically. This will drastically improve your automation, which is one of the key components of any Black Hat Business Plan.
June 15th, 2007 — Adsense, Black Hat Seo, Deindexing, Footprints, From 0 to 100 a day - Case Study 1, Google, Indexing, Link Spamming
We talked about going from $0 to $100 a day mass generating information rich sites. However: how do you make sure your sites are earning today and also tomorrow, and the day after, and the day after that, and next Christmas?
If you want your income to last longer than it takes to take a piss, then you need your sites to stay in the index forever. Well, maybe not forever, but at least 10 times as long as it takes you to generate and promote them. Less than that and we are running the rat race, and that's what we're trying to get away from, right? The 9 to 5?
So how do I keep these 10 sites in the index?
- not link spamming heavily (and keeping the link spamming niche-targeted)
- having original content (not human written content, but original)
- avoiding footprints (footprint sources are: page generator, templates, linking patterns)
- avoiding an aggressive Adsense CTR
In theory, if you achieved all of these to their full extent, your sites would stay in the index forever. But we all know how moody the big G is. Still, I think these steps will give your sites real staying power so you can sleep somewhat peacefully.
June 15th, 2007 — Black Hat Seo, From 0 to 100 a day - Case Study 1, YACG
To go from $0 to $100 a day using a free open source script to generate automated sites.
- YACG (Yet Another Content Generator) – this is the free open source script I was referring to
- Blog and Ping
- Link Spam
- Web 2.0 Sites
- Content: spun content mixed with some of YACG's content modules. Spun content will be the main focus though.
- Page Generator: as mentioned, YACG
- Templates: Original templates for each site I build
Contextual Advertising, CPA, and Lead Generation. I hear you screaming: What?? What are you talking about? Just toss big adsense blocks in there! Yes, that would be a fast track to getting your adsense account banned. I have nothing against people trying to get their adsense account banned, but I'd rather future-proof my revenue streams. We'll talk about this later on, don't worry.
I don't know yet how many sites, indexed pages and visits per day I need to get to $10 a day. I could speculate and talk a lot of bullshit, but I'd rather wait and tell you how it goes. There is no exact math here. I could make $100 a day with one site alone, with 100 or with 1,000.
Let’s just speculate the following:
- Adsense: If each site receives 100 targeted visits each day and has a CTR of 10%, that’s 10 clicks that will equate to something close to $1 a day. Then you need to have 100 sites with the same characteristics.
- CPA: There's more money to be made in CPA, but you need to know your market to offer your visitors the right kind of product, one they will want to sign up for. Some CPA offers convert at 1% and some convert at 30%. Know your audience and you'll be fine. So how many visitors do we need to make $100 a day with CPA affiliate offers? Let's assume the CTR on my pages for a specific banner or text link is 10%. This means I need to receive 1,000 visitors to send 100 potential conversions to the affiliate landing page. Now let's assume a conversion rate of 5%, so 5 of those visitors will ultimately turn into completed sales. Assuming each sale nets me $10, I made $50 out of those 1,000 visitors. That is 400% more than I would have made with adsense. Given these CTRs and conversions, and assuming the same 100 targeted visits a day per site, I would need 20 of these sites to make $100 a day.
- Lead Generation: Now let's assume that instead of trying to get the visitor to click on an adsense ad or an affiliate banner, I want to get them to give me their email address so I can build a list. Then I'll be able to email them the CPA offers they agreed to receive. I will also be able to email them links to articles displaying adsense ads and affiliate links. The beauty of this last method is that I stop depending 100% on search engine traffic, because I now have another way of reaching my audience.
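The back-of-the-envelope arithmetic above fits in a tiny calculator. The rates plugged in are the post's own assumptions (100 visits/site/day, 10% CTR, roughly $0.10 per Adsense click; 10% CTR, 5% conversion and $10 net per sale for CPA); at those numbers the CPA route works out to 20 sites for $100 a day.

```python
def adsense_daily(visits, ctr, per_click):
    """Adsense revenue: visits * CTR * payout per click."""
    return visits * ctr * per_click

def cpa_daily(visits, ctr, conversion, payout):
    """CPA revenue: visitors -> clicks to the offer -> completed sales."""
    return visits * ctr * conversion * payout

per_site_adsense = adsense_daily(100, 0.10, 0.10)  # ≈ $1 a day per site
per_site_cpa = cpa_daily(100, 0.10, 0.05, 10)      # ≈ $5 a day per site

target = 100  # dollars a day
print(round(target / per_site_adsense))  # → 100 sites on Adsense
print(round(target / per_site_cpa))      # → 20 sites on CPA
```

Swap in your own CTRs and conversion rates; the point is that a 5x difference in per-visitor value is a 5x difference in how many sites you have to build and keep indexed.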
With all this in mind, I plan to build a mix of all these types of sites until I get to the point where I am making $10 a day from them. At that point: do some maths, re-evaluate, rinse and repeat ×10.
In my next post I'll talk about avoiding going one step forward and two steps back. In other words: how NOT to get your ass banned from the Big G.
June 15th, 2007 — Black Hat Seo, Blogging
I haven't seen many blogs around focused on real case studies for people starting out in this business. For example:
How to go from $0 to $100 a day generating sites with a free open source script.
There are many blog posts around claiming that you can make such and such by doing this or that. But these are usually one-time posts lacking details, and generally the blogger didn't follow up on that lead him/herself.
So, even though the main focus of this blog will be case studies, I'll still do the occasional post about the latest Matt Cutts blog entry or something Quadszilla, Graywolf or Shoemoney wrote about, because these guys are just too good to miss. And some others too. So let's make that my very first case study to follow up on this blog: how to go from $0 to $100 a day generating sites with a free open source script.
I will be as transparent as possible taking into account that this is a public forum.