Entries from June 2007

How Aristotle Can Help You Make More Money Online

Don’t worry, I won’t go into detail about Aristotle’s philosophical legacy, even though it shaped Western civilization for more than 2000 years after his death.

I’ll just say this. This dude realized that you can classify virtually anything in nature with the following categories:

  • Substance: what something is. For example, you are a gal, or a lad, and your cat is a cat, and your lappy is a freaking lappy. Don’t make me write an essay about it. You got it.
  • Accidents: these are the modifications that substances undergo in real life. OK, maybe you are a gal or a lad, but you are also brunette or blonde, thin, fat, tall, short, grumpy, smart; your lappy may be a cool iBook or a dirty old PC with an LCD screen… well, you get the idea. There are nine accidents to substance:
    • Quantity,
    • Quality,
    • Relation,
    • Action,
    • Passion,
    • Time,
    • Place,
    • Disposition (the arrangement of parts), and
    • Raiment (whether a thing is dressed or armed, etc.)

If you find this interesting, go ahead and read more about it.

I dare you to find more accidents to reality. You won’t find any :)

So can this help you? OK, let’s say you are setting up the generic articles into which you are going to inject keywords to make them keyword specific. What are you going to write about? Here’s what people will give you as an example in forums:

“I like KEYWORD very much. I would like to buy some KEYWORD product but I couldn’t find any good place to look around. Finally I found this website [insert your affiliate link here]”

Those kinds of sentences are all well and good, but they are not nearly enough to build millions of pages. You need to write a lot about something that could apply to ANYTHING you can think of.

If you are with me so far you know where this is going. Thanks much Aristotle Dude, bring on those nine accidents:

Accident 1: Quantity

How much of KEYWORD do we really need nowadays? My friends and I like big amounts of KEYWORD because we find it pleasant.

Accident 2: Quality

Which brings us to the next question. Does your KEYWORD really need to be top notch quality? I personally like imported KEYWORD because their quality is much better, but some people don’t really feel the difference.

Etc.

You didn’t expect me to go all the way with the nine accidents did you? Go on lazy ass and do your own homework.

When you go through this routine you’ll have better quality content available to cover every niche you would like to attack. This will bring spiders, indexing, rankings, and money to your pockets. And you thought Greek Philosophy was useless, did you?
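The routine above can be sketched in a few lines. The template wording and the `build_article` helper below are made up for illustration; the point is just one generic paragraph per accident, with the keyword injected into each:

```python
# A minimal sketch of the accident-based template routine: a handful of
# generic "accident" paragraphs with a keyword placeholder, filled in
# for each keyword on your list. All wording here is illustrative.

TEMPLATES = {
    "quantity": "How much {kw} do we really need nowadays? "
                "My friends and I like big amounts of {kw}.",
    "quality": "Does your {kw} really need to be top notch quality? "
               "I personally prefer imported {kw}.",
    "place": "Where can you actually find decent {kw}? "
             "Not every town has a good source of {kw}.",
}

def build_article(keyword):
    """Join one paragraph per accident, with the keyword injected."""
    return "\n\n".join(t.format(kw=keyword) for t in TEMPLATES.values())

print(build_article("coffee"))
```

Extend `TEMPLATES` with the remaining accidents and you get one generic article skeleton that works for any keyword list you throw at it.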

Look Mum I’m In The News

It’s funny how there’s this small circle of SEO bloggers where everyone knows each other, but whenever the mainstream media swoops in, that’s something noteworthy.

The gem of the note is that Matt Cutts finally admits that Google Bowling exists, or as he prefers to call it, Search Engine Bowling.

How To Become A Web Millionaire

Shout

:)

Chatting With Alex

So I’m chatting with Alex, the guy I talked about here, about his synonymizer tool:

[12:36] alexf: Even better, if settings for the rewrite of each article could be randomly variated (between certain ranges) to avoid having all the articles rewritten with the same parameters.
[12:36] alexf: but why you need this? :)
[12:37] alexf: if you set good settings any text will be rewritten different
[12:37] Q2hl: i dont know, im a fan of random
[12:37] alexf: even if you trying to rewrite same article twice – second copy will be different form first
[12:38] Q2hl: yes but dont you think rewriting all articles with same amount of nouns, adje, etc. will end up sharing some common pattern
[12:38] alexf: no
[12:38] Q2hl: in the sentence structure
[12:39] alexf: no, not at all
[12:39] alexf: first of all it depends on the source of the articles, if they are from different sources, synonymizer can’t make them look more like each other :)
[12:40] alexf: second – percentage of “mutation” only affects the quality of mutation
[12:41] alexf: the less percent – the better quality, article looks more like original, no silly mistakes
[12:42] alexf: but to make article look different from source, you need to put high % of mutation
[12:45] Q2hl: so for example
[12:46] Q2hl: if you had settings x y and z settings all the articles would have some degree of difference with original articles
[12:47] Q2hl: if youhad settings x-3 y+5 and z-6 the degree of difference with the original article would be different
[12:48] Q2hl: meaning if your settings vary from article to article, your articles would have some variation in respects to legibility
[12:49] alexf: this is not necessary to make article look different
[12:49] alexf: even with same x y z it still will look very different
[12:50] alexf: because synonyms itself will be different each time
[12:51] Q2hl: I can’t help thinking that randomizing these settings would add another layer of differentiation, but Im just stubborn :)
[12:51] Q2hl: do you mind if I post this conversation in the blog?
[12:52] alexf: sure, no problem

Why do I post this conversation? Not because I am trying to make Alex include a feature that he considers unnecessary. Truth be told, I trust his judgement more than mine.

I just thought the discussion was bringing up interesting points. This is what he says:

The quick brown fox skillfully jumps over the lazy dog

  • nouns: fox, dog
  • verbs: jumps
  • adverbs: skillfully
  • adjectives: quick, brown, lazy

Now let’s change 100% on each variable:

Output 1: The fast red wolf rapidly dances over the sleepy can

Now let’s change 100% on each variable except for the adjectives where we’ll change 33%:

Output 2: The speedy brown four legged animal rightly runs over the lazy pet

Now here’s the question we were talking about. Output 1 is 100% different from the original sentence, but in Output 2 every word changed except for:

  • brown
  • lazy

Conclusion?

  • Output 1 is more unique and less readable.
  • Output 2 is more readable and less unique.

Deciding the amount of readability vs. uniqueness that you are going to give to your content is one of the basic decisions you need to make when planning your Black Hat Strategies.
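A toy sketch of the mechanism being discussed, assuming per-part-of-speech replacement rates. The synonym table, the POS tags and the `mutate` helper are all made up for illustration; a real tool would use a thesaurus and a part-of-speech tagger:

```python
import random

# Toy per-part-of-speech synonym replacement. Each word tagged with a
# POS gets swapped for a random synonym with the probability set for
# that POS; untagged words are left alone.

SYNONYMS = {
    "quick": ["fast", "speedy"], "brown": ["red"], "lazy": ["sleepy"],
    "fox": ["wolf"], "dog": ["pet", "hound"],
    "jumps": ["leaps", "runs"], "skillfully": ["deftly", "rapidly"],
}

POS = {
    "quick": "adjective", "brown": "adjective", "lazy": "adjective",
    "fox": "noun", "dog": "noun",
    "jumps": "verb", "skillfully": "adverb",
}

def mutate(sentence, rates, rng=random):
    """Replace each known word with a synonym with probability rates[pos]."""
    out = []
    for word in sentence.split():
        pos = POS.get(word)
        if pos and rng.random() < rates.get(pos, 0):
            out.append(rng.choice(SYNONYMS[word]))
        else:
            out.append(word)
    return " ".join(out)

# 100% on nouns/verbs/adverbs, 33% on adjectives, as in Output 2 above:
rates = {"noun": 1.0, "verb": 1.0, "adverb": 1.0, "adjective": 0.33}
print(mutate("The quick brown fox skillfully jumps over the lazy dog", rates))
```

Note how the randomness Alex mentions is already there even with fixed rates: the synonym chosen for each word differs on every run.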

At first glance I thought being able to randomize this decision upon the rewriting of individual articles was a good idea. Now I can see Alex’s point and agree that it is not necessary. I can’t really see a benefit to having one batch of sites with some pages filled with less readable and more unique content, and some pages with the opposite equation.

I just thought the whole thing was worth a post because it makes us think about this very basic element of content rewriting. So the million dollar question is:

Do you want unique, less readable content? Or do you want less unique, more readable content?

I think that debate deserves a whole new post, or maybe even series of related posts.

Free Synonymizer Tool

Alex offers a free synonymizer. Don’t get intimidated by the Russian characters; when you sign up you have the option to switch to English.

Alex is one very smart gentleman who also wears a black hat and offers a range of Black Hat SEO Tools specifically designed to generate doorway websites using computer generated content that will pass dupe filters.

His synonymizer has some features I haven’t seen before: you can define the percentage of nouns, verbs, adverbs and adjectives that you want to replace. You can also define the minimum number of characters a word must have to be replaced. It’s not meant for manual rewriting, but for quick and dirty automatic rewriting it works very well.

I tried it and it’s spitting out results that look about 90% different from each other on each spin.
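If you want to put an actual number on how different two spins are (the 90% figure above is eyeballed, not computed this way), one rough approach is a word-level diff ratio:

```python
import difflib

# Word-level difference between two spins: 1.0 means the word
# sequences share nothing, 0.0 means they are identical.

def difference_ratio(a, b):
    sm = difflib.SequenceMatcher(None, a.split(), b.split())
    return 1.0 - sm.ratio()

spin1 = "The fast red wolf rapidly dances over the sleepy can"
spin2 = "The speedy brown four legged animal rightly runs over the lazy pet"
print(round(difference_ratio(spin1, spin2), 2))
```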

Feature suggestions:

  • It would be nice to be able to import a bunch of articles and have them all automatically rewritten.
  • Even better, if the settings for the rewrite of each article could be randomly varied (between certain ranges) to avoid having all the articles rewritten with the same parameters.

Well done again buddy :)

Google: Do As I Say Not As I Do

Straight from Threadwatch, noticed by one of the most insightful SEO guys in the industry:

Wow Google what a great algo you’ve got and what a really terrific user experience you’re providing when you occupy 9 out of the top 10 results for a query with your own sites.

Screenshot.

It’s funny how the Big G says one thing and does the opposite.

I tend to agree with Graywolf when he claims, in the thread discussion, the following:

[...] but it’s the tip of the iceberg used to make a point. Google engineers do site review clinics at conferences and scold people who have lots of domains with very similar if not identical content. However in practice they do exactly what they tell everyone else not to do. They publish guidelines about how not providing original content creates a bad user experience, but again do exactly what they tell everyone else not to do. They keep preaching it’s all about the user experience and providing great results yet for a search term that they describe as “volcanic” in volume they provide almost no value.

Have you ever seen what happens in the serps when you create a piece of linkbait on a fresh domain? All those pages from Netscape, Digg, Delicious and Reddit are likely to rank higher than your own domain. Even though they are using duplicate content.

Congratulations assholes, you managed to consolidate an algorithm that disfavours the small guys who play fair. And then you whine if the amount of Black Hat SEOs is on the rise.

Doorways anyone?

Still Busy In The Workshop – Final Countdown For Blast Off

It took longer than I wanted. But it is totally worth it to take your time and create your own version of a publicly available script. I now have my own content sources and I’m pulling them to YACG pages through a hook that a kick ass programmer I know put together. It only took him a couple of hours, but it is going to make a huge difference setting my sites apart from the rest of the YACG bunch. (BTW dude get a blog up so I can link to you ;-) )

If you want something similar done to your version of YACG, RSSGM or MyGen, or any generator you are using, drop a comment or get in touch with me. I can hook you up with him and you might be able to convince him to code something for you.

Anyways, I’m almost set for action. I just have to look at what domains and hosting I have available and go build some shit.

Do You Belong To The Black Hat Monkeys Community?

The Black Hat Monkeys Community

Here’s the short version of the post below: remove your fucking footprints.

Glad we got that out of the way :) Now for those with nothing better to do. Keep reading:

So you get off your wheels and you grab a copy of YACG (after signing up for their private forum.) You look so cool. You grab a list of a few hundred keyword phrases and you’re done. Sit back, plug in your link spammer of choice and watch the money roll in.

Right?

Actually, wrong. But thank you for playing. You are now part of the Black Hat Monkeys Community.

You need to remove footprints.

Ah yes, I hear your smart brains going, I need to create my own template. So off you go, you get off your wheels again, still looking cool. Create your own template, plug it into RSSGM, MyGen or YACG, chuck a list of a few hundred keywords, etc., etc.

Now you DO sit back and watch the money roll in.

Right?

For crying out loud: are you freaking insane? How can you expect your site to last more than 24 hours in Google when you are using non-markoved, wiki-scraped content, just like the other 99% of the members of the Black Hat Monkeys Community? Go read a book, get drunk, run a marathon or two, and see if you come up with any ideas. Learn all about RSS, Content Translation, Synonym Replacement, Markov Filters, Syntax Rules and other Language Mutation Techniques.

Think for yourself. Read this post. Think about it. Now read this one. Now you go and read this blog for 6 months and then come back. You may be a couple of steps closer to becoming a successful Black Hat SEO.

If you are not a coder, it’s OK to LOPC (leverage other people’s code.) It is NOT OK, however, to do the same as everyone else, unless you don’t have a problem with getting your sites banned for leaving evident footprints.

Busy in the Workshop, Building Templates

As we mentioned, creating your own templates is a must when it comes to generating Black Hat Sites. It is a basic step to remove footprints that could otherwise get you banned from the engines.

Today is Saturday, and I don’t really like to work Saturdays. But I think I’ll manage to make myself some time to create a basic template and start cranking out sites.

Soon I’ll talk about how to save yourself some time and create a gazillion templates automatically. This will drastically improve your automation, which is one of the key components of any Black Hat Business Plan.
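As a preview, here’s a rough sketch of the idea, assuming you only randomize a few cosmetic knobs. The value lists and the `make_template` helper are illustrative, not any particular generator’s method:

```python
import random

# Generate a fresh HTML template per site with randomized colors,
# fonts and class names, so no two sites share an obvious footprint.

COLORS = ["#336699", "#993333", "#339966", "#666699"]
FONTS = ["Arial", "Verdana", "Georgia", "Tahoma"]

def make_template(rng=random):
    """Return an HTML skeleton with a {content} placeholder."""
    cls = "c" + str(rng.randint(1000, 9999))  # random class name per site
    style = (f".{cls} {{ color: {rng.choice(COLORS)}; "
             f"font-family: {rng.choice(FONTS)}; }}")
    return (
        f"<html><head><style>{style}</style></head>"
        f"<body><div class=\"{cls}\">{{content}}</div></body></html>"
    )

template = make_template()
page = template.replace("{content}", "<p>Your generated article here</p>")
print(page)
```

Call `make_template()` once per site and every batch comes out with different markup fingerprints, which is the whole point of the footprint removal discussed above.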

How NOT to Get Your Black Hat Sites Banned From Google

We talked about going from $0 to $100 a day mass generating information rich sites. However: how do you make sure your sites are earning today and also tomorrow, and the day after, and the day after that, and next Christmas?

If you want your income to last longer than it takes to take a piss, then you need your sites to stay in the index forever. Well, maybe not forever, but I’d say at least 10 times what it takes you to generate and promote them. Less than that and we are running the rat race, and that’s what we’re trying to get away from, right? 9 to 5s?

So how do I keep these 10 sites in the index?

  • not link spamming heavily (and keeping the link spamming niche-targeted)
  • having original content (not human written content, but original)
  • avoiding footprints (footprint sources are: page generator, templates, linking patterns)
  • avoiding an aggressive AdSense CTR

In theory, if you achieved all these to their full extent, your sites would stay in the index forever. But we all know how moody the big G is. Still, I think these steps will give your sites real staying power so you can sleep somewhat peacefully.