Entries Tagged 'Footprints'

Chatting With Alex

So I’m chatting with Alex, the guy I talked about here, about his synonymizer tool:

[12:36] alexf: Even better, if settings for the rewrite of each article could be randomly variated (between certain ranges) to avoid having all the articles rewritten with the same parameters.
[12:36] alexf: but why you need this? :)
[12:37] alexf: if you set good settings any text will be rewritten different
[12:37] Q2hl: i dont know, im a fan of random
[12:37] alexf: even if you trying to rewrite same article twice – second copy will be different from first
[12:38] Q2hl: yes but dont you think rewriting all articles with same amount of nouns, adje, etc. will end up sharing some common pattern
[12:38] alexf: no
[12:38] Q2hl: in the sentence structure
[12:39] alexf: no, not at all
[12:39] alexf: first of all it depends on the source of the articles, if they are from different sources, synonymizer can’t make them look more like each other :)
[12:40] alexf: second – percentage of “mutation” only affects the quality of mutation
[12:41] alexf: the less percent – the better quality, article looks more like original, no silly mistakes
[12:42] alexf: but to make article look different from source, you need to put high % of mutation
[12:45] Q2hl: so for example
[12:46] Q2hl: if you had settings x, y and z, all the articles would have some degree of difference from the original articles
[12:47] Q2hl: if you had settings x-3, y+5 and z-6, the degree of difference from the original article would be different
[12:48] Q2hl: meaning if your settings vary from article to article, your articles would have some variation with respect to legibility
[12:49] alexf: this is not necessary to make article look different
[12:49] alexf: even with same x y z it still will look very different
[12:50] alexf: because synonyms itself will be different each time
[12:51] Q2hl: I can’t help thinking that randomizing these settings would add another layer of differentiation, but Im just stubborn :)
[12:51] Q2hl: do you mind if I post this conversation in the blog?
[12:52] alexf: sure, no problem
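For the record, this is roughly what I meant by "randomly variated between certain ranges" – just a quick made-up sketch in Python, not anything that actually exists in Alex's tool, and the parameter names and ranges are invented:

```python
import random

# Made-up sketch (NOT part of Alex's synonymizer): instead of one fixed set
# of rewrite settings, pick each percentage from a range right before every
# article is rewritten.
SETTING_RANGES = {
    "nouns": (60, 100),       # % of nouns to swap for synonyms
    "verbs": (60, 100),
    "adverbs": (50, 100),
    "adjectives": (20, 60),
}

def random_settings():
    """Return a fresh set of mutation percentages for one article."""
    return {pos: random.randint(low, high)
            for pos, (low, high) in SETTING_RANGES.items()}

print(random_settings())  # e.g. {'nouns': 87, 'verbs': 64, 'adverbs': 93, 'adjectives': 41}
```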

Why do I post this conversation? Not because I am trying to make Alex include a feature that he considers unnecessary. Truth be told, I trust his judgement more than mine.

I just thought the discussion brought up some interesting points. This is what he says:

The quick brown fox skillfully jumps over the lazy dog

  • nouns: fox, dog
  • verbs: jumps
  • adverbs: skillfully
  • adjectives: quick, brown, lazy

Now let’s change 100% of each variable:

Output 1: The fast red wolf rapidly dances over the sleepy can

Now let’s change 100% of each variable except for the adjectives, where we’ll change only 33%:

Output 2: The speedy brown four legged animal rightly runs over the lazy pet

Now here’s the question we were talking about. Output 1 is 100% different from the original sentence, but Output 2 keeps a couple of the original words:

  • brown
  • lazy

Conclusion?

  • Output 1 is more unique and less readable.
  • Output 2 is more readable and less unique.
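Just to make the tradeoff concrete, here’s a throwaway Python toy that mimics the example above. It is obviously not Alex’s actual code: the synonym list is hand-made, the part-of-speech tags are hardcoded, and the percentages are the only thing it really demonstrates.

```python
import random

# Toy illustration of the example above (not Alex's tool): swap each word for
# a synonym with a probability that depends on its part of speech.
SYNONYMS = {
    "fox": ["wolf", "four legged animal"],
    "dog": ["pet", "can"],
    "jumps": ["dances", "runs"],
    "skillfully": ["rapidly", "rightly"],
    "quick": ["fast", "speedy"],
    "brown": ["red"],
    "lazy": ["sleepy"],
}
POS = {"fox": "noun", "dog": "noun", "jumps": "verb",
       "skillfully": "adverb", "quick": "adj", "brown": "adj", "lazy": "adj"}

def mutate(sentence, rates):
    """rates maps a part of speech to the % chance of swapping each word."""
    out = []
    for word in sentence.split():
        pos = POS.get(word.lower())
        if pos and random.random() * 100 < rates.get(pos, 0):
            out.append(random.choice(SYNONYMS[word.lower()]))
        else:
            out.append(word)
    return " ".join(out)

src = "The quick brown fox skillfully jumps over the lazy dog"
print(mutate(src, {"noun": 100, "verb": 100, "adverb": 100, "adj": 100}))  # like Output 1
print(mutate(src, {"noun": 100, "verb": 100, "adverb": 100, "adj": 33}))   # like Output 2
```

Run it a few times and you’ll see Alex’s other point too: even with identical settings, the picked synonyms differ on every pass.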

Deciding the amount of readability vs. uniqueness that you are going to give to your content is one of the basic decisions you need to make when planning your Black Hat Strategies.

At first glance I thought being able to randomize this decision for each individual article was a good idea. Now I can see Alex’s point and agree that it is not necessary. I can’t really see a benefit to having one batch of sites where some pages are filled with less readable, more unique content and other pages with the opposite mix.

I just thought the whole thing was worth a post because it makes us think about this very basic element of content rewriting. So the million dollar question is:

Do you want more unique, less readable content? Or do you want less unique, more readable content?

I think that debate deserves a whole new post, or maybe even a series of related posts.

Still Busy In The Workshop – Final Countdown For Blast Off

It took longer than I wanted. But it is totally worth it to take your time and create your own version of a publicly available script. I now have my own content sources and I’m pulling them into YACG pages through a hook that a kick ass programmer I know put together. It only took him a couple of hours, but it is going to make a huge difference in setting my sites apart from the rest of the YACG bunch. (BTW dude, get a blog up so I can link to you ;-) )
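I’m not going to paste the actual hook here (it’s his code, it’s PHP, and it’s specific to my setup), but the general shape of the idea looks something like this rough Python sketch – the function names, the placeholder tag and the content folder are all made up:

```python
import random

# Rough sketch only: the generator asks YOUR private content source for text
# instead of the stock scraper everyone else uses. Names and paths are
# hypothetical, this is not YACG's real API.
def my_content_source(keyword):
    """Hypothetical: pull pre-collected, pre-mutated paragraphs for a keyword."""
    with open(f"content/{keyword}.txt", encoding="utf-8") as f:
        paragraphs = [p.strip() for p in f.read().split("\n\n") if p.strip()]
    random.shuffle(paragraphs)
    return "\n\n".join(paragraphs[:5])

def render_page(keyword, template):
    """Hypothetical hook point: swap the stock content call for your own source."""
    return template.replace("{CONTENT}", my_content_source(keyword))
```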

If you want something similar done to your version of YACG, RSSGM or MyGen, or any generator you are using, drop a comment or get in touch with me. I can hook you up with him and you might be able to convince him to code something for you.

Anyways, I’m almost set for action. I just have to look at what domains and hosting I have available and go build some shit.

Do You Belong To The Black Hat Monkeys Community?

The Black Hat Monkeys Community

Here’s the short version of the post below: remove your fucking footprints.

Glad we got that out of the way :) Now, for those with nothing better to do, keep reading:

So you get off your wheels and grab a copy of YACG (after signing up for their private forum). You look so cool. You grab a list of a few hundred keyword phrases and you’re done. Sit back, plug in your link spammer of choice and watch the money roll in.

Right?

Actually, wrong. But thank you for playing. You are now part of the Black Hat Monkeys Community.

You need to remove footprints.

Ah yes, I hear your smart brains going, I need to create my own template. So off you go, you get off your wheels again, still looking cool. Create your own template, plug it into RSSGM, MyGen or YACG, chuck a list of a few hundred keywords, etc., etc.

Now you DO sit back and watch the money roll in.

Right?

For crying out loud: are you freaking insane? How can you expect your site to last more than 24 hours in Google when you are using non-Markoved, wiki-scraped content, just like the other 99% of members of the Black Hat Monkeys Community? Go read a book, get drunk, run a marathon or two, and see if you come up with any ideas. Learn all about RSS, Content Translation, Synonym Replacement, Markov Filters, Syntax Rules and other Language Mutation Techniques.
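If “Markov Filters” sounds like voodoo, here’s about the smallest possible example I can think of – a throwaway word-level Markov chain in Python. It isn’t tied to any particular generator, and the source file name is just a placeholder for whatever you scraped:

```python
import random
from collections import defaultdict

# Bare-bones word-level Markov chain: the output re-mixes the source text so
# it is no longer a verbatim copy of what you scraped.
def build_chain(text, order=2):
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=50):
    key = random.choice(list(chain.keys()))
    out = list(key)
    for _ in range(length):
        nxt = chain.get(tuple(out[-len(key):]))
        if not nxt:
            break
        out.append(random.choice(nxt))
    return " ".join(out)

scraped = open("wiki_article.txt", encoding="utf-8").read()  # placeholder: your scraped source
print(generate(build_chain(scraped)))
```

It’s a starting point, not a finished filter – but it already beats publishing the scrape word for word.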

Think for yourself. Read this post. Think about it. Now read this one. Now you go and read this blog for 6 months and then come back. You may be a couple of steps closer to becoming a successful Black Hat SEO.

If you are not a coder, it’s OK to LOPC (leverage other people’s code). It is NOT OK, however, to do the same as everyone else, unless you don’t have a problem with getting your sites banned for leaving evident footprints.

Busy in the Workshop, Building Templates

As we mentioned, creating your own templates is a must when it comes to generating Black Hat Sites. It is a basic step to remove footprints that could otherwise get you banned from the engines.

Today is Saturday, and I don’t really like to work Saturdays. But I think I’ll manage to make myself some time to create a basic template and start cranking out sites.

Soon I’ll talk about how to save yourself some time and create a gazillion templates automatically. This will drastically improve your automation, which is one of the key components of any Black Hat Business Plan.
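To give you a taste of where that post is going, here’s a quick throwaway sketch in Python. The placeholder tags ({CONTENT}, {LINKS}, {RSS}) are made up and nothing here is tied to any specific generator – the point is just that the fingerprintable bits of the markup get randomized per site:

```python
import random

# Rough sketch of the "gazillion templates" idea: randomize the bits of a
# template that crawlers could fingerprint (class names, colors, layout order)
# so no two generated sites share the exact same markup.
def random_template():
    cls = lambda: "".join(random.choices("abcdefghijklmnopqrstuvwxyz", k=8))
    color = lambda: "#%06x" % random.randint(0, 0xFFFFFF)
    blocks = ["{CONTENT}", "{LINKS}", "{RSS}"]   # made-up placeholder tags
    random.shuffle(blocks)
    body = "\n".join(f'<div class="{cls()}">{b}</div>' for b in blocks)
    return (f"<html><head><style>body{{background:{color()};"
            f"font-family:{random.choice(['Arial', 'Verdana', 'Georgia'])}}}</style></head>"
            f"<body>{body}</body></html>")

for _ in range(3):   # crank out as many as you need
    print(random_template())
```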

How NOT to Get Your Black Hat Sites Banned From Google

We talked about going from $0 to $100 a day mass generating information rich sites. However: how do you make sure your sites are earning today and also tomorrow, and the day after, and the day after that, and next Christmas?

If you want your income to last longer than it takes you to take a piss, then you need your sites to stay in the index forever. Well, maybe not forever, but I guess at least 10 times as long as it takes you to generate and promote them. Less than that and we are running the rat race, and that’s what we’re trying to get away from, right? 9 to 5s?
So how do I keep these 10 sites in the index?

  • not link spamming heavily (and keeping the link spamming niche-targeted)
  • having original content (not human written content, but original)
  • avoiding footprints (footprint sources are: page generator, templates, linking patterns)
  • avoiding an aggressive AdSense CTR

In theory, if you achieved all of these to their full extent, your sites would stay in the index forever. But we all know how moody the big G is. However, I think these steps will give your sites real staying power, so you can sleep somewhat peacefully.