Wednesday, July 27, 2011

Top Misc Content on Internet

How to Close Sole Proprietorship

Posted: 24 Jul 2011 10:00 PM PDT

Closing a business is a traumatic experience for any owner, but circumstances beyond one's control can make shutting down operations unavoidable. This article discusses the aspects involved in closing a sole proprietorship legally.

Pros and Cons of Joint Checking Accounts

Posted: 24 Jul 2011 10:00 PM PDT

Marriages may be made in heaven, but money is a touchy subject, and financial disputes lead to plenty of fights and even divorces. Every couple wants to make their marriage work and avoid these conflicts, which is why many open a joint checking account. Let's look at the pluses and minuses of this popular option.

Why Bad Writing is Essential to Good Blogging

Posted: 26 Jul 2011 06:30 AM PDT

I’ve been blogging for six years now, and in that time I’ve noticed something — anyone can do it.

At first, I thought that this was a good thing. But then I realized that every good thing has a shadow side.

So here’s the downside of the accessibility of blogging: It makes the already-terrible writers much, much louder.

There are too many bloggers out there.

How can this be a good thing for you?

For too long, the bar has been set way too low with millions of blogs contributing to the noise without adding anything substantive to the discussion.

Our fame-obsessed culture has driven teenagers and baby boomers alike to create their own blogs — all for the sake of being heard. They’re taking up space with half-formed opinions and rants, and it’s given the blogosphere an infamously bad name.

But now, there’s a new phenomenon: The prolific, mediocre blogger.

This person actually understands the basics of SEO and social media and can attract a decent readership.

The problem, though, is that their content sucks.

This probably drives you real writers completely nuts. But maybe it’s not all bad.

Here are three reasons why these awful wordsmiths can actually make you a better blogger.

1. Envy leads to action

Be honest: part of the reason why you hate these champions of mediocrity is jealousy.

Because if you’ve stuck around the Internet long enough, you’ve seen how even a terrible writer can build his own tribe.

You’ve seen spam queens go into six digits on Twitter and typo-ridden articles go viral on Facebook.

And this pisses you off (and it should).

But we need you to act, not sit back silently judging and mocking. Okay, you can judge and mock too, if you really want.

We need you to move, not lock up out of protest. We need your voice, and we need it now.

Don’t just complain. Act. Fight awful quality with excellence.

2. Competition is (always) good

Social media has, indeed, leveled the communication playing field.

Now, if you have a good story or idea, you can share it, without having to know the right people or possess the right skills.

The days of the gatekeeper are ending.

This, for the most part, explains a lot of the frustration you’re feeling. There are terrible writers out there with nothing to say, and they’re saying a lot … very poorly.

They are stealing away your readers and making them dumber by the minute.

This is actually a good thing.

It forces you to up your game, to woo your followers back to your well-crafted blog. This is not a sprint to the bottom; it’s a marathon to the top.

And those who are truly excellent in their craft and committed to finishing will win in the end.

3. Bad writers need coaches (i.e. you)

The fact that you’re an excellent writer irked by all this mediocrity may be an internal prompting to give back.

More people are blogging, because they recognize the value of building a platform. But they’re breaking the first rule of Copyblogger.

You can help them.

Look at it this way: If you’re really good at writing, you can help others become better writers. Instead of seeing these mediocre bloggers as a threat, why not view this situation as what it really is — an opportunity?

You could begin a writing consulting practice.

You could start coaching amateurs on how to stop sounding stupid and start writing like a pro.

You could help, instead of criticize.

The opportunity is there — do with it what you will.

What do you think? Does this just frustrate you further, or are there some legitimate lessons we can learn from mediocre bloggers?

About the Author: Jeff Goins is a writer and marketing guy who helps people use digital media to amplify their voice. Follow his blog or connect with him on Twitter.

Why Split-Testing is Like Sex in High School

Posted: 25 Jul 2011 06:40 AM PDT

Everybody's talking about it.

Most of it is rumor, hearsay, and innuendo …

We idolize the exploits of the people we look up to, and try to skirt the mention of our own experiences (and shortcomings).

No, I'm not talking about sex — I'm talking about split-testing.

What does split-testing have to do with sex?

Actually, quite a lot…

Let me explain.

Everybody says they're doing it …

Just like sex in high school, split-testing is all the rage.

Everyone likes to pretend they're an expert. Buzzwords and rumors abound … stories about increasing conversion rates by an order of magnitude by changing the color of a checkout button (but nobody shares the magic color!).

Most importantly, nobody wants to admit that they don't really know what they're doing, or (gasp!) have never done it themselves. Many join the conversation without wanting to let on that they don't even know what split testing is!

Let's start with a simple definition.

Split-testing, also known as "A/B testing", is an invaluable technique that compares two versions of a web page with one difference between them — say, a different headline.

Then you measure how many people take the desired action (like buying a product) on each page, to see which variation works better.
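To make the mechanics concrete, here is a minimal sketch in Python of what a split-testing tool does behind the scenes: each visitor is randomly assigned to one of the two variations, and impressions and conversions are tallied per variation. The function names and in-memory tallies are purely illustrative; real tools persist this data and keep returning visitors in the same bucket (usually with a cookie).

    import random

    # Hypothetical in-memory tallies; a real split-testing tool persists these
    # and handles the visitor bucketing for you.
    variations = {
        "A": {"headline": "Original headline", "impressions": 0, "conversions": 0},
        "B": {"headline": "New headline", "impressions": 0, "conversions": 0},
    }

    def assign_variation():
        """Randomly bucket a new visitor into one of the two variations."""
        key = random.choice(["A", "B"])
        variations[key]["impressions"] += 1
        return key

    def record_conversion(key):
        """Call this when a visitor from that variation takes the desired action."""
        variations[key]["conversions"] += 1

    # A visitor arrives, sees a variation, and later converts.
    bucket = assign_variation()
    record_conversion(bucket)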

Now that we've explained it, let's be honest.

You don't split-test, do you? Maybe you did something once (a small, unsatisfying, and inconclusive experiment), but you're not testing on a regular basis … right?

Most people don't want to admit this, because they feel like they're the only ones not doing it. Everybody knows that split-testing is absolutely critical to effective marketing online — so who wants to admit that they're the only ones who aren't doing it?

Well, relax.

It turns out that "most people" can't be the "only one" — funny thing, right?

Hardly anyone is really doing it…

Everybody's talking about it, but that doesn't mean everybody's actually doing it.

The truth is that many of the exploits that you hear about are fueled by a vivid imagination, rather than experience; only a very small proportion of the talkers are actually doing the things that they describe.

And that's okay … maybe you aren't ready.

To do split testing right, you don't just need to test different variations of a page; you also need to measure the results, and the differences between the results each variation generates.

This is challenging, and often impossible for websites that are just starting out and don't have much traffic.

Let's explain why with a short example:

Variation 1: The original page received 974 visits, and 5 people converted
Variation 2: The modified version of the page received 961 visits, and 7 people converted

You'd think that Variation 2 is the clear winner, right?

Wrong.

Crunching the numbers, we find that there is only a 45.27% chance that over time, Variation 2 will continue to outperform Variation 1.

In other words, there's a 54.73% chance that the difference between their success rates was the result of random chance.

Okay … where did I get these numbers?

Split testing is all about finding results that you can be confident in based on statistical significance. This isn't a touchy-feely kind of confidence — it’s calculated mathematically, and you want it to be at least 90%, and ideally 95% or more to choose a winner.

You don't have to worry about calculating the numbers yourself. There are free tools out there that will calculate the statistical significance of your results (you just plug in the number of impressions and actions for each variation, and the rest is done for you), and split-testing tools like Google Website Optimizer will do the calculation for you as well (Google Website Optimizer also plugs right into Premise).
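If you're curious what those calculators are doing under the hood, here is a minimal sketch of one common approach, a two-proportion z-test, applied to the example numbers above. It assumes SciPy is available, and it is only an illustration: different calculators use different statistical methods and will report different figures than the ones quoted above, but the conclusion is the same, because the result falls far short of the 95% bar.

    from math import sqrt
    from scipy.stats import norm  # assumes SciPy is installed

    def confidence_b_beats_a(visits_a, conv_a, visits_b, conv_b):
        """One-sided confidence that variation B's true conversion rate
        exceeds variation A's, via a two-proportion z-test."""
        rate_a, rate_b = conv_a / visits_a, conv_b / visits_b
        pooled = (conv_a + conv_b) / (visits_a + visits_b)
        se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
        return norm.cdf((rate_b - rate_a) / se)

    # The example above: 974 visits / 5 conversions vs. 961 visits / 7 conversions.
    print(confidence_b_beats_a(974, 5, 961, 7))  # about 0.73, well below 0.95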

If you don't want to calculate the actual significance of your test, here's a rule of thumb that you can use (borrowed from Tim Ash's book Landing Page Optimization):

  • If there are 100 impressions in your sample, you need to see a 20% difference between variations to be sure that they actually mean something.
  • If there are 1,000 impressions, you need a 6.3% difference.
  • If there are 10,000 impressions, you need a 2% difference.
  • If there are 100,000 impressions, you need a 0.63% difference.

Do you notice the trend here?

To detect small differences in improvements (which are what most split-tests are likely to reveal), you need a pretty large sample size.
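That trend comes straight from sampling error: the uncertainty in a measured conversion rate shrinks with the square root of the sample size, so every tenfold increase in traffic lets you reliably detect a difference roughly 3.2 times smaller. Here is a rough sketch of the relationship; the baseline conversion rate and confidence level are assumptions, so the exact figures won't match the rule of thumb above, but the square-root pattern will.

    from math import sqrt

    def rough_min_detectable_difference(impressions, base_rate=0.05, z=1.645):
        """Rough minimum absolute difference in conversion rate that a test with
        this many impressions per variation can reliably detect (normal
        approximation; base_rate and z are illustrative assumptions)."""
        standard_error = sqrt(2 * base_rate * (1 - base_rate) / impressions)
        return z * standard_error

    for n in (100, 1000, 10000, 100000):
        print(n, round(rough_min_detectable_difference(n), 4))
    # Each tenfold jump in traffic shrinks the detectable difference
    # by roughly sqrt(10), or about 3.2 times.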

The moral of the story is that if you don't have much traffic, then maybe you need a solid growth strategy instead of better split-testing.

But what if you do have the traffic?

After all, most sites and blogs have at least a bit of traffic, which is enough to test the more important things, like headlines and opt-in placement.

Most aren't doing it very well …

Like sex in high school, split-testing is something at which even those who are doing it don't have much experience, and their actions are often controlled by impulses and urges, rather than skilled intent.

Let's take a quiz, and see if you're making any of the mistakes of most would-be split-testers:

Do you test one thing at a time? Most wannabe split-testers don't; they change half-a-dozen things at a time, based on the latest and greatest ideas to have entered their minds. The trouble with this is that when things work (or don't work), you don't know which changes are responsible. To effectively split test, you need to isolate variables, which means testing one thing at a time!

(Okay, yes, it is technically possible to test multiple things at once — it's called multivariate testing. In practice, though, doing it requires huge traffic numbers, and a much more complex setup — if you're not already doing it, then it's probably not for you.)

Are you measuring results? I mean actually measuring, with numbers? This is also a rarity — more often, it's an anecdotal "I feel like we're getting more sign-ups" kind of 'measurement'. Be careful with this, because as human beings we all suffer from confirmation bias, which means that we're much more likely to favor evidence that supports what we want to believe. Measuring with actual numbers is critical to effective split testing!

Do you let your experiments run until you've reached a 95% confidence level? This is where the greatest number of mistakes are made; an experiment is set up and allowed to run until the experimenter feels that "this one is working better". That usually happens before the numbers actually prove what you're trying to prove, which means that the results are really inconclusive and can't be trusted. And what's the point of doing experiment after experiment if none of the results can be trusted? You absolutely have to let experiments run until you reach statistical confidence in the results!

Are you tracking your experiments? Rather than flitting from experiment to experiment, keep a journal that documents each experiment and the lessons that you learned from it (a minimal sketch of such a log appears after this quiz). This will prevent you from running repeated experiments that test more or less the same thing without ever learning your lessons. Set up your experiments as hypothesis tests — each experiment is meant to test a guess about something that you think will influence your audience!

Do you focus your experiments on your conversion goals? There's no point in experimenting just for the sake of experimenting, and yet it's more common than you might believe. There's no point testing something unless you think it will contribute to the conversion goals that you have for your site. So rather than setting up test after test, consider first what your objectives are, and what you might be able to test that will contribute to reaching that objective!

You've probably answered "no" to at least some of these questions, but that's fine — the important thing is to learn and adjust your practices, so that the experiments that you run tomorrow will be more effective and fruitful than the experiments that you ran yesterday.
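One lightweight way to act on the "track your experiments" point above is to keep a structured log instead of scattered notes. Here is a minimal sketch of what such a journal entry might look like; the fields and the sample numbers are invented for illustration, so adapt them to your own workflow.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class SplitTestRecord:
        """One entry in a split-testing journal, framed as a hypothesis test."""
        started: date
        element: str        # what was changed, e.g. "headline"
        hypothesis: str     # the guess about your audience that you're testing
        control: str
        variation: str
        impressions: tuple  # (control, variation)
        conversions: tuple  # (control, variation)
        confidence: float   # statistical confidence in the winner, 0 to 1
        lesson: str = ""    # what you learned, win or lose

    # Sample entry; the numbers are invented for illustration.
    journal = [
        SplitTestRecord(
            started=date(2011, 7, 1),
            element="opt-in button text",
            hypothesis="A benefit-focused call to action will lift sign-ups",
            control="Subscribe",
            variation="Get the free updates",
            impressions=(5200, 5150),
            conversions=(104, 131),
            confidence=0.96,
            lesson="Benefit-focused wording won; try the same angle on the sales page.",
        ),
    ]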

Now that you've got the processes worked out, let's talk about some of the things that you might want to experiment with.

Do you feel like experimenting?

Experimentation can be great, but if you're a professional blogger or business owner, you're not just in it for the fun — you need to focus on the experimentation that will be most gratifying to your bottom line.

Here are some of the most important things that you should be sure to split-test:

The headline. This is the single most important thing that you can split-test, because the headline is the first "gateway" that your readers have to pass through. You will lose more people at the headline than anywhere else on the page, so test the headline first.

Opt-in placement, text, and colors. Try different placements of the opt-in box on your site, different calls to action, and different box and button colors. Since you probably get more sign-ups than sales, this is a much better place to start your testing.

The order button text and colors. Experiment with changing the text of the order button (options include "Get It Now", "I Want Access", "Buy Now", "Add to Cart", "Proceed to Checkout", and more), and with the color of the buttons (yellow, red, blue and green are good places to start). This applies to your email opt-in box as well.

The format of the offer. This is a little more work to test, but if you have the option to do it, you might find that a lot more customers are willing to buy one format than another. Experiment with your offer as an ebook, report, video series, podcast training program, infographic and so forth.

The price. This isn't always possible to test, but if it is, you might find that you're leaving a lot of money on the table; it's possible that increasing the price will not affect sales, and it's even possible that increasing the price will increase sales as well!

The style of the introduction. After the headline, the first thing that your audience will read is the opening paragraph. Experiment with different styles — try making bold statements, vs. telling a story about their problem, vs. describing the ideal outcome. See what works best for your audience.

The product imagery. Try different versions of your product picture — you'd be surprised how much of an effect this sort of thing can have.

Trust seal choice and placement. Different audiences will respond to different trust seals, and will want to see them in different places. Good places to test are near the description of your guarantee, and of course near your order button.

Email subject line. This is just as important as the headline of your sales page, particularly if you're using confirmed opt-in, in which abandon rates of 20-30% are common. Split-test the email subject line of your email confirmation messages to make sure that as many subscribers as possible actually get on your list.

There are lots of other things that you could test — for more ideas than you'll ever be able to test, check out the Landing Page tutorials here on Copyblogger.

Getting started with split testing…

If this is the first time you're hearing about split-testing, then your head is probably spinning right now.

That's okay — it's a lot of information to take in.

Even if you've been thinking about split testing for a while (and have even tried a few experiments), you might be wondering about one thing: how to actually get the experiments going.

That's where Premise comes in — it's a drop-dead simple and complete landing page package that plugs right into WordPress and that you can use to:

  1. Generate all kinds of landing pages, including templates for Sales Pages, Content (SEO) Pages, Pricing Table Pages, Email Opt-In Forms, Video Pages, Tab Scroller Pages, and Thank You Pages.
  2. Add all kinds of standard elements into your landing pages with the click of a mouse.
  3. Run split tests to make sure that you're incrementally advancing towards your conversion goals!

So enough fence-sitting … if you want to get serious about split-testing, go get Premise and get started!

Okay, over to you …

Have you experimented with split-testing? What has your experience been? Where did you get stuck?

Do you have a Premise success story to share?

About the Author: Danny Iny is an author, strategist, serial entrepreneur, and proud co-founder of Firepole Marketing, the definitive marketing training program for small businesses, entrepreneurs, and non-marketers. Visit his site today to download a free split test checker, or follow him on Twitter @DannyIny.
