Category: Rant


Retweeting on twitter

I love twitter. I wasn’t always a fan, but I caved. It’s a great source (for me specifically) for keeping up with all the goodness that involves programming and JavaScript nonsense. In my short-lived life on twitter, I’ve noticed that there are several ways people retweet a tweet (or as I call them, twats). I’ve categorized them in a neat list so you don’t have to.

Old School
Keep it real, yo. Just use the “retweet” button that the twitter interface already offers. It’s simple and to the point. It doesn’t require any extra typing or thinking. Just hit that magic button and it’s done. This simply puts the original person’s tweet on your followers’ timelines. It’s indicated as a retweet by a retweet icon next to the source’s name, followed by the name of the person you’re following (the retweeter).

RT
This is for someone who wants to retweet something and provide their own commentary. For example, if @chickenFace (not real.. yet) tweets:

“I just had the best #chicken in life. I almost shat my pants it was so delicious.”

Someone following @chickenFace would retweet their twat by prepending “RT” and perhaps appending their own comment, like so:

RT “I just had the best #chicken in life. I almost shat my pants it was so delicious.” -@chickenFace I just shat my pants too! #coincidence

This form of retweeting is good for commentary by the retweeter and provides a small level of interaction within a single tweet.

Via
This is similar to RT with the exception that the retweeter gives a shout out to the person who tweeted it. For example, @chickenFace writes:

“If I had another asshole, I would eat more than I already do. Here is my double-ass demo: http://bit.ly/iTNieP”

A follower of @chickenFace would then retweet like so:

“Ass http://bit.ly/iTNieP via @chickenFace”

I like this one. It reminds me of when I had to write bibliographies in my papers for school. You can instantly give credit to the original source by simply saying who wrote it.

Link whore
Then finally you have the link whore. It’s perfectly OK to keep track of links clicked. It’s good for business and egos. The only problem is that when it comes to twitter, there are more link whores than crack whores in Times Square in the ’80s.

These are the people who see a link on their timeline and say “Hey, I’m a douche bag who loves clicks! I’ll take this awesome link that I didn’t find myself, then create another link to redirect to that same link. I’m a genius!”. This enables the retwatter to take full credit for a link they didn’t find. It’s like twitter plagiarism. I call it twittergism. (you heard it here first, folks).

Here’s an example from @chickenFace:

“So apparently the chicken came first, it’s true. Check it: http://on.msnbc.com/klAXXZ”

Someone following @chickenFace would then be a link whore and do this:

“ZOMG! The chicken came first: http://5z8.info/enriched-uranium-supply_t1h5ax_startdownload”

The shitbag above just wants you to click on their link so they can track it and feel great about themselves.

Conclusion
Twitter is fun, don’t be a whore. If you’re gonna retweet, give credit where credit is due.

Nicholas Zakas recently posted his slides from this past weekend’s jQuery conference. Among the great information provided, one slide (#74) really made an impact on me. It basically sums up the fact that browsers, like television sets, are consumption devices. Much like TV, the producers (web developers) of shows (websites) create a product to be served to all kinds of devices.

Here’s the difference: television producers don’t give a damn what television you are using when watching their program.

Whether you have an old-school TV that weighs 300lbs or an HD TV that weighs 50lbs, you’re still watching the same show. Sure, the quality is different, but it’s broadcast the same exact way (albeit HD is higher quality). If only it were that easy for web developers…

From IE6 to Firefox 4 and everything in between, we have to make sure web pages look semi-decent for our users. But dowebsitesneedtolookexactlythesameineverybrowser.com.

No. Hell to the NO. A resounding no effing way.

Yet we still try our best to accommodate such nonsense. From conditional styling via clever commenting to JavaScript browser sniffing, developers do whatever they can to make web content look, and in some cases perform, consistently.
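
For anyone who hasn’t had the pleasure, here’s a minimal sketch of what those two hacks tend to look like. The stylesheet name and class name are made up, and the conditional comment is shown in a code comment since it lives in your HTML, not your JavaScript.

  // A rough sketch of the two hacks mentioned above (names are made up).
  //
  // 1) Conditional styling via clever commenting: an IE-only conditional
  //    comment in the HTML, something like:
  //    <!--[if lt IE 7]><link rel="stylesheet" href="ie6-fixes.css"><![endif]-->
  //
  // 2) JavaScript browser sniffing: branch on the user agent string.
  var ua = navigator.userAgent;

  if (/MSIE 6/.test(ua)) {
    // Pretend IE6 needs its own patched-up stylesheet.
    document.write('<link rel="stylesheet" href="ie6-fixes.css">');
  } else if (/Opera/.test(ua)) {
    // Give some other browser a class hook for CSS tweaks.
    document.documentElement.className += ' opera';
  }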

“Don’t worry, HTML5 will fix that”.

No. It won’t.

Why? Do I really have to go there? OK, fine..

  • It’s not fully supported on all browsers and won’t be until after the apocalypse.

There. That’s why.

And even when the spec is actually finished, all vendors still need to get it together and comply with the new rules.

Until then, what do we do? We sit at our desks and punch keys all day to accommodate the dinosaurs that are still using IE6. And it’s not only IE that’s a pain in the ass; every browser has its quirks.

Is there a solution to this never ending quest of consistency? Hmmm…

  1. One world browser: That’s right, have only 1 web browser that EVERYONE MUST USE.
    No, that won’t work. It didn’t (or did it..) work for the New World Order, and it won’t work for browsers.
  2. EVERYONE uses the same operating system.
    Nope. Microsoft tried that, look how well it worked out.
  3. Turn off the Internet.
    Yes we can!

With the wide variety of devices that can render an HTML page, it can be overwhelming to think of all the ways your website can end up looking crappy. I’m convinced that no matter how hard you try, there is a browser out there that will make your webpage look like it was designed by Helen Keller. Hell, just ask your Quality Assurance (QA) team.

Which brings me to my point: different browser, different experience. Get over it. Your QA team needs something to do, and what better avenue than bringing up minor discrepancies between browsers?

QA: “This element should have 1px more margin on the left, but only in IE6/7. Other browsers need 2 more pixels. And don’t get me started on how it renders on the iPhone… and Android… and Nook… and Kindle… and <your browser here>”

WTF. Really? How about I take a pen and jam it directly into both of my eyes.
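
For what it’s worth, here’s roughly what that kind of request turns into. Just an illustrative sketch: the element id and the pixel values are made up, and the sniffing is the same ugly trick from before.

  // Illustrative only: the per-browser pixel nudge QA is asking for.
  // The element id "promoBox" is a made-up example.
  var box = document.getElementById('promoBox');
  var nudge = /MSIE [67]/.test(navigator.userAgent) ? 1 : 2;

  // Shove the element over by a browser-specific number of pixels.
  box.style.marginLeft = (parseInt(box.style.marginLeft, 10) || 0) + nudge + 'px';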

I demand that we (developers) get access to what percentage of users are on which browser. I, like many web developers, have spent countless hours debugging outdated browsers for the most trivial bugs imaginable.

If less than 5% of your users are using a browser that you spend more than 10% of your time debugging, IT’S NOT WORTH IT.
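
If you want to turn that rule of thumb into something you can wave at a project manager, here’s a minimal sketch. The function name and the sample numbers are made up; the 5% and 10% thresholds come straight from the sentence above.

  // A back-of-the-envelope check for the rule above (made-up name and numbers).
  // usageShare: fraction of your users on the browser (e.g. 0.04 for 4%)
  // debugTimeShare: fraction of your dev time spent debugging it (e.g. 0.12 for 12%)
  function worthSupporting(usageShare, debugTimeShare) {
    return !(usageShare < 0.05 && debugTimeShare > 0.10);
  }

  worthSupporting(0.04, 0.12); // false -- under 5% of users, over 10% of your time
  worthSupporting(0.20, 0.05); // true  -- plenty of users, barely any debugging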

But hey, if they want to keep paying developers to waste time on such nonsense, show me the money.