SEOs do not necessarily have a technical background, so it is important for the marketer to have at least some basic HTML and programming knowledge. This will allow you to do your job without having to get a programmer involved for simple SEO implementations.
For starters, SEOs should feel comfortable using an FTP program like FileZilla to pull down or copy website files. To a designer or programmer this may seem trivial, but to a marketer who has never had to connect to an FTP server, it is a foreign experience.
Adding Meta Tags
For websites that use a CMS, updating title and meta tags in the backend should be a relatively easy process. Usually there is a section in the CMS for this, and it does not require any HTML coding. However, for websites where you have to edit the tags in the backend code, you will need to feel comfortable looking at the code and making very minor changes. After you make the edits, save the file and push it from your local copy to the live site.
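For anyone new to editing the code directly, the tags in question live in the page's head section. A minimal sketch, using a made-up page and description:

```html
<head>
  <!-- The title tag shows in search results and the browser tab -->
  <title>Blue Widgets | Example Company</title>
  <!-- The meta description often becomes the snippet in search results -->
  <meta name="description" content="Shop our hand-built blue widgets, shipped free in the US.">
</head>
```

These are the only lines you would change for a basic title/description update; everything else in the file can be left alone.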
Adding Alt Attributes
Adding alt attributes to images is pretty straightforward, but it should not be overlooked. Not only should you be able to add alt attributes, but using keyword-rich file naming conventions and ensuring the image is served from your own server will also help with load time.
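Putting those pieces together, an image tag with a descriptive filename and alt attribute might look like this (filename and alt text are hypothetical):

```html
<!-- Descriptive filename hosted on your own server, plus alt text
     that describes the image for search engines and screen readers -->
<img src="/images/blue-widget-pro.jpg" alt="Blue Widget Pro on a workbench">
```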
Adding NoFollow & NoIndex Attributes
Controlling what is indexed by the search engines is something that all SEOs should be able to do. Of course you can do it from the robots.txt file, but you should also be able to write a noindex or nofollow tag depending on the type of page it is.
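For a page you want kept out of the index entirely, the tag is a single line in the head. A minimal sketch:

```html
<head>
  <!-- Tells crawlers not to index this page and not to follow its links;
       use "noindex, follow" if you still want the links crawled -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

This is handy for thank-you pages, internal search results, or duplicate printer-friendly versions of a page.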
Adding an .htaccess File
Uploading an .htaccess file is relatively easy, but knowing how to program a redirect for SEO purposes is another story. To fix the www vs. non-www canonical issue on a PHP site, you will need to control it in an .htaccess file. I think the best way for an SEO to handle this is to have a programmer create a template file that you can simply change depending on the site you are adding it to, so all the SEO has to do is change the domain. Be careful that the site doesn't already have an .htaccess file on it, or you can break the entire site (I learned the hard way).
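A template like the one described might look like this, assuming an Apache server with mod_rewrite enabled (example.com is a placeholder — swapping in the real domain is the only change the SEO makes):

```apache
# Redirect non-www to www with a 301 so search engines
# consolidate the two versions onto one canonical domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 (permanent) status is what matters for SEO: it tells the search engines to pass credit from the non-www URLs to the www versions.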
Adding a Robots.txt File
A robots.txt file will allow you to control what pages and directories get crawled. An SEO should be able to identify which directories should be indexed by the search engines and which should be blocked. Creating a robots.txt file is pretty easy and should be done by an SEO.
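A basic robots.txt sits at the root of the site and might look like this (the directory names and sitemap URL here are hypothetical examples):

```
# Applies to all crawlers
User-agent: *
# Keep admin and script directories out of the crawl
Disallow: /admin/
Disallow: /cgi-bin/
# Point crawlers at the XML sitemap
Sitemap: http://www.example.com/sitemap.xml
```

Note that Disallow blocks crawling, not necessarily indexing — for pages that must stay out of the index, pair this with the noindex tag covered above.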
I am interested to hear from everyone else about what other technical tasks you feel SEOs should be able to accomplish. Please let me know what you think.
For on page SEO, I'd say that you mostly have it. The only thing I'd add is the ability to link posts and pages together to form interlinking relationships on your own site. Nice to see that I'm not the only one that includes .htaccess and robots.txt as an important piece of the puzzle.
Thanks Mark for the post. I've been implementing it on my wife's site gogo&co http://www.gogoandco.com
Great wrap up of the basics. I'm not a coder or a programmer, but I've gotten comfortable with the above tasks. Thanks so much for compiling this!