PDA

View Full Version : New website - testing please



torstan
04-05-2009, 04:07 PM
Hi. So I've finally built myself a website that can be found here:
http://www.fantasticmaps.com/

I'd be very grateful for any testing people could do on different web browsers (I have Firefox and IE but none of Safari, Netscape, Chrome etc.). Any thoughts, broken links, spelling errors or other screw-ups gratefully appreciated.

Also, I need a logo that goes with the name. So far I am drawing a blank!

Gandwarf
04-05-2009, 04:23 PM
I am missing your picture in the "About" section :)

Gamerprinter
04-05-2009, 04:26 PM
Because my IE exploded at work, I've been using Sea Monkey, which is a derivative of Netscape - it works fine on that browser. I checked all the links, read the pages for spelling errors and didn't notice any.

Nice job, simple site layout - black background, white text is pretty standard for artistic sites, though not my favorite. I prefer white background, black text, so things can be read easier, but then that's more generic. I can understand why you wouldn't want that.

GP

PS: logo-wise, I could see using some of your more elegant font label styles, with a close-up of one of your maps as a background - that would work, unless you're looking for something more unique.

torstan
04-05-2009, 06:37 PM
Thanks for the check and the thoughts. I'm going to doodle with a few ideas this afternoon.

As for the black background I wanted a background that would show off the pictures, and white's a bit harsh for that.

Greason Wolfe
04-05-2009, 06:41 PM
From an aesthetics point of view, I think it looks nice. Not too cluttered and easy to navigate. The coloring works well given the nature of the site and there's no javascript/java/flash involved. A definite plus in my book. The roll-overs are subtle and everything seems to resize properly for those of us with vision problems. Nicely done.

On the technical side of things, there are a few mark-up errors as well as a few parsing errors in your CSS.

http://validator.w3.org/check?verbose=1&uri=http%3A%2F%2Fwww.fantasticmaps.com%2Findex.html

http://jigsaw.w3.org/css-validator/validator?profile=css21&warning=0&uri=http%3A%2F%2Fwww.fantasticmaps.com%2Findex.html

But nothing that I can tell is going to cause any major problems in most modern browsers, though I, like you, tend to stick with Firefox and IE.

All in all, it's good to see. Once I settle a few issues here at home, I'm gonna start rebuilding my web-sites. I mean, it isn't like I don't have enough on my table as it is, right?

GW

Turgenev
04-05-2009, 06:50 PM
I like the site, torstan! Nice and simple and to the point. I would give you some rep for that alone but I need to spread it around a bit more before I can. ;)

A few technical (mostly SEO) hints I would recommend are:

* include an 'alt' in your image tags. For example,

<img src="Gallery/Recent/DreestonThumb130.jpg" alt="Dreeston Image Link" />
The alt attribute will help people who have images turned off (for one reason or another), or if for some reason the image doesn't load properly (the alt text gets shown instead). Course an SEO (Search Engine Optimization) trick is to put one of your keywords in the alt text since search engines pick up on this. ;)

* include a 'title' in the link tags. For example,

<a href="Gallery/Regional/dreeston.html" title="Dreeston Gallery">
In some browsers, the title text will pop up when you mouse over the link. It is also a good SEO trick since search engines pick up on the title attributes.

* another SEO tip is meta-tags. There is some discussion in the SEO community about whether meta-tags are still relevant, but I personally think they don't hurt and may end up helping in the long run. Meta tags I would recommend are:


<meta name="author" content="Jonathan Roberts" />
<meta name="robots" content="all" />
<meta name="revisit-after" content="7 days" />
<meta name="title" content="Fantastic Maps: The Cartography of Jonathan Roberts" />
<meta name="description" content="The Cartography of Jonathan Roberts." />
<meta name="keywords" content="fantasy cartography, role playing game maps, rpg maps, D&D maps" />
The author tag is self-evident. The robots tag and the revisit-after tag are instructions for search engine robots to follow when visiting your site (for cataloging purposes). You should think about having a robots.txt file on your site as well, since search engines will be looking for it. There is a lot more info about the robots meta-tags and text files here (http://www.robotstxt.org/).
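For illustration, a basic robots.txt is just a plain text file in the site root. The folder name below is a made-up example, not from torstan's actual site:

# Allow all robots to crawl everything except a (hypothetical) private folder
User-agent: *
Disallow: /private/

# Optional: point crawlers at your sitemap file
Sitemap: http://www.fantasticmaps.com/sitemap.xml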

The title and description tags are important because this info is what search engines often pick up on when they display your site in search listings. An SEO trick is to put some of your keywords into your title (which you did by adding cartography to your title). Speaking of keywords, they are important as well. These are the phrases people use when searching, so you want to have the right terms listed. Keywords are losing their strength in search engine results, but they aren't obsolete yet, so I recommend using them. Getting the right keywords is an art. Specific keywords (i.e. rpg maps, fantasy maps, d&d maps, etc.) are more likely to bring in people who are looking for those specific terms than general keywords (i.e. maps, cartography, rpg, etc.). I recommend having a balance of both and no more than 25 such keywords. I could write pages about keywords and their use.

Finally, I also recommend having an HTML-based sitemap that collects all your links in one location. This can be handy for the user, and search engines love it. Speaking of search engines, I also recommend having a sitemap.xml file on your site as well. Like the robots.txt file, search engines pick up on this file and it helps with your listing. More info about sitemap.xml files can be found here (http://www.sitemaps.org/protocol.php).
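To give a concrete picture, a bare-bones sitemap.xml looks something like this (the URL, date and frequency below are placeholder examples):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>http://www.fantasticmaps.com/</loc>
<lastmod>2009-04-05</lastmod>
<changefreq>weekly</changefreq>
<priority>1.0</priority>
</url>
</urlset>

You add one url block per page you want indexed, and update the lastmod date whenever a page changes.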

The reason I mention all of this stuff is because building a site isn't enough these days. The trick now is to drive traffic to it. The above tips will give your site an edge over similar sites that aren't using them. Course things like links to your site are where you build your search engine love, but that is a whole different tutorial. ;)

Best of luck on your site and commissions!

Turgenev
04-05-2009, 06:54 PM
On the technical side of things, there are a few mark-up errors as well as a few parsing errors in your CSS.

http://validator.w3.org/check?verbose=1&uri=http%3A%2F%2Fwww.fantasticmaps.com%2Findex.html

http://jigsaw.w3.org/css-validator/validator?profile=css21&warning=0&uri=http%3A%2F%2Fwww.fantasticmaps.com%2Findex.html
These errors are minor mistakes that can be put to right easy enough. If you need any help, drop me a line torstan. This is one of my specialties.

Greason Wolfe
04-05-2009, 07:02 PM
These errors are minor mistakes that can be put to right easy enough. If you need any help, drop me a line torstan. This is one of my specialties.

Heh. I figured they were probably just typing errors. I'm notorious for doing things like that in the process of writing mark-up and css, and then pulling my hair out trying to find them. Certainly not my specialty, though, so I'm glad there's someone here who can help our man Torstan out with these things better than I'd be able to.

GW

torstan
04-05-2009, 07:09 PM
Thanks a lot guys, that's a full website education right there! I'll work through those points to make this more solid. My challenge right now is to get above the current freelance mapper that comes up in Google when I search for fantastic maps :) (I shall not link him here as that will only increase the task ahead...)

That's really useful. Rep to you both, and I'll make sure I spread some around so I can rep you again Turgenev when you put your castle in the finally finished section.

Gamerprinter
04-05-2009, 07:35 PM
Everything Turgenev mentioned exists on the Gamer-Printshop.com site as well. I didn't mention it, as it's been so long since I applied those things to the site. I did all that at site creation and pretty much haven't touched the site since (notice those broken links...)

Strange, but once I placed the keyword-optimized pages up, the robots.txt and the XML file, within 24 hours gamer-printshop.com was at the top of the search engines, and it has stayed there ever since, with no post-build optimization applied at all.

So it's probably wise to do all that TG says - it worked for my site!

GP

Turgenev
04-05-2009, 07:44 PM
Heh. I figured they were probably just typing errors. I'm notorious for doing things like that in the process of writing mark-up and css and then pull my hair out trying to find them. Certainly not my specialty, though, so I'm glad there's someone here that can help our man Torstan out with these things better than I'd be able to.

The other thing to take into account with validation is that errors often have a cascading effect. It might say there are 11 errors, but when you go into the code, fix a couple of mistakes at the top, and then re-validate, it can pass.

I've made typos in my code in the past and scratched my head trying to figure out why a site wasn't validating, only to find that I had transposed a couple of letters somewhere in the code (for example, typing ahref instead of a href). The http://validator.w3.org/ site is an invaluable resource.

The top two reasons why a validated site is a good idea are:
* It means the code is correct, which gives better cross-browser compatibility (the site will work on more than one type of browser). Sites built for just one type of browser can end up ignoring/alienating future viewers/clients.
* A site with validated code will have an edge over a site with errors in the search engine game. New sites can take a little while to get listed in Google (the usual estimates are anywhere between 3 - 8 months depending on how competitive the topic is). So a well-written site will get listed quicker, and often higher, than a non-validated site. Course there are many variables involved in this process, so being weak in one area may balance out with a stronger showing in another.
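One quick way to clear a batch of validator errors is to start from a skeleton that already validates. As a sketch, a minimal XHTML 1.0 Strict page (the title and content below are just placeholders) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<meta http-equiv="Content-type" content="text/html;charset=UTF-8" />
<title>Page title here</title>
</head>
<body>
<p>Content here.</p>
</body>
</html>

With the doctype and charset declared up front, the validator knows which rules to check against, and a lot of phantom errors disappear.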

Turgenev
04-05-2009, 08:02 PM
Everything Turgenev mentioned exists on the Gamer-Printshop.com site as well. I didn't mention it, as it's been so long since I applied those things to the site. I did all that at site creation and pretty much haven't touched the site since (notice those broken links...)

Strange, but once I placed the keyword-optimized pages up, the robots.txt and the XML file, within 24 hours gamer-printshop.com was at the top of the search engines, and it has stayed there ever since, with no post-build optimization applied at all.

So it's probably wise to do all that TG says - it worked for my site!


Glad to hear it worked for you. The fact that it worked in 24 hours is amazing - those are outstanding results! I wish some of my sites had that turnaround. The thing about the search engine game is that it is best played as a long-term one. The longer your site is online, the better your results should become. Time (as in how long your site has been active) is one of the factors that search engines take into account.

I recommend getting webstats for any site so one can see who is visiting, where they are coming from, what browsers they are using, what links referred them to one's site, and so on. This info is interesting to see, but it also lets you see who is talking about your site (through the referral links). The really good ones will show you what keywords people used to get to your site. You can then use this info to fine-tune your own keywords and your internet marketing strategy.

One of the biggest mistakes people make with their websites is that they put them up and then never add any new content - basically the sites remain static. This can hurt you in search engine results because the search engine might think your site is no longer active (aka alive) and might start to de-list your site in the rankings. This won't be a problem for torstan's or GP's sites since they will be adding new content as they produce/accept it (as the case may be). But I've seen too many business sites that do this.

With that in mind, I recommend always keeping your sitemap.xml file up to date with the latest changes (update the date on the pages you've changed) because search engines will pick up on this. Google and Yahoo let you make an account and upload the sitemap.xml file directly to them, which helps your site get indexed quicker and kept up to date.

The other nice thing about the robots.txt file (combined with info from webstats) is that if you're getting hit by robots from dubious sites, you can ban those robots from accessing your site.

I've only touched the tip of the iceberg with SEO. I'm available for consultations (when I'm not mapping or writing xHTML). That's enough SEO stuff from me. I've got some mapping to do. :P

torstan
04-05-2009, 08:43 PM
Well that will certainly keep me going for now :) I have some reading up to do about css and html errors.

Turgenev
04-05-2009, 09:57 PM
Well that will certainly keep me going for now :) I have some reading up to do about css and html errors.
To give you a heads-up... try the following code after your title tag and then see how things validate.

<meta http-equiv="Content-type" content="text/html;charset=UTF-8" />

JoeyD473
04-06-2009, 12:25 AM
I am using Opera and it appears to work correctly.

Greason Wolfe
04-06-2009, 12:56 AM
The other thing to take into account with validation is that errors often have a cascading effect. It might say there are 11 errors, but when you go into the code, fix a couple of mistakes at the top, and then re-validate, it can pass.

I've made typos in my code in the past and scratched my head trying to figure out why a site wasn't validating, only to find that I had transposed a couple of letters somewhere in the code (for example, typing ahref instead of a href). The http://validator.w3.org/ site is an invaluable resource.



No argument from me on that count. More often than not, that is exactly the case with my code as well. One misidentified ID, one set of transposed letters, an "=" instead of a ":", and you end up with a dozen errors, but that one little fix is all it takes. The validator at w3c is my friend and savior, oh yes it is. :twisted:

Still, since it is your area of expertise and my area of hobby, I'll defer to you when it comes to helping Torstan tweak his site. Better to get it right from someone more in the know than from someone who might just be guessing.

GW

altasilvapuer
04-09-2009, 12:21 AM
Hi. So I've finally built myself a website that can be found here:
http://www.fantasticmaps.com/

Also, I need a logo that goes with the name. So far I am drawing a blank!

For me, I would look through all of my maps and try to identify something that's distinctly "you" in them - your signature, so to speak. Build the logo from that. For instance, say your signature is your use of subtle and slightly understated labeling that blends with the picture well while still being defined (something I actually see quite well in your larger maps, especially Dreeston and St. George's island). Or maybe, if you want to highlight more of your battle-style maps, you could use a small piece of one of them.

Then I'd take said small piece and likely lay it in the titlebar opposite the page title.

At least, that's what I see in my head. I usually build things like this based on what's in my head, and then start seeing the things that actually work and end up changing everything. ;)

-asp

Steel General
04-09-2009, 09:04 AM
CSS - A wonderful tool that just makes my teeth ache for some reason.

ravells
04-09-2009, 09:19 AM
<sniff> no link to the Cartographers' Guild? <cries>

Seriously, very slick looking website! All the links I clicked worked fine. Hey, I didn't know that you did Traveller stuff!

torstan
04-09-2009, 09:38 AM
Thanks for the comments everyone. SG - why does css make your teeth ache?

No links to anyone yet - though I certainly need to fix that. Thanks for the nudge!

It seems to be getting attention already - two people have got in touch through the site and it only went up on Sunday.

As for Traveller, yes I did the area maps for Prison Planet and a couple of deckplans for Warships of Babylon 5 - along with Turgenev. I should really put some of them up on the site.

@asp: Thanks for the suggestion - that's a good idea. I've been wondering about the different possible logos, and whether they confine you to a particular genre, but I think there's actually enough fantasy map work to go around, so I might just go with something unashamedly fantasy-esque. No great rush right now.

Steel General
04-09-2009, 01:08 PM
I don't know, probably just me. As I said above, it's a great tool and makes things immensely easier to maintain/change. But for some reason any time I have to create/edit a CSS file it just irritates me, like fingernails on a chalkboard.

I guess I'm just weird that way...what do you expect from someone who turns a warforged into the easter bunny! :P

torstan
04-09-2009, 01:57 PM
Well that does take a steampunk fetish to places best left untouched...

I agree about the css though. It's a great tool, but it is less than intuitive. Perhaps that's just because my editors don't have syntax highlighting for css. I should sort that out.

StillCypher
04-10-2009, 01:50 PM
Very attractive site for your very attractive work! ::thumbs up::

Nomadic
04-10-2009, 03:27 PM
The other nice thing about the robot.txt file (combined with info from webstats) is if you're getting hit by robots from dubious sites, you can ban those robots from accessing your site.


Bad bots tend to ignore the robots.txt file. The best way to block bad bots is to IP ban them. If however it is a bunch of copies on different IPs you might have to contact your host about setting a limit on how many pages can be accessed by one IP in a given time period (bad bots like to spam page requests really fast).

Turgenev
04-10-2009, 04:20 PM
Bad bots tend to ignore the robots.txt file. The best way to block bad bots is to IP ban them. If however it is a bunch of copies on different IPs you might have to contact your host about setting a limit on how many pages can be accessed by one IP in a given time period (bad bots like to spam page requests really fast).
Very true. I was going to mention that bad bots ignore the robots.txt file, but I figured I'd covered the basics and didn't want to get bogged down in technicalities. One can use the .htaccess file to ban bots, IP addresses, and hot-linking (to name the top three), but that is probably getting a bit too technical for here. ;)
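For the curious, here's roughly what that looks like in an Apache .htaccess file. The IP address and bot name below are made-up examples, so substitute the real offenders from your webstats:

# Ban a single troublesome IP (example address)
Order Allow,Deny
Allow from all
Deny from 203.0.113.5

# Flag requests whose User-Agent contains "BadBot" (example name), then deny them
SetEnvIfNoCase User-Agent "BadBot" bad_bot
Deny from env=bad_bot

Unlike robots.txt, this is enforced by the server itself, so a bot can't just ignore it.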