Sunday, September 19, 2010

Good Backlinks for your website

What are Backlinks?
Backlinks are links on other websites that point to your website. Without enough links pointing at your website, it is unlikely to ever achieve high keyword rankings.
How to Find Good Backlinks
Website ranking involves more than just putting your website online. The old adage, "Build it and they will come," is no longer true. You need quality content on your website, but you also need to focus on getting good, relevant backlinks. The kind of links you gather and where each link is placed on the other site will also determine the amount of traffic you get and your ranking in the search engines.

Google and Yahoo rank websites according to how many quality backlinks they find pointing at your website. Quality is determined by how relevant the other site is to yours, where the link is placed on the page (within the content is best), and how many other links are on the page (fewer is better). A large number of links pointing at your site does not guarantee success. You need quality backlinks to rise to the first page of results, and the more you can find, the better.

Google started devaluing traded links in the fall of 2007 and started penalizing directories that used dishonest means to raise their own PR so they could sell links. Google penalized those directories by dropping their PR (PageRank) to zero and their keyword rank so far that they didn't even rank for their own business name. This also affected the rank of sites that bought links from those directories, because their link count dropped.

If you have been buying links from directories and your PR or keyword rank is dropping, it may be because Google penalized those directories. Penalized directories often add rel="nofollow" to their links to protect themselves, so you are no longer credited for the links you bought, which reduces your PR and, in turn, your keyword rank.

Google also penalizes sites for getting involved in Triangular Linking Schemes, i.e., "I'll link to your site from my site if you link to my other site" (they do this so it doesn't look like a link trade but Google knows how to spot these also).

Therefore the best way to encourage people to link to your content is to provide information that is useful to your visitors.

What are Quality Backlinks?

* Few links on a page
* A link within the body content
* Link from a relevant site

Google recently changed its algorithm: an excess of traded links or purchased links, involvement in triangular linking schemes, or buying hundreds or thousands of links all at once raises a red flag, and Google may apply a filter to your site that prevents it from ranking well.

An occasional, relevant link trade is OK, and paying for a human review (not for the submission itself) in the Yahoo directory or another human-reviewed directory is OK, but anything in excess can cause problems with your ranking in Google, because Google can tell when link gathering is not natural.

Google has to do this because too many webmasters find some new trick that helps their keyword rank (like hidden text), everyone else follows suit, and then Google has to clamp down on that practice -- until the next time someone comes up with a "bright idea".

So anything you do to try to manipulate your rank in Google can draw a red flag, and your site could be banned, penalized, or have its keyword rank filtered out, i.e., you won't be able to rank for anything no matter what you do.

Best Linking Practice
The best, and easiest, method to gather links is to provide something on your website that people will want to link to, i.e., if you build bridges, offer free bridge-building plans; if you sell dessert products, offer free recipes for products you don't offer; if you provide a service, write up a FAQ about your service, etc. You can also provide a service or product for free if they will provide a backlink to your site. On LWD, I provide a lot of HTML tips, CSS tips, graphics tutorials, and articles on keyword ranking and other topics, which bring in a lot of my traffic and backlinks without my even asking.

Finding Backlinks For Your Website
Find friends, businesses, individuals, or organizations who are willing to link to your website as soon as it's online; then you won't need to submit to search engines, because they can find your link on a site they have already indexed. Some of them may require a link trade, but try to get mostly one-way links.

Find Good Backlinks
Google uses the age of the site and backlinks as well as PR to rank sites (along with many other factors), so unless you want to PayPerClick for the life of the site, you need to gather a lot of links and write a lot of content to rank as high as possible.

Look for websites relevant to what your site offers but with a different focus, e.g., submit to a gardening site if you sell garden tools.

Another way to find good backlinks is to search for your major keyword in your favorite search engine and note which of your competitors ranks highest. Then search for their domain name, look for sites that have accepted their link, and try to submit your website there as well.
Other Methods of Getting Good Backlinks

* Find Directories in Your Own Niche: Submit to directories in your own niche, i.e., if you sell or offer information about vitamins, look for directories focused on health products.

* Free Submission Directories: If you aim for free submissions from directories, you may need to request about 10 times as many backlinks as you actually need, since very few free links are accepted by directories claiming to accept free submissions. They do this so they won't be penalized for offering only paid links. When submitting, vary your anchor text (the title of your site) so there is a wide variety of keywords to rank for; however, you will probably have to pay for anything other than your official site name.

Also see:
Directories that Pass PR
and
Directories to Avoid
* Write Articles: Write quality articles related to the focus of your website and submit them to other sites. Provide a link within the article to a page on your site related to the article's focus -- not necessarily your home page. Do NOT post the same article on your own website, or you may be penalized for duplicate content.

* Submit Press Releases: Only submit press releases when you have a valid reason for one -- any new development on your site, or an original focus with your products that the press release's readers will be interested in.

* Blogging: Set up a blog on a separate site related to the topic of your site and provide information of value to your visitors (that you don't provide on your own site), with a link to your website within the post where appropriate. Don't link from your website to your blog, or the links will be discounted.

* Participate in Forums: Find forums related to your website's focus and post often, so you get noticed and build up trust; people will then click the link in your profile or signature and may link to your site from their own sites or in other forums, showing where they found valuable information. Most forums now put rel="nofollow" on all outgoing links, so this may not benefit you beyond some traffic.

How Many Backlinks is Enough?
That depends on your major keywords. Are they popular? If so, you will have a struggle to beat your competition and may have to resort to PayPerClick along with backlinks.

If very few websites advertise what you offer then you may not need very many backlinks to rank high in the search engines but your traffic may still be minimal because of your site being so specialized. In that case diversify and add more information.

Getting Backlinks is an On-going Process
Your backlinks need to be replenished each year because some sites along with their backlinks go offline. Your competitors are probably eagerly seeking more backlinks to outrank you so unless you actively seek more backlinks each year your rank may drop.

Also see
Backlinks to Avoid
and
Keyword Ranking Strategy
Pay Per Click is only temporary
Getting good quality backlinks is a lengthy process and can take several months -- unless you use PayPerClick (Google's AdWords), which can achieve almost immediate results. But those results end as soon as you quit paying, so it's a good idea to gather links at the same time.

If you would like your web site analyzed for keyword ranking please check out the Website Evaluations page. If you are looking for a web designer see the How to Find a Web Designer page first.

Monday, July 12, 2010

GZip compression with ASP.NET Content

After I posted the GZip Script Compression module code a while back, I've gotten a number of questions regarding GZip and compression in ASP.NET applications, so I thought I'd show a couple of other ways you can use the new GZipStream class in ASP.NET.


The beauty of this new GZip support is how easy it is to use in your own ASP.NET code. I use the GZip functionality a bit in my WebLog code. For example, the cached RSS feed is GZip encoded, using a GZipStream over Response.OutputStream fed to an XmlTextWriter:



Response.ContentType = "text/xml";

Stream Output = Response.OutputStream;

if (Westwind.Tools.wwWebUtils.IsGZipSupported())
{
    GZipStream gzip = new GZipStream(Response.OutputStream, CompressionMode.Compress);
    Response.AppendHeader("Content-Encoding", "gzip");
    Output = gzip;
}

Encoding Utf8 = new UTF8Encoding(false); // No BOM!
XmlTextWriter Writer = new XmlTextWriter(Output, Utf8);
Writer.Formatting = Formatting.Indented;

Writer.WriteStartElement("rss");

// ... write the feed content ...

Writer.WriteEndElement(); // rss

Writer.Close();
Response.End();





GZipStream acts like a stream filter: you attach it to an existing stream (here the Response.OutputStream), write your data into the GZipStream, and it compresses that data on the way into the actual Response stream. It's super easy to do.



However, you should always check whether the client supports GZip, which the IsGZipSupported() helper method does:



/// <summary>
/// Determines if GZip is supported by the client
/// </summary>
public static bool IsGZipSupported()
{
    string AcceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"];

    // Parentheses matter here: without them, && binds tighter than ||,
    // and a null Accept-Encoding header would throw on the deflate check.
    if (!string.IsNullOrEmpty(AcceptEncoding) &&
        (AcceptEncoding.Contains("gzip") || AcceptEncoding.Contains("deflate")))
        return true;

    return false;
}



This ensures that you don't encode content when the client doesn't understand GZip and would be unable to read it. Note that the code checks for either gzip or deflate, so it assumes that before encoding you'll pick the matching algorithm.

GZip in Page Content

Speaking of encoding: the above was XML and raw Response stream encoding, but you can also apply GZip compression very easily at the page level, or anywhere in ASP.NET code (such as in an HTTP handler). In fact, you can use a very generic mechanism to encode any output by using a Response.Filter, which the following helper method (also in wwWebUtils) demonstrates:



/// <summary>
/// Sets up the current page or handler to use GZip through a Response.Filter.
/// IMPORTANT: You have to call this method before any output is generated!
/// </summary>
public static void GZipEncodePage()
{
    HttpResponse Response = HttpContext.Current.Response;

    if (IsGZipSupported())
    {
        string AcceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"];
        if (AcceptEncoding.Contains("deflate"))
        {
            Response.Filter = new System.IO.Compression.DeflateStream(Response.Filter,
                                  System.IO.Compression.CompressionMode.Compress);
            Response.AppendHeader("Content-Encoding", "deflate");
        }
        else
        {
            Response.Filter = new System.IO.Compression.GZipStream(Response.Filter,
                                  System.IO.Compression.CompressionMode.Compress);
            Response.AppendHeader("Content-Encoding", "gzip");
        }
    }

    // Allow proxy servers to cache encoded and unencoded versions separately.
    // Note: Vary names the *request* header the response depends on,
    // so it must be Accept-Encoding (not Content-Encoding).
    Response.AppendHeader("Vary", "Accept-Encoding");
}



You can now take this helper function and use this in page level code. For example, the main page in my blog which is HUGE (frequently around 500k) uses it like this:



protected void Page_Load(object sender, EventArgs e)
{
    wwWebUtils.GZipEncodePage();

    Entry = WebLogFactory.GetEntry();

    if (Entry.GetLastEntries(App.Configuration.ShowEntryCount, "pk,Title,Body,Entered,Feedback,Location") < 0)
        throw new ApplicationException("Couldn't load WebLog Entries: " + Entry.ErrorMessage);

    this.repEntries.DataSource = this.Entry.EntryList;
    this.repEntries.DataBind();
}



That’s it. One line and the page will now conditionally GZip encode if the client supports it.



If you really wanted to, you could take this one step further and create a module that automatically sets the Response.Filter early in the pipeline and compresses all content.



However, remember that GZip encoding applies compression on the fly, so there's some overhead: you are basically spending more CPU on your content to reduce the output size. If your content is not that large to start with, there's probably not much point in compressing it, so I don't think wholesale compression of dynamic content is a good idea.


Caching

But one thing that can mitigate the overhead of GZip compression is caching.



Ah yes, both the RSS feed and the home page are usually heavily cached; the content doesn't change all that frequently, so there's no reason to keep re-generating it, right? If you use GZip on your content, you have to be careful to cache both the GZipped content and the non-encoded content, or else you'll feed garbage to clients that don't understand GZip.



So for example the default page has:



<%@ OutputCache Duration="60" VaryByParam="none" VaryByCustom="GZIP" %>



And then a custom Global.asax handler that looks like this:



public override string GetVaryByCustomString(HttpContext context, string custom)
{
    if (custom == "GZIP")
    {
        if (Westwind.Tools.wwWebUtils.IsGZipSupported())
            return "GZip";
        return "";
    }

    return base.GetVaryByCustomString(context, custom);
}



This results in up to two versions of the page being cached: one GZip-encoded and one plain.



And there you have it: GZipped content is easy to create in ASP.NET 2.0/.NET 2.0, and judiciously applied it can save significant bandwidth. On my homepage, which is close to 500k of blog text content (gotta review that), the GZipped size is 55k or so -- nearly a 90% reduction in size. I'd say that's well worth it, especially when caching is added.

Sunday, April 11, 2010

Image resize code in c#

public byte[] imagereduce(System.Drawing.Image imgInput, int maxImageSize)
    {
        int tWidth;
        int tHeight;
        double widthHeightRatio = (double)imgInput.Width / (double)imgInput.Height;

        // The longer side becomes maxImageSize; the other side is scaled
        // so the image keeps the same proportions.
        if (widthHeightRatio > 1.0)
        {
            tWidth = maxImageSize;
            tHeight = (int)(maxImageSize / widthHeightRatio);
        }
        else
        {
            tWidth = (int)(maxImageSize * widthHeightRatio);
            tHeight = maxImageSize;
        }

        System.Drawing.Image.GetThumbnailImageAbort myCallBack = new System.Drawing.Image.GetThumbnailImageAbort(ThumbnailCallback);
        System.Drawing.Image myThumbnail = imgInput.GetThumbnailImage(tWidth, tHeight, myCallBack, IntPtr.Zero);

        // Save the thumbnail as JPEG into a MemoryStream and return the raw bytes.
        System.IO.MemoryStream ms = new System.IO.MemoryStream();
        myThumbnail.Save(ms, ImageFormat.Jpeg);
        byte[] bitmapData = ms.ToArray();
        return bitmapData;
    }
    public bool ThumbnailCallback()
    {
        return false;
    }
    public bool checkext(string ext)
    {
        // Accept the extension with or without a leading dot.
        ext = ext.ToLower().TrimStart('.');
        return ext == "jpg" || ext == "jpeg" || ext == "png" || ext == "bmp" || ext == "gif";
    }

--
www.cinehour.com

Wednesday, April 7, 2010

DATE MANIPULATIONS IN T-SQL QUERIES

Many times when working with date types in SQL we need to remove the time portion and leave just the date. There are a number of ways to accomplish this, but this article will focus on my favorite, the DATEADD/DATEDIFF (hereafter referred to as DADD) method.

For a long time, I knew about this method, but I could never remember exactly where all the commas and zeroes and parens went (I always had to look it up), so I didn't always use it. Until I really sat down and figured out how it worked, it was very hard to remember. In this article, I intend to explain the concept behind the method, give quite a few useful examples of how to apply it, and cover a couple of things to watch out for.

My most common usage involves stripping the time portion off of a day or GETDATE(). The DADD method of doing that is:

SELECT DATEADD(dd, DATEDIFF(dd,0,GETDATE()), 0)

I'm writing this article on February 27, 2010 and as of right now,

GETDATE() = 2010-02-27 17:31:42.670
DADD: = 2010-02-27 00:00:00.000

While that may appear a bit bewildering at first, it's really quite basic. Let's break it down. First, take the inner DATEDIFF portion.

SELECT DATEDIFF(dd,0,GETDATE()) = 40234

What this portion is doing is figuring out the number of days that have passed between 0 (If you cast 0 as a date in SQL you get 01/01/1900) and today. That number is 40234.

The second portion of the equation is adding that number of days to 0. Because it is only adding the days and not the time portion, you get the very start of the day. With that in mind, it's a bit easier to remember the entire thing because you can just start from the datediff and add the dateadd. You're figuring out the number of days between 0 and today and then adding that back to 0.

DATEDIFF(dd,0,GETDATE())          -- Days between 0 and today
DATEADD(dd, <that number>, 0)     -- Add that number of days back to 0

The same concept works for many different time calculations. For instance, you can swap days out for weeks, months, or years:

SELECT DATEADD(wk, DATEDIFF(wk,0,GETDATE()), 0)
--: 2010-02-22 00:00:00.000 First day of the week.

SELECT DATEADD(mm, DATEDIFF(mm,0,GETDATE()), 0)
--: 2010-02-01 00:00:00.000 First day of the month.

SELECT DATEADD(yy, DATEDIFF(yy,0,GETDATE()), 0)
--: 2010-01-01 00:00:00.000 First day of the Year.

You can use a value other than zero in the dateadd portion to add or remove time. The queries below add or remove 2 days. Because you're literally adding days, you don't need to worry about whether you cross over into a different month.

SELECT DATEADD(dd, DATEDIFF(dd,0,GETDATE()), 2) 
--: 2010-03-01 00:00:00.000 Start of the day 2 days from now

SELECT DATEADD(dd, DATEDIFF(dd,0,GETDATE()), -2)
--: 2010-02-25 00:00:00.000 Start of the day 2 days ago.

Note that if you change the values that are 0's in all of these, you are only offsetting by days. What if you want to add or subtract a week or month? You can do that by adding it right after the datediff portion.

SELECT DATEADD(mm, DATEDIFF(mm,0,GETDATE()) +2, 0) 
--: 2010-04-01 00:00:00.000 Start of the Month 2 Months from now.

In case you were wondering, this DOES work even if the days would have overflowed. For instance, adding a month to January 31 would still give you February.

SELECT DATEADD(mm, DATEDIFF(mm,0,'20100131') +1, 0) 
--: 2010-02-01 00:00:00.000 Start of next Month

You can use the 'first' theory to find the 'last' of something else. For example, if you wanted the last day of the prior month, you can start with the first day of the month and then subtract a day.

SELECT DATEADD(mm, DATEDIFF(mm,0,GETDATE()), 0) 
--: 2010-02-01 00:00:00.000 First day of the month.

SELECT DATEADD(dd,-1, DATEADD(mm, DATEDIFF(mm,0,GETDATE()), 0))
--: 2010-01-31 00:00:00.000 Add -1 days (Subtract a day).

You could also have just added the number of months to -1 days, effectively subtracting a day here.

SELECT DATEADD(mm, DATEDIFF(mm,0,GETDATE()), -1) 
--: 2010-01-31 00:00:00.000 Add -1 days (Subtract a day).

When you go smaller than a day, you can find the "end" of a day by using milliseconds (SQL 2000/2005) or, if you use datetime2, microseconds/nanoseconds (SQL 2008). It is very important to note when doing this that the datetime data type is only accurate to about 3 ms, not to 1. Subtracting 1 or 2 ms here would do nothing, as it would round back up to the start of the next day; with datetime you will only ever see .990, .993, and .997 for ms. If you use datetime2 (available in SQL 2008), you can be accurate to 100 ns. By default, though, GETDATE() returns a datetime, so you have to cast/convert the value before you can use mcs or ns with dateadd.

SELECT DATEADD(ms,-3, DATEADD(dd, DATEDIFF(dd,0,GETDATE()), 0)) 
--: 2010-02-26 23:59:59.997 End of the Previous Day(Datetime)

SELECT DATEADD(ns,-100,CAST(DATEADD(dd, DATEDIFF(dd,0,GETDATE()), 0) as datetime2))
--: 2010-02-26 23:59:59.9999999 End of the Previous Day (Datetime2)

While the above queries get you as close to the end of the day as possible for the appropriate types, it is usually advised that you avoid this altogether whenever possible. For example, if you wanted all the values for today's date, instead of using >= the start of the day and <= the end of the day, it is recommended that you use >= the start of the day and < the start of tomorrow. This protects you in situations where the type becomes more accurate than you were taking into account. For instance, if datetime became accurate to 1 ms (or the field was changed to a datetime2 field) and you were using the end-of-day method, your queries would suddenly have the potential to miss data for just under 2 ms each day.
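As a sketch of this half-open pattern (the Orders table and OrderDate column are hypothetical names used only for illustration):

```sql
-- All rows for today: >= start of today AND < start of tomorrow.
-- Orders / OrderDate are hypothetical names for illustration.
SELECT *
FROM Orders
WHERE OrderDate >= DATEADD(dd, DATEDIFF(dd, 0, GETDATE()), 0)       -- start of today
  AND OrderDate <  DATEADD(dd, DATEDIFF(dd, 0, GETDATE()) + 1, 0);  -- start of tomorrow
```

This works unchanged whether the column is datetime or datetime2.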




Friday, April 2, 2010

SEO LINK Building

The SEO Tutorial: Link Building


Search engines have always been the primary way of finding information on the internet. Once upon a time, these search engines were many and had names like Lycos, AltaVista, and HotBot. These engines all worked very similarly, listing their results based mostly on website content and keyword use. Of course, this made the results fairly easy to manipulate -- you only had to stuff more keywords into your meta tags and onto your webpage.

Then one day a new search engine arose, one with the curious name of Google. Google decided that the current way of finding the best results was lacking, so they focused on a new technique: they began looking at the links that came into your website. Each link, they figured, was a "vote" for your site; the more people who "voted" for your site, the more likely it was that your site was useful and should be listed high in the results for the keywords that linked to it.

Thus began the rise of Google, as webmasters tried a multiplicity of ways to get links to their sites. A few years back, many sites could manipulate this new link aspect simply by trading links with hundreds of others. But as all smart companies do, Google adapted, and nowadays all links to your site are analyzed to find out what quality of "vote" each link is. In other words, the more quality links you get to your site, the better your site does. So with that in mind, let's consider two points: first, what makes a quality link, and second, how you get these links.


What makes a quality link
Here are a few key aspects of a good inbound link. Some of these aspects are out of your control, but you can still improve how people choose to link to you:


How to get quality links
So how do you get sites to link to you? Or rather, how do you get quality sites to link to you?

Directory Links - This is the easiest way to get links, but perhaps the worst. Do not submit your site to 200 directories -- those links are usually worthless and associate your site with low-quality sites. However, submitting your site to a few quality directories is advised (you can sort of determine a site's quality by their Alexa rating). Directories like JoeAnt.com and BlogCatalog.com fall into this category of "free high-quality directories."

Social Links - When you think you have a great piece of content, submit it to the socialsphere -- sites like Digg, Reddit, del.icio.us, StumbleUpon, and any smaller social sites you know of. Don't bank on hitting it big, but you never know, and furthermore, you will still get a little traffic. Also, on any forums or blogs you are active on, make sure your signature file has a link. Just don't become a forum/blog/comment spammer. It doesn't work, makes you and your site look bad, and takes away time from building good links.

Ask for Links - If you have friends online (and most people do), why not ask for a link? Most people are kind enough to do so and it's rather easy. It gets a bit tougher when asking strangers for links, but if you have some sort of "social link" (members of the same forum, similar website focus, etc.), write a quick email asking for a link. If they like your content, they might link to you -- just don't ask them more than once and stay transparent (anything else is usually insulting).

Pay for Links - In recent years, this has become a popular mode of link-building, although Google frowns upon it (and in a recent Page Rank update, has punished various websites for participating). So proceed with caution, but know that many people use this option. You just have to search for "paid text links" to find a slew of services and information. (There are exceptions to this rule: sites like Best of the Web and Yahoo provide paid listing/reviews that actually do help you in SEO.)

Link Bait - The current buzzword of link-building, "link bait" is any content that gives people reason to link to you. What makes good link bait? Contests, free stuff, great articles (like "101 ways to _______"), or anything else people find worth linking to. Of course, make sure your great idea gets in front of people by utilizing the social websites and any contacts you might have in the industry.



High Quality Image Thumbnails in C#

Why Does it Happen?

Image formats like JPEG can store a thumbnail inside the same file. The System.Drawing GetThumbnailImage method checks whether a thumbnail image is embedded in the file and, if one is found, returns that thumbnail scaled to the width and height you requested. The problem occurs when the embedded thumbnail is smaller than the size you requested: the produced thumbnails become pixelated, because stretching an image to a larger size reduces image quality.

Solution

First of all you will need to include the reference of following namespaces

using System.Drawing;
using System.Drawing.Imaging;


Use the following code to create High Quality Thumbnail/Resize the image.

string originalFilePath = "C:\\originalimage.jpg"; //Replace with your image path
string thumbnailFilePath = string.Empty;
 
Size newSize = new Size(120,90); // Thumbnail size (width = 120) (height = 90)
 
using (Bitmap bmp = new Bitmap(originalFilePath))
{
    thumbnailFilePath = "C:\\thumbnail.jpg"; //Change the thumbnail path if you want
 
    using (Bitmap thumb = new Bitmap((System.Drawing.Image)bmp, newSize))
    {
        using (Graphics g = Graphics.FromImage(thumb)) // Create Graphics object from the thumbnail bitmap
        {
            g.SmoothingMode = System.Drawing.Drawing2D.SmoothingMode.HighQuality;
            g.InterpolationMode = System.Drawing.Drawing2D.InterpolationMode.High;
            g.CompositingQuality = System.Drawing.Drawing2D.CompositingQuality.HighQuality;
 
            //Set the JPEG codec (index 1 of GetImageEncoders is typically JPEG;
            //looking it up by MimeType "image/jpeg" is safer)
            System.Drawing.Imaging.ImageCodecInfo codec = System.Drawing.Imaging.ImageCodecInfo.GetImageEncoders()[1];
 
            //Set the parameters for defining the quality of the thumbnail... here it is set to 100%
            System.Drawing.Imaging.EncoderParameters eParams = new System.Drawing.Imaging.EncoderParameters(1);
            eParams.Param[0] = new System.Drawing.Imaging.EncoderParameter(System.Drawing.Imaging.Encoder.Quality, 100L);
 
            //Now draw the image on the instance of thumbnail Bitmap object
            g.DrawImage(bmp, new Rectangle(0, 0, thumb.Width, thumb.Height));
 
            thumb.Save(thumbnailFilePath, codec, eParams);
        }
    }
}



Sunday, March 28, 2010

Code to retrieve distinct fields from table


with t1 as
(
    select top 1000 *
    from photogallery
    where category not in ('movieposters', 'others', 'spicy')
    order by cid desc
)
select distinct top 5 name from t1

--
R Chakrapani Raju
Web Programmer
Alip Infotech Pvt Ltd
www.cinehour.com

Saturday, March 20, 2010

Paragraph Trim function

 

Trim a paragraph to a limited number of words. A very useful function for text-based websites such as news or article archives and lists.

 

protected string paraTrim(string Input, int no_of_words)
    {
        // Split the input into at most no_of_words elements;
        // all remaining words go into the last element.
        String[] Words = Input.Split(new char[] { ' ' }, no_of_words);

        // If we reached the maximum, replace the last element with an ellipsis
        if (Words.Length == no_of_words)
            Words[no_of_words - 1] = "...";
        else
            return Input;  // nothing to do

        // Build the new output string
        String Output = String.Join(" ", Words);
        return Output;
    }
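A quick usage sketch of the method above (the sample sentence is just an illustration):

```csharp
// Keep the first four words, replace the remainder with an ellipsis.
string teaser = paraTrim("The quick brown fox jumps over the lazy dog", 5);
// teaser == "The quick brown fox ..."
```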


Monday, March 15, 2010

IIS URL Rewriting


When a client makes a request to the Web server for a particular URL, the URL-rewriting component analyzes the requested URL and changes it to a different URL on the same server. The URL-rewriting component runs very early in the request-processing pipeline, so it is able to modify the requested URL before the Web server decides which handler to use for processing the request.
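As a sketch, a rule for the IIS URL Rewrite module is declared in web.config; the rule name and URLs below are made up for illustration:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Hypothetical rule: rewrite /products/123 to /product.aspx?id=123 -->
      <rule name="ProductRewrite">
        <match url="^products/([0-9]+)$" />
        <action type="Rewrite" url="product.aspx?id={R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Because the rewrite happens before handler mapping, the page only ever sees the query-string form of the URL.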

ASP.NET Routing

ASP.NET routing is implemented as a managed-code module that plugs into the IIS request processing pipeline at the Resolve Cache stage (PostResolveRequestCache event) and at the Map Handler stage (PostMapRequestHandler). ASP.NET routing is configured to run for all requests made to the Web application.
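As a minimal sketch, a route is registered at application startup; the route name, URL pattern, and page path below are hypothetical, and MapPageRoute requires ASP.NET 4.0:

```csharp
using System;
using System.Web.Routing;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Hypothetical route: requests for /products/123 are handled by ~/Product.aspx
        RouteTable.Routes.MapPageRoute(
            "product-route",   // route name
            "products/{id}",   // URL pattern
            "~/Product.aspx"); // page that generates the response
    }
}
```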


Differences between URL rewriting and ASP.NET routing:

1. URL rewriting is used to manipulate URL paths before the request is handled by the Web server. The URL-rewriting module does not know anything about what handler will eventually process the rewritten URL. In addition, the actual request handler might not know that the URL has been rewritten.
2. ASP.NET routing is used to dispatch a request to a handler based on the requested URL path. As opposed to URL rewriting, the routing component knows about handlers and selects the handler that should generate a response for the requested URL. You can think of ASP.NET routing as an advanced handler-mapping mechanism.

In addition to these conceptual differences, there are some functional differences between IIS URL rewriting and ASP.NET routing:

1. The IIS URL-rewrite module can be used with any type of Web application, which includes ASP.NET, PHP, ASP, and static files. ASP.NET routing can be used only with .NET Framework-based Web applications.
2. The IIS URL-rewrite module works the same way regardless of whether integrated or classic IIS pipeline mode is used for the application pool. For ASP.NET routing, it is preferable to use integrated pipeline mode. ASP.NET routing can work in classic mode, but in that case the application URLs must include file extensions or the application must be configured to use "*" handler mapping in IIS.
3. The URL-rewrite module can make rewriting decisions based on domain names, HTTP headers, and server variables. By default, ASP.NET routing works only with URL paths and with the HTTP-Method header.
4. In addition to rewriting, the URL-rewrite module can perform HTTP redirection, issue custom status codes, and abort requests. ASP.NET routing does not perform those tasks.
5. The URL-rewrite module is not extensible in its current version. ASP.NET routing is fully extensible and customizable.

Tuesday, March 2, 2010

Create thumbnails for images in c#,Image size reduction, Decrease the size of images in c#

public byte[] imagereduce(System.Drawing.Image imgInput, int maxImageSize)
{
    int tWidth;
    int tHeight;
    double widthHeightRatio = (double)imgInput.Width / (double)imgInput.Height;

    // The longer side becomes maxImageSize; the other side is scaled
    // so the image keeps the same proportions.
    if (widthHeightRatio > 1.0)
    {
        tWidth = maxImageSize;
        tHeight = (int)(maxImageSize / widthHeightRatio);
    }
    else
    {
        tWidth = (int)(maxImageSize * widthHeightRatio);
        tHeight = maxImageSize;
    }

    System.Drawing.Image.GetThumbnailImageAbort myCallBack = new System.Drawing.Image.GetThumbnailImageAbort(ThumbnailCallback);
    System.Drawing.Image myThumbnail = imgInput.GetThumbnailImage(tWidth, tHeight, myCallBack, IntPtr.Zero);

    // Save the thumbnail as JPEG into a MemoryStream and return the raw bytes.
    System.IO.MemoryStream ms = new System.IO.MemoryStream();
    myThumbnail.Save(ms, ImageFormat.Jpeg);
    byte[] bitmapData = ms.ToArray();
    return bitmapData;
}
public bool ThumbnailCallback()
{
    return false;
}
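A usage sketch for the methods above; the file paths are hypothetical:

```csharp
// Reduce an image so its longer side is 200 pixels, then save the JPEG bytes.
using (System.Drawing.Image img = System.Drawing.Image.FromFile(@"C:\photo.jpg")) // hypothetical path
{
    byte[] jpegBytes = imagereduce(img, 200);
    System.IO.File.WriteAllBytes(@"C:\photo-small.jpg", jpegBytes);
}
```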

Customized Message pop up Box in c# for websites

It gives you a pop-up message box to show error messages or alerts; add the returned Label to the page to display the message.

public Label msgbox(string text)
{
    Label lbl = new Label();
    // Render a small client-side script block; when the Label is added to the
    // page, the browser shows a JavaScript alert with the message.
    lbl.Text = "<script type='text/javascript'>alert('" + text + "');</script>";
    return (lbl);
}