By the way, by secure I do not mean HTTPS sites, but sites that use forms authentication. I recently had a situation where I needed to make a request to a web page. The page loads default user credentials from configuration (I know that is not secure, but it's an intranet site, for internal use only). The request was to be made from a console application, so I thought using HttpWebRequest would be good enough. But on subsequent requests, the user was no longer considered authenticated, whereas when I simulated the same thing in a web browser, everything worked fine. Since I know my page does not differentiate requests by browser type, I identified that what was missing was the ability to carry cookies between requests. So I wanted to make a request and support cookies at the same time. There are a lot of ways to make an HTTP request in .NET, but for now I will only show the code that uses HttpWebRequest.
void MakeRequest(string url)
{
    Console.WriteLine("Making request to: " + url);

    //WebClient wc = new WebClient();
    //wc.UseDefaultCredentials = true;
    //wc.CachePolicy = new RequestCachePolicy(RequestCacheLevel.Default);
    //StreamReader sr = new StreamReader(wc.OpenRead(url));

    HttpWebRequest hwr = WebRequest.Create(url) as HttpWebRequest;
    hwr.Credentials = CredentialCache.DefaultCredentials; // Integrated (Windows) authentication
    hwr.CookieContainer = new CookieContainer();          // without a container, cookies are silently dropped

    // Dispose the reader (and the underlying response stream) when done
    using (StreamReader sr = new StreamReader(hwr.GetResponse().GetResponseStream()))
    {
        while (!sr.EndOfStream) Console.WriteLine(sr.ReadLine());
    }
    Console.WriteLine("Request complete");
}
The commented-out code uses WebClient, but it does not work out as expected: WebClient does not expose a CookieContainer, so cookies from the response are not kept.
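If you would rather stay with WebClient, one known workaround is to subclass it and attach a CookieContainer in the overridden GetWebRequest. Here is a minimal sketch; the class name CookieAwareWebClient is my own, not part of the framework:

```csharp
using System;
using System.Net;

// Sketch: a WebClient that carries cookies between calls by sharing
// one CookieContainer across every request it creates.
class CookieAwareWebClient : WebClient
{
    private readonly CookieContainer cookies = new CookieContainer();

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        HttpWebRequest httpRequest = request as HttpWebRequest;
        if (httpRequest != null)
            httpRequest.CookieContainer = cookies; // attach the shared cookie jar
        return request;
    }
}
```

With this subclass, the commented-out WebClient code above would keep its cookies across OpenRead calls.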
Essentially, all we did was create an HttpWebRequest, set Credentials (to support Integrated Security), and assign a CookieContainer so that cookies set by the server are captured. And it works!
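One caveat: a new CookieContainer per call only preserves cookies across the redirects within that single GetResponse. To stay authenticated across separate requests, share one container between them. A sketch, with made-up intranet URLs:

```csharp
using System.Net;

// Sketch: one CookieContainer shared by both requests, so the
// forms-auth cookie set by the first response is replayed on the second.
// The URLs below are hypothetical placeholders.
CookieContainer jar = new CookieContainer();

HttpWebRequest first = (HttpWebRequest)WebRequest.Create("http://intranet/login.aspx");
first.Credentials = CredentialCache.DefaultCredentials;
first.CookieContainer = jar;
first.GetResponse().Close(); // server writes the auth cookie into 'jar'

HttpWebRequest second = (HttpWebRequest)WebRequest.Create("http://intranet/data.aspx");
second.Credentials = CredentialCache.DefaultCredentials;
second.CookieContainer = jar; // same container, so the cookie is sent back
second.GetResponse().Close();
```

In the console application above, that would mean creating the container once and passing it into MakeRequest rather than constructing a fresh one inside the method.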