In this video post you will learn how to get and install a free SSL certificate. This free SSL certificate is from CloudFlare, but due to the lack of step-by-step installation guidelines, we as developers tend to ignore it. Using CloudFlare SSL is a great advantage because you get other services for free too. Of course, with the paid plans you get all the gems. Remember, this never affects your SEO rankings.
Now, let's begin talking about this free SSL certificate. CloudFlare sits between your users and your hosting server, so any potential attack is handled by CloudFlare before it reaches your server. Your users see the SSL certificate on the leg of the connection between their browser and CloudFlare's servers.
If you already have a website running without SSL and you want to try this, I would recommend taking a screenshot of your domain's NS, CNAME, A, AAAA, and MX records before you begin.
If you don't want any downtime, the steps are: copy whatever DNS settings you have on your domain dashboard (including CNAME, A, AAAA, MX, etc.) to CloudFlare and set the Flexible SSL option on the Crypto page. After that, wait up to 24 hours, or for an email confirmation from CloudFlare regarding SSL issuance.
Once you get the confirmation, just update your domain dashboard to use CloudFlare's NS records — that's it. Wait a few hours for DNS propagation and your site will start serving over SSL.
Now here's the video that will guide you end to end through the setup.
In this post you will find code snippets to migrate your classic Azure VMs that are part of virtual networks. That is, you will learn how to migrate Azure VMs on a VNet from Azure Service Management (Classic, or ASM) to Azure Resource Manager (ARM). Without too much discussion, let's go step by step.
You need to have PowerShell with the Azure modules installed. In the migration process, you first sign in to your subscription the modern way (ARM) to prepare the migration, and then sign in to the same subscription the classic way (ASM) to perform the migration and commit it.
Let's log in to ARM first.
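The post doesn't show the sign-in command itself; with the AzureRM module of that era it is a one-liner:

```powershell
# Sign in to the subscription using the ARM (AzureRM) module
Login-AzureRmAccount
```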
Store the subscription id in a variable:
$SubID = "1345e4-4561-1bd7-55c1-e3848012qw4r"
Select the subscription using the variable above:
Select-AzureRmSubscription -SubscriptionID $SubID
Now, let's prepare the migration on ARM; this operation may take a few minutes.
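The preparation step the post refers to is registering the migration resource provider on the ARM side; a sketch, assuming the AzureRM module:

```powershell
# Register the classic-migration resource provider (one-time step)
Register-AzureRmResourceProvider -ProviderNamespace Microsoft.ClassicInfrastructureMigrate

# Check the status; wait until RegistrationState reports "Registered"
Get-AzureRmResourceProvider -ProviderNamespace Microsoft.ClassicInfrastructureMigrate
```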
Then, on the classic (ASM) side, select the subscription using the subscription id that we stored in the variable above.
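The post skips the classic-side sign-in; before the ASM commands below will work, sign in with the classic module as well:

```powershell
# Sign in using the classic (ASM) Azure module
Add-AzureAccount
```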
Select-AzureSubscription -SubscriptionID $SubID
Now, let's get the virtual network name that we will use in the next step.
Get-AzureVNetConfig -ExportToFile "C:\Users\Abhimanyu K Vatsa\Desktop\VNetConfig.xml"
Now, from the exported XML, copy the value of the VirtualNetworkSite name attribute, e.g. VirtualNetworkSite name="demo-vnet-name".
Store the virtual network name in a variable. Remember, we are migrating the virtual network, which will automatically migrate everything associated with it: the network itself, VMs, storage, load balancers, subnets, NICs, etc.
$VNetName = "demo-vnet-name"
Let's run the most powerful command, which prepares the move; it may take a few minutes.
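The command itself isn't shown in the post; with the classic Azure PowerShell module, the prepare step looks like this:

```powershell
# Optionally validate the move first to surface any blockers
Move-AzureVirtualNetwork -Validate -VirtualNetworkName $VNetName

# Prepare the move; this may take several minutes
Move-AzureVirtualNetwork -Prepare -VirtualNetworkName $VNetName
```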
If you log in to the portal now, you will notice a new resource group created with the same name but a -Migrated suffix. Don't worry if you see everything doubled: these are exact copies of the resources you had in classic. Once you run the commit command, all your classic resources will be removed and you will have the migrated resources on ARM, with everything working as it did before. Note that this operation may also take some time.
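Based on the same classic module, the commit (and its escape hatch) would look like this:

```powershell
# If the prepared resources look right, commit the move;
# this removes the classic resources and is irreversible
Move-AzureVirtualNetwork -Commit -VirtualNetworkName $VNetName

# Or, if something looks wrong, abort and roll back to classic:
# Move-AzureVirtualNetwork -Abort -VirtualNetworkName $VNetName
```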
Creating human-friendly URLs is an important goal in a content management system. Recently a developer asked me about this: he always sees an 'id' in the URL in his MVC application, which is not human friendly. He mentioned Stack Overflow as an example when explaining the issue to me, so let's discuss it.
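As a sketch of the Stack Overflow style (/questions/123/some-slug), assuming standard ASP.NET MVC routing — the route, controller, and action names here are illustrative, not from the original question:

```csharp
using System.Web.Mvc;
using System.Web.Routing;

public class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        // Keep the numeric id for database lookups, but append a
        // human-friendly slug that is ignored when resolving the record
        routes.MapRoute(
            name: "QuestionWithSlug",
            url: "questions/{id}/{slug}",
            defaults: new { controller = "Questions", action = "Details",
                            slug = UrlParameter.Optional });
    }
}
```

The action then loads the record by id only; if the incoming slug doesn't match the stored one, you can redirect permanently to the canonical URL.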
In this post you will learn how to use a custom preset for Azure Media Services encoding. But before that, let's look at a case study — an issue I faced.
When I uploaded a 55.5 MB MP4 file and encoded it with the "H264AdaptiveBitrateMP4Set720p" encoder preset, I received the following output files:
Look at the video files highlighted with the green rectangle in the image; their sizes look reasonable given the input file. But the files highlighted with the red rectangle are the *improved* renditions for adaptive streaming, and they look useless if you compare with my example: "a dark line on my face in the video can't be removed by the system automatically... make sense". Here I'm trying to understand the Azure Media Services encoding permutations, but producing files 2-3 times larger than the input is never an acceptable deal.
Why should I pay more for bandwidth and storage on these large files, and how do I convince my clients?
If we do not pass a custom preset file to Azure Media Services, it uses its default preset, which has higher bitrates; it has no logic to check the input file's bitrate and choose the output bitrates accordingly. That is my understanding.
So, I removed the higher-bitrate encodings from my preset file, and that did the job. The one disadvantage is that any superb source quality will be lost, because the preset is hardcoded for every video file. But at least I now have the opportunity to set the bitrate dynamically, based on the video being uploaded through our service, and then send the new bitrate to Azure Media Services. Check the issue log here.
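For illustration, a trimmed custom preset might keep only the lower-bitrate layers. This is a sketch in the JSON style used by Media Encoder Standard; the exact schema depends on which encoder you target, so treat the layer values as placeholders rather than my actual preset:

```json
{
  "Version": 1.0,
  "Codecs": [
    {
      "Type": "H264Video",
      "H264Layers": [
        { "Profile": "Auto", "Bitrate": 1500, "Width": 1280, "Height": 720, "FrameRate": "0/1" },
        { "Profile": "Auto", "Bitrate": 650, "Width": 640, "Height": 360, "FrameRate": "0/1" }
      ]
    },
    { "Type": "AACAudio", "Bitrate": 128 }
  ],
  "Outputs": [
    {
      "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
      "Format": { "Type": "MP4Format" }
    }
  ]
}
```

Dropping the layers whose bitrate exceeds the source's keeps the adaptive set from ballooning past the input file size.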
MVP Again :) — 1st July, 6:30 PM IST has a very special meaning to me; the day starts full of expectations and a racing heartbeat, which lasts until the moment the email from Microsoft arrives. Earlier we had a trick to learn the renewal status even before the official confirmation, but that hole is now closed ;)
So, once again Microsoft has awarded me the MVP (Most Valuable Professional) award, for the 5th time in a row, in the Visual Studio and Development Technologies category (earlier it was ASP.NET/IIS). I'm honored to be part of the Microsoft MVP Program; this is one of the most prestigious awards to me.
Here is the body of the mail that I received:
I'd like to thank my family, all my friends, the Microsoft India MVP group, Biplab Paul (India MVP Program Lead), Gandharv Rawat, and my blog readers and followers. A very, very special thanks to 👩, who supports me every day.
The main purpose of caching is to dramatically improve the performance of your application. This is output caching: whatever a page renders is cached, and the exact same output is displayed to everyone.
I recommend you read the Output Caching in MVC post before reading on, because you should be very careful when using any caching mechanism. Or, if you already know output caching, keep reading here.
The biggest problem you face
If you display the user's login status on a page that you want to cache, you need to be careful. The output cache attribute [OutputCache(...)] caches everything on the page; it cannot exclude some portion, like the login status, from being cached.
In this situation, the best caching library to use is Donut Caching (aka Donut Output Caching). Let's understand how to use it.
Using Donut Caching
The best way to add donut caching to your MVC project is to use the NuGet package. From within Visual Studio, select Tools | Library Package Manager and then choose either Package Manager Console or Manage NuGet Packages. Via the console, just type install-package MvcDonutCaching and hit return. From the GUI, search for MvcDonutCaching and click the Install button.
Excluding from being cached
The package adds several overloads to the built-in Html.Action HTML helper. The extra parameter in each overload is named excludeFromParentCache. Set this to true for any action that should not be cached, or should have a different cache duration from the rest of the page.
@Html.Action("Login", "Account", true)
Here Login is an action method inside the Account controller; you should define it like:
public class AccountController : Controller
{
    public ActionResult Login()
    {
        // render the current login status; this part stays uncached
        return PartialView();
    }
}
Caching the rest of the page
The package also includes a DonutOutputCacheAttribute to be used in place of the built-in OutputCacheAttribute. This attribute is typically placed on every controller action that needs to be cached.
You can either specify a fixed duration:
[DonutOutputCache(Duration = 300)]
public ActionResult Index()
Or, use a cache profile:
[DonutOutputCache(CacheProfile = "TenMins")]
public ActionResult Index()
If you are using cache profiles, be sure to configure them in web.config. Add the following within the system.web element:
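The configuration itself is missing from the post; a minimal sketch, assuming the "TenMins" profile name used above (duration is in seconds):

```xml
<caching>
  <outputCacheSettings>
    <outputCacheProfiles>
      <add name="TenMins" duration="600" varyByParam="*" />
    </outputCacheProfiles>
  </outputCacheSettings>
</caching>
```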
I faced this error when an Azure Web App deployment started after Continuous Integration. Here's the detailed error message:
-Credential parameter can only be used with Organization ID credentials. For more information, please refer to http://go.microsoft.com/fwlink/?linkid=331007&clcid=0x409 for more information about the difference between an organizational account and a Microsoft account. There was an error with the Azure credentials used for deployment.
If you face this issue, here's the quick way to fix it:
Edit the existing build definition, select "Azure Web App Deployment", and then click "Manage".
You will now be redirected to a new tab.
In the new tab, click "Update service configuration"; this opens a dialog box where you need to select the "Certificate Based" option and fill in the details. You can click "Publish settings file" to download a file that contains everything you need to fill in here.
Now click the "OK" button to save the changes, then fire another build; it will work now.