Set up a private NuGet server

Deploy the NuGet.Server project

The NuGet.Server package on nuget.org:
https://www.nuget.org/packages/NuGet.Server/2.11.3/

The project source:
https://github.com/NuGet/NuGet.Server

Deploy the project to IIS and configure its web.config:

1. Set the apiKey used for pushing packages to the server.
2. Set packagesPath to the folder where all packages are stored; the default is ~/Packages (you need to give the app pool user write permission to it). A sketch of both settings is shown below.
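
In web.config these live in appSettings and look roughly like this (the values are placeholders to replace with your own):

<appSettings>
  <!-- The key clients must supply when pushing or deleting packages. -->
  <add key="apiKey" value="{your_api_key}" />
  <!-- Leave empty for the default ~/Packages, or point it at a folder the app pool user can write to. -->
  <add key="packagesPath" value="D:\NuGetPackages" />
</appSettings>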

Pushing packages

1. Using the NuGet CLI

https://docs.microsoft.com/en-us/nuget/tools/nuget-exe-cli-reference

nuget.exe push -Source {NuGet package source URL} -ApiKey {your_api_key} {your_package}.nupkg

2. Using the .NET Core CLI

https://docs.microsoft.com/en-us/dotnet/articles/core/tools/dotnet-nuget-push


dotnet pack --configuration release
dotnet nuget push foo.nupkg -k 4003d786-cc37-4004-bfdf-c4f3e8ef9b3a -s http://customsource/

Enable authentication for accessing the NuGet server

1. Enable Windows authentication for the server site in IIS.
2. Create a Windows user.
3. Add the repository source with the username and password (using the NuGet CLI); it will be saved into the global NuGet.Config file (NuGet keeps a global config, normally under \Users\{user}\AppData\Roaming\NuGet).

nuget.exe sources add -name {feed name} -source {feed URL} -username {username} -password {password} -StorePasswordInClearText

If you don’t provide the username and password, the server will return a 401 Unauthorized error. In Visual Studio, a dialog will prompt you for credentials.

Restore NuGet packages using a nuget.config per solution

If you work on a new machine and check out the source code of a project, you will need to configure the NuGet source, username, password, etc. To let developers restore the packages and build the project without any hassle, we can create a nuget.config per solution.


<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSourceCredentials>
    <AWS_x0020_Nuget>
      <add key="Username" value="spnugetuser" />
      <add key="ClearTextPassword" value="SearchParty2017" />
    </AWS_x0020_Nuget>
  </packageSourceCredentials>
  <packageSources>
    <add key="AWS Nuget" value="http://nuget.searchparty.com/nuget" />
  </packageSources>
</configuration>

Then you can call

nuget restore

or

dotnet restore

Cassandra tips

Use short column names

Column names take space in each cell, and if you use a big clustering key, it will be copied all over your clustered cells.

Eventually, we have found in some situations that column names (including clustering keys) take up more space than the data we wanted to store! So it is good advice to use short column names and short clustering keys.

You can write data in the future

Using the CQL driver you can explicitly set up the timestamp of each of your key/value pairs. One nice trick is to set up this timestamp in the future: that will make this data immutable until the date is reached.
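
For example, with the DataStax C# driver (a minimal sketch; the keyspace, table and chosen date are made up for illustration), you can append a USING TIMESTAMP clause that lies a few years in the future:

using Cassandra;
using System;

class FutureWrite
{
    static void Main()
    {
        var cluster = Cluster.Builder().AddContactPoint("127.0.0.1").Build();
        ISession session = cluster.Connect("demo");

        // CQL write timestamps are microseconds since the epoch; this one is 1 Jan 2030.
        long futureMicros = new DateTimeOffset(2030, 1, 1, 0, 0, 0, TimeSpan.Zero)
                                .ToUnixTimeMilliseconds() * 1000;

        // Writes issued later with a normal (current) timestamp will not override
        // this cell until 2030 is reached.
        session.Execute("INSERT INTO settings (name, value) VALUES ('banner', 'frozen') " +
                        "USING TIMESTAMP " + futureMicros);
    }
}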

Don’t use TimeUUID with a specific date

TimeUUID is a very common type for Cassandra column names, in particular when using wide rows. If you create a TimeUUID for the current time, this is no problem: your data will be stored chronologically, and your keys will be unique. However, if you force the date, then the underlying algorithm will not create a unique ID! Isn’t this surprising, for a “UUID” (Universal Unique Identifier) field?

As a result, only use TimeUUID if:

  • You use them at the current date
  • You force the date, but are OK with losing other data stored at the same date!

Don’t use PreparedStatement if you insert empty columns

If you have an empty column in your PreparedStatement, the CQL driver will in fact insert a null value in Cassandra, which will end up being a tombstone.

This is a very bad behavior, as:

  • Those tombstones of course take up valuable resources.
  • As a result, you can easily reach the tombstone_failure_threshold (by default at 100,000 which is in fact quite a high value).

The only solution is to have one PreparedStatement per type of insert query, which can be annoying if you have a lot of empty columns! But if you have multiple empty columns, shouldn’t you have used a Map to store that data in the first place?
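
A minimal sketch of that workaround with the DataStax C# driver (the table and column names are made up): prepare one statement per shape of insert, and never bind a null for a column you simply don't have.

using Cassandra;
using System;

static class UserInserts
{
    public static void Insert(ISession session, Guid id, string name, string bio)
    {
        // One prepared statement per "shape" of insert, so an absent bio is never
        // bound as null (binding null would write a tombstone).
        // In real code you would prepare these once and cache them.
        PreparedStatement withBio = session.Prepare(
            "INSERT INTO user_profiles (id, name, bio) VALUES (?, ?, ?)");
        PreparedStatement withoutBio = session.Prepare(
            "INSERT INTO user_profiles (id, name) VALUES (?, ?)");

        BoundStatement bound = string.IsNullOrEmpty(bio)
            ? withoutBio.Bind(id, name)
            : withBio.Bind(id, name, bio);

        session.Execute(bound);
    }
}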

Don’t use Cassandra as a queue

Using Cassandra as a queue looks like a good idea, as wide rows definitely look like queues. There are even several projects using Cassandra as a persistence layer for ActiveMQ, so this should be a good idea!

This is in fact the same problem as the previous point: when you delete data, Cassandra will create tombstones, and that will be bad for performance. Imagine you write and delete 10,000 rows, and then write 1 more row: in order to fetch that one row, Cassandra will in fact process the whole 10,001 rows…

Use the row cache wisely

By default Cassandra uses a key cache, but whole rows can also be cached. We find this rather under-used, and we have had excellent results when storing reference data (such as countries, user profiles, etc) in memory.

However, be careful of two pitfalls (a configuration sketch follows this list):

  • The row cache in fact stores a whole partition in cache (it works at the partition key level, not at the clustering key level), so putting a wide row into the row cache is a very bad idea!
  • If you put the row cache off-heap, it will be outside the JVM, so Cassandra will need to deserialize it first, which will be a performance hit.
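
For example (a sketch using the Cassandra 2.1+ syntax; the table name and cache size are made up), you first give the row cache some memory in cassandra.yaml and then enable it per table, ideally only for small reference tables:

# cassandra.yaml -- the row cache is disabled (0) by default
row_cache_size_in_mb: 256

-- CQL: cache whole partitions of a small reference table only
ALTER TABLE countries WITH caching = {'keys': 'ALL', 'rows_per_partition': 'ALL'};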

Don’t use “select … in” queries

If you do a “select … in” on 20 keys, you will hit one coordinator node that will need to get all the required data, which can be distributed all over your cluster: it might need to reach 20 different nodes, and then it will need to gather all that data, which will put quite a lot of pressure on this coordinator node.

As the latest CQL drivers can be configured to be token aware, you can instead issue 20 token-aware, asynchronous queries. Each of those queries will go directly to a node that stores the requested data, so this will probably perform better than a “select … in”: you save the extra hop through a single coordinator node.
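
A sketch of that pattern with the DataStax C# driver (the table and column names are made up); the same idea applies to the other drivers:

using Cassandra;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

static class MultiGet
{
    // One single-partition query per key instead of one big "SELECT ... IN (...)".
    public static async Task<List<Row>> FetchUsersAsync(ISession session, IEnumerable<Guid> ids)
    {
        // In real code, prepare this once and cache it.
        PreparedStatement byId = session.Prepare("SELECT * FROM users_by_id WHERE id = ?");

        // Each bound statement carries its routing key, so with a token-aware
        // load-balancing policy (the default in recent drivers) it goes straight
        // to a replica that owns that key.
        IEnumerable<Task<RowSet>> queries = ids.Select(id => session.ExecuteAsync(byId.Bind(id)));
        RowSet[] results = await Task.WhenAll(queries);

        return results.Select(rs => rs.FirstOrDefault())
                      .Where(row => row != null)
                      .ToList();
    }
}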

Configure the retry policy when several nodes fail

This of course depends whether you prefer to have high consistency or high availability: as always, the good thing with Cassandra is that this is tunable!

If you want good consistency, you have probably configured your queries to use a quorum (or a local_quorum if you have multiple datacenters), but what happens if you lose 2 nodes, with the usual replication factor of 3? You didn’t lose any data, but as you lost the quorum for some data, you will start to get failed queries! A good compromise is to tune the retry policy and use the DowngradingConsistencyRetryPolicy: this allows you to lower your consistency level temporarily, giving you time to restore one of the failed nodes and get your quorum back.
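
With the DataStax C# driver, for instance, this is a one-liner on the cluster builder (a sketch; note that this policy was deprecated in later driver versions, so check the documentation of the driver you actually use):

using Cassandra;

var cluster = Cluster.Builder()
    .AddContactPoint("127.0.0.1")
    // Queries default to QUORUM...
    .WithQueryOptions(new QueryOptions().SetConsistencyLevel(ConsistencyLevel.Quorum))
    // ...but may be retried at a lower consistency level when too few replicas respond.
    .WithRetryPolicy(DowngradingConsistencyRetryPolicy.Instance)
    .Build();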

Don’t forget to repair

The repair operation is very important in Cassandra, as this is what guarantees that you won’t have forgotten deletes. For example, this can happen when you had a hardware failure, and you bring the node back when some tombstones have expired on other nodes: Cassandra will see this deleted data as some new data (as tombstones have disappeared), and thus this data will be “resurrected” in your cluster.

Repairing nodes should be a regular and normal operation on your cluster, but as this has to be set up manually, we see many clusters where this is not done properly.
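
A typical approach (a sketch; the keyspace name and schedule are yours to choose) is to run a primary-range repair on each node in turn, for example weekly from cron:

nodetool repair -pr my_keyspace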

For your convenience, DataStax Enterprise, the commercial version of Cassandra, provides a “repair service” with OpsCenter, that does this job automatically.

Clean up your snapshots

Taking a snapshot is cheap with Cassandra, and can often save you after doing a wrong operation. For instance, a database snapshot is automatically created when you do a truncate, and this has already been useful to us on a production system!

However, snapshots take up space, and as your stored data grows you will need that space at some point: so a good process is to save those snapshots outside of your cluster (for example, by uploading them to Amazon S3), and then clean them up to reclaim the disk space.
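
For example (a sketch with made-up tag and keyspace names):

nodetool snapshot -t before_migration my_keyspace
nodetool listsnapshots

Copy the snapshot directories (under data/{keyspace}/{table}/snapshots/) to S3, then reclaim the space:

nodetool clearsnapshot -t before_migration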

Further reading: “10 Tips and tricks for Cassandra” and “Modeling data with Cassandra: what CQL hides away from you”.

Vagrant

Vagrant provides easy to configure, reproducible, and portable work environments built on top of industry-standard technology and controlled by a single consistent workflow to help maximize the productivity and flexibility of you and your team.

Vagrant stands on the shoulders of giants. Machines are provisioned on top of VirtualBox, VMware, AWS, etc.

I am using VirtualBox as an example; you can fire up an Ubuntu box with a few lines of configuration.


# -*- mode: ruby -*-
# vi: set ft=ruby :

Vagrant.configure("2") do |config|

  config.vm.box = "ubuntu/trusty64"

  config.vm.synced_folder "./data", "/home/vagrant/data"
  config.vm.provision "shell", path: "./scripts/vagrant/install-glance.sh"

  config.vm.network "forwarded_port", guest: 8983, host: 8984, auto_correct: true

  config.vm.network "private_network", ip: "192.168.33.10"

  config.vm.provider "virtualbox" do |vb|
    # Customize the amount of memory on the VM:
    vb.memory = "8024"
  end

end

You can forward a port from the VirtualBox guest to your host; if the host port is already used by another program, the auto_correct option will automatically assign a new one.


config.vm.network "forwarded_port", guest: 8983, host: 8984, auto_correct: true

You can set up a virtual ip for the box.


config.vm.network "private_network", ip: "192.168.33.10"

Set up a synced folder that can be accessed both inside the VM (over SSH) and on your host machine.


config.vm.synced_folder "./data", "/home/vagrant/data"

After you have created the Vagrantfile, you can call

vagrant up

to fire up the box.

Once the Vagrantfile configuration has been changed, you need to call

vagrant reload

to refresh the virtual machine.

To destroy the virtual machine, call

vagrant destroy

To access publicly available Vagrant boxes:

https://atlas.hashicorp.com/boxes/search

http://www.vagrantbox.es/

Recovering deleted files from an svn repository

To recover a file that you deleted from an SVN repository, it’s first necessary to get the proper name of the file and the revision of the repository it last existed in. To do that (assuming you don’t know, because if you do you have bigger issues), you go to the directory it was in (or as close as you can get to it) and run:

> svn log -v --xml > svn-log.xml

You should be able to find the file you’re looking for and the revision you need in the output log file. Assuming your file’s name is ‘myfile.txt’ and it was in revision 1000, you run the following to recover it:

> svn up -r 1000 myfile.txt
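
Note that “svn up -r” only brings the old copy back into your working copy; if you also want the file restored in HEAD, one way (a sketch, with a made-up repository URL) is to copy it back from the old revision and commit:

> svn copy http://svn.example.com/repo/trunk/myfile.txt@1000 myfile.txt
> svn commit -m "Restore myfile.txt, deleted after r1000"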


C# 6 null check

Like nullable types, null-conditional operators can now be used. Just put a ‘?’ (question mark) after the instance before calling a property on it. You no longer have to write additional if statements to check for null. For example, let’s look at a simple if condition and then see it rewritten with the null-conditional operator in C# 6.0:

[Image: C# 6.0 – simple and nested null-check conditions]
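
A sketch of those two cases (the Employee class and its members are made up for illustration):

public class Employee
{
    public string Name { get; set; }
    public Employee Manager { get; set; }
    public int[] MemberOfGroups { get; set; }
}

public static class NullConditionalDemo
{
    public static void Show(Employee emp)
    {
        // Before C# 6.0: explicit null checks.
        string name = null;
        if (emp != null)
        {
            name = emp.Name;
        }

        // C# 6.0: the null-conditional operator collapses the check...
        string sameName = emp?.Name;

        // ...and nested checks collapse just as nicely.
        string managerName = emp?.Manager?.Name;
    }
}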

Now let’s see how it can be used to return a default value when the condition is not satisfied. In the snippet below, the “??” null-coalescing operator is combined with the null-conditional operator to return a fallback value: if either emp (the employee object) or MemberOfGroups is null, it returns -1:

[Image: C# 6.0 – default values with the null-conditional and null-coalescing operators]
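
A sketch of that case, reusing the hypothetical Employee class above; if either emp or its MemberOfGroups array is null, the expression falls back to -1:

int groupCount = emp?.MemberOfGroups?.Length ?? -1;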

How OAuthWebSecurity obtains emails for different OAuth clients, and why the Microsoft client doesn’t return an email (it doesn’t include the scope “wl.emails”)

I have been playing with MVC 4, SimpleMembership, WebSecurity and OAuthWebSecurity for a while now. I can see that OAuthWebSecurity is a wrapper around DotNetOpenAuth. It registers the clients in the AuthConfig.cs file. That really works and saves me heaps of lines of code.

But soon enough,

1. I find that only the Google client returns an email as the username. Email is still quite important for newsletters, system emails, etc.

2. Twitter doesn’t provide the email via OAuth or any API, which is a shame, but I don’t complain (maybe this has changed without my noticing). So we don’t do anything with it.

3. For Facebook, with DotNetOpenAuth and OAuth 2.0, the client actually includes the scope “email” and returns the email in the “ExtraData” dictionary.

So in the ExternalLoginCallback() method, you can find this line:

            AuthenticationResult result = OAuthWebSecurity.VerifyAuthentication(Url.Action("ExternalLoginCallback", new { ReturnUrl = returnUrl }));

If you query result.ExtraData["username"], it contains the user’s email.
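
For example (a sketch using the standard DotNetOpenAuth AuthenticationResult members):

            if (result.IsSuccessful && result.ExtraData != null && result.ExtraData.ContainsKey("username"))
            {
                string email = result.ExtraData["username"]; // for the Facebook client this holds the email
                // store it against the account for newsletters, system emails, etc.
            }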

4. For Microsoft, it is a nightmare: I find that DotNetOpenAuth doesn’t even include the scope “wl.emails” in its request at all. I am disappointed, but it is not the end of the world.

So I am going to create a custom authentication client to retrieve Microsoft emails.

First, create a class MicrosoftScopedClient that implements the IAuthenticationClient interface. You must implement its two methods (and the ProviderName property).

    public class MicrosoftScopedClient : IAuthenticationClient
    {
        public string ProviderName
        {
            get { return "Microsoft"; }
        }

        public void RequestAuthentication(HttpContextBase context, Uri returnUrl)
        {
            throw new NotImplementedException();
        }

        public AuthenticationResult VerifyAuthentication(HttpContextBase context)
        {
            throw new NotImplementedException();
        }
    }

The next step is to build the authentication URL in the RequestAuthentication() method:

        public void RequestAuthentication(HttpContextBase context, Uri returnUrl)
        {
            string url = baseUrl + "?client_id=" + clientId + "&redirect_uri=" + HttpUtility.UrlEncode(returnUrl.ToString()) + "&scope=" + HttpUtility.UrlEncode(scope) + "&response_type=code";
            context.Response.Redirect(url);
        }

Then I build the VerifyAuthentication() method, which receives the authorization code, exchanges it for an access_token, and then uses the access_token to request the profile:

        public AuthenticationResult VerifyAuthentication(HttpContextBase context)
        {
            string code = context.Request.QueryString["code"];

            // Strip the code parameter so the redirect URI matches the one used in the first request.
            string rawUrl = context.Request.Url.ToString();
            rawUrl = Regex.Replace(rawUrl, "&code=[^&]*", "");

            IDictionary<string, string> userData = GetUserData(code, rawUrl);

            if (userData == null)
                return new AuthenticationResult(false, ProviderName, null, null, null);

            string id = userData["id"];
            string username = userData["email"];
            userData.Remove("id");
            userData.Remove("email");

            AuthenticationResult result = new AuthenticationResult(true, ProviderName, id, username, userData);
            return result;
        }

After building the MicrosoftScopedClient, I need to register it in AuthConfig.cs; now we can feel free to pass any scopes there =)

            OAuthWebSecurity.RegisterClient(new MicrosoftScopedClient(ConfigurationManager.AppSettings["Microsoft.ClientId"].ToString(),
                ConfigurationManager.AppSettings["Microsoft.Secret"].ToString(),
                "wl.basic wl.emails"
                )
                , "Microsoft", null);

Below is a full copy of the MicrosoftScopedClient:

using DotNetOpenAuth.AspNet;
using DotNetOpenAuth.AspNet.Clients;
using DotNetOpenAuth.Messaging;
using Newtonsoft.Json;
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.IO;
using System.Linq;
using System.Net;
using System.Text;
using System.Text.RegularExpressions;
using System.Web;

namespace MicrosoftClient.Filters
{
    public class MicrosoftScopedClient : IAuthenticationClient
    {
        private string clientId;
        private string clientSecret;
        private string scope;

        private const string baseUrl = "https://login.live.com/oauth20_authorize.srf";
        private const string tokenUrl = "https://login.live.com/oauth20_token.srf";

        public MicrosoftScopedClient(string clientId, string clientSecret, string scope)
        {
            this.clientId = clientId;
            this.clientSecret = clientSecret;
            this.scope = scope;
        }

        public string ProviderName
        {
            get { return "Microsoft"; }
        }

        public void RequestAuthentication(HttpContextBase context, Uri returnUrl)
        {
            string url = baseUrl + "?client_id=" + clientId + "&redirect_uri=" + HttpUtility.UrlEncode(returnUrl.ToString()) + "&scope=" + HttpUtility.UrlEncode(scope) + "&response_type=code";
            context.Response.Redirect(url);
        }

        public AuthenticationResult VerifyAuthentication(HttpContextBase context)
        {
            string code = context.Request.QueryString["code"];

            string rawUrl = context.Request.Url.ToString();
            //From this we need to remove code portion
            rawUrl = Regex.Replace(rawUrl, "&code=[^&]*", "");

            IDictionary<string, string> userData = GetUserData(code, rawUrl);

            if (userData == null)
                return new AuthenticationResult(false, ProviderName, null, null, null);

            string id = userData["id"];
            string username = userData["email"];
            userData.Remove("id");
            userData.Remove("email");

            AuthenticationResult result = new AuthenticationResult(true, ProviderName, id, username, userData);
            return result;
        }

        private IDictionary<string, string> GetUserData(string accessCode, string redirectURI)
        {
            string token = QueryAccessToken(redirectURI, accessCode);
            if (token == null || token == "")
            {
                return null;
            } 
            var userData = GetUserData(token);
            return userData;
        }

        private IDictionary<string, string> GetUserData(string accessToken)
        {
            ExtendedMicrosoftClientUserData graph;
            var request =
                WebRequest.Create(
                    "https://apis.live.net/v5.0/me?access_token=" + EscapeUriDataStringRfc3986(accessToken));
            using (var response = request.GetResponse())
            {
                using (var responseStream = response.GetResponseStream())
                {
                    using (StreamReader sr = new StreamReader(responseStream))
                    {
                        string data = sr.ReadToEnd();
                        graph = JsonConvert.DeserializeObject<ExtendedMicrosoftClientUserData>(data);
                    }
                }
            }

            var userData = new Dictionary<string, string>();
            userData.Add("id", graph.Id);
            userData.Add("username", graph.Name);
            userData.Add("name", graph.Name);
            userData.Add("link", graph.Link == null ? null : graph.Link.AbsoluteUri);
            userData.Add("gender", graph.Gender);
            userData.Add("firstname", graph.FirstName);
            userData.Add("lastname", graph.LastName);
            userData.Add("email", graph.Emails.Preferred);
            return userData;
        }

        private string QueryAccessToken(string returnUrl, string authorizationCode)
        {
            var entity =
                CreateQueryString(
                    new Dictionary<string, string> {
                        { "client_id", this.clientId },
                        { "redirect_uri", returnUrl },
                        { "client_secret", this.clientSecret },
                        { "code", authorizationCode },
                        { "grant_type", "authorization_code" },
                    });

            WebRequest tokenRequest = WebRequest.Create(tokenUrl);
            tokenRequest.ContentType = "application/x-www-form-urlencoded";
            tokenRequest.ContentLength = entity.Length;
            tokenRequest.Method = "POST";

            using (Stream requestStream = tokenRequest.GetRequestStream())
            {
                var writer = new StreamWriter(requestStream);
                writer.Write(entity);
                writer.Flush();
            }

            HttpWebResponse tokenResponse = (HttpWebResponse)tokenRequest.GetResponse();
            if (tokenResponse.StatusCode == HttpStatusCode.OK)
            {
                using (Stream responseStream = tokenResponse.GetResponseStream())
                {
                    using (StreamReader sr = new StreamReader(responseStream))
                    {
                        string data = sr.ReadToEnd();
                        // The token endpoint returns JSON such as {"access_token":"...","expires_in":...}
                        var tokenData = JsonConvert.DeserializeObject<Dictionary<string, object>>(data);
                        if (tokenData != null && tokenData.ContainsKey("access_token"))
                        {
                            return tokenData["access_token"].ToString();
                        }
                    }
                }
            }

            return null;
        }

        private static readonly string[] UriRfc3986CharsToEscape = new[] { "!", "*", "'", "(", ")" };
        private static string EscapeUriDataStringRfc3986(string value)
        {
            StringBuilder escaped = new StringBuilder(Uri.EscapeDataString(value));

            // Upgrade the escaping to RFC 3986, if necessary.
            for (int i = 0; i < UriRfc3986CharsToEscape.Length; i++)
            {
                escaped.Replace(UriRfc3986CharsToEscape[i], Uri.HexEscape(UriRfc3986CharsToEscape[i][0]));
            }

            // Return the fully-RFC3986-escaped string.
            return escaped.ToString();
        }

        private static string CreateQueryString(IEnumerable<KeyValuePair<string, string>> args)
        {
            if (!args.Any())
            {
                return string.Empty;
            }
            StringBuilder sb = new StringBuilder(args.Count() * 10);

            foreach (var p in args)
            {
                sb.Append(EscapeUriDataStringRfc3986(p.Key));
                sb.Append('=');
                sb.Append(EscapeUriDataStringRfc3986(p.Value));
                sb.Append('&');
            }
            sb.Length--; // remove trailing &

            return sb.ToString();
        }

        protected class ExtendedMicrosoftClientUserData
        {
            public string FirstName { get; set; }
            public string Gender { get; set; }
            public string Id { get; set; }
            public string LastName { get; set; }
            public Uri Link { get; set; }
            public string Name { get; set; }
            public Emails Emails { get; set; }
        }

        protected class Emails
        {
            public string Preferred { get; set; }
            public string Account { get; set; }
            public string Personal { get; set; }
            public string Business { get; set; }
        }
    }
}