Archive for the ‘MS Technologies’ Category

A more advanced Survey List in SharePoint 2013

July 5, 2016
Hi everyone!

My customer asked for a solution that lets end users evaluate the developed applications. A solution based on the Survey template was proposed, but the default template is just a form with the questions as fields. The interface is not user-friendly when the survey is long!

Let us assume that we have a survey with three questions. The following picture shows the default NewForm:



What is the difference between a custom list and a survey, then? Some interesting features can be cited:

  • The statistics part to evaluate the survey.
  • The branching feature that allows the survey to jump to a given step based on what the user chooses at the current step.

Now, let us develop a more advanced survey using just JavaScript! We want to achieve these objectives:

  • View just one question at a given step
  • Two buttons to move to the next or the previous question
  • Show the Finish button once the user has reached the last question
  • Confirm the answers before saving

Three libraries will be used for different purposes:

  • SP.js for notifications.
  • SPUtility.js for hiding and showing the questions.
  • jQuery, required by SPUtility.

We will simply write a custom HTML file that is referenced in a Content Editor Web Part added to the NewForm.aspx page of your survey:


Here is the content of the Quiz.html:

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="utf-8" />
    <!-- JavaScript file references -->
    <script type="text/javascript" src="/JS/jquery-1.11.2.min.js"></script>
    <script type="text/javascript" src="/JS/sputility.min.js"></script>
    <script type="text/javascript" src="/_layouts/15/sp.runtime.js"></script>
    <script type="text/javascript" src="/_layouts/15/sp.js"></script>
</head>
<body>
    <!-- A table containing the Previous and Next buttons -->
    <table class="ms-rteTable-default" cellspacing="0" style="width: 100px;">
        <tr>
            <td style="width: 100px; height: 20px;"><div id="Previous">&lt;&lt;&lt;</div></td>
            <td style="width: 100px; height: 20px;"><div id="Next">&gt;&gt;&gt;</div></td>
        </tr>
    </table>
    <script type="text/javascript">
        // A variable to keep the current position in the survey
        var i = 0;
        // An array to store the list of the question fields
        var fieldsArray = [];

        // This function is invoked when we click on the Finish button
        function PreSaveItem() {
            var fields = SPUtility.GetSPFields();
            var message = 'Are you sure to save these answers?\n';
            for (var fieldName in fields) {
                message = message + fieldName + ' : ' +
                    SPUtility.GetSPField(fieldName).GetValue() + '\n';
            }
            if (confirm(message)) {
                return PreSaveAction(); // allow the form to be saved
            }
            return false; // cancel the save
        }

        // Notify the user
        function PreSaveAction() {
            SP.UI.Notify.addNotification('Your answers are being saved...', false);
            return true;
        }

        $(document).ready(function () {
            // Hide the form toolbar until the last question is reached
            // (adjust the selector if your masterpage differs)
            $('.ms-formtoolbar').hide();
            var fields = SPUtility.GetSPFields();
            for (var fieldName in fields) {
                // Get the questions list and hide every question
                fieldsArray.push(fieldName);
                SPUtility.GetSPField(fieldName).Hide();
            }
            // Show the first question
            SPUtility.GetSPField(fieldsArray[i]).Show();

            // Next handler
            $('#Next').click(function () {
                // Hide the current question
                SPUtility.GetSPField(fieldsArray[i]).Hide();
                if (i < fieldsArray.length - 1) {
                    i++;
                }
                // Show the form toolbar when we reach the last question
                if (i === fieldsArray.length - 1) {
                    $('.ms-formtoolbar').show();
                }
                // Show the next question
                SPUtility.GetSPField(fieldsArray[i]).Show();
            });

            // Previous handler
            $('#Previous').click(function () {
                // Hide the current question
                SPUtility.GetSPField(fieldsArray[i]).Hide();
                if (i > 0) {
                    i--;
                }
                // Show the previous question
                SPUtility.GetSPField(fieldsArray[i]).Show();
            });
        });
    </script>
</body>
</html>

Let us see the results!

Oh yes, the Next and Previous buttons allow you to navigate between the questions:

First Question:


Second Question:


Third Question:


Now, if we click on the Next button, the Finish button is shown:


Before the answers are saved, a confirmation window asks the user to confirm them:


Finally, a notification label displays a message while the answers are being saved:


Ok that’s all! Hope it helps!



Certificate validation error can cause broken images in reporting services reports

January 16, 2016


I am so happy to meet WordPress readers again!

In some reports, image components are configured as external links using the https protocol. The links work fine in different browsers, and the public certificate is validated.

However, only external links using the http protocol were working correctly while executing the reports. After googling the issue, I found some answers about configuring the Unattended Execution Account to allow access to external resources from reports, like UNC files.

In my case, no proxy and no authentication were required to access web resources.

On the reporting server, I opened an external link in the browser and bingo! My certificate could not be validated. I had to install the root certificate on the server to have the certificate validated.

After that, the images were finally displayed correctly!

Hope it helps!

How to set default smtp addresses for Active Directory contacts using Powershell

May 15, 2014

Hi again,

We are currently working on a critical migration project from a Windows 2003 platform to Windows 2008/2012 R2. We also had to migrate Exchange 2003 to Exchange 2010. We encountered some troubles, but everything is working fine after a few fixes (ElhamdouliLLah!).

One of the problems concerned a big number of Exchange 2003 contacts (created in Active Directory) that could not be opened in Exchange 2010. The cause was that the default primary SMTP address of these contacts was not set. Consequently, it was not possible to send mail to those contacts. We had more than 700 contacts to update. On our 2012 domain controllers, we had to modify the proxyAddresses attribute of each contact, which initially contains two addresses: X400 and smtp.

To set a default SMTP address, the lowercase “smtp” prefix has to be changed to the uppercase “SMTP”.

Let us suppose that we have an OU named “EXP Contacts” containing all our contacts in the “contoso.com” domain. You can find some PowerShell scripts for Active Directory users; the idea is the same, but with contacts things are a bit different.

The PowerShell script to execute on a domain controller or an Exchange server, after importing the Active Directory module, is as follows:

$EXPOUPath = "OU=EXP Contacts,DC=contoso,DC=com"

$EXPContacts = Get-ADObject -Filter 'objectClass -eq "contact"' -SearchBase $EXPOUPath -Properties proxyAddresses

foreach ($EXPContact in $EXPContacts)
{
    foreach ($EXPContactPrxAddress in $EXPContact.proxyAddresses)
    {
        # -cmatch is case-sensitive: match only the lowercase "smtp:" prefix
        if ($EXPContactPrxAddress -cmatch "^smtp:")
        {
            # Build the default address by uppercasing the prefix
            $EXPContactDefaultAddress = "SMTP:" + $EXPContactPrxAddress.Substring(5)
            Set-ADObject $EXPContact -Remove @{proxyAddresses=$EXPContactPrxAddress}
            Set-ADObject $EXPContact -Add @{proxyAddresses=$EXPContactDefaultAddress}
        }
    }
}

Hope it helps.

SCCM ConfigMgr 2012 Support Center Viewer, a useful tool, is available!

February 5, 2014


Hi SCCM Geeks!

Troubleshooting SCCM is not an easy job. SCCM is both centralized (SCCM site roles) and distributed (managed clients), so finding the source of an issue is not obvious.

SCCM generates a lot of log files on the servers and the clients, and studying them is difficult and time consuming.

Microsoft offers us this great tool: SCCM ConfigMgr 2012 Support Center Viewer.

This tool allows you to collect and consolidate logs for troubleshooting purposes, and to simulate some client actions, like requesting client policies.

You can find further information in this blog.

Hope it helps!

After the Forefront family, InfoPath will also die!

February 1, 2014


Hi again!

Microsoft is taking strange decisions, but this one, about the InfoPath technology, is not really a surprise, as the company says it is working on a next forms technology!

Just a few hours ago, Microsoft announced that this beautiful technology is reaching its end of life!

This decision concerns the InfoPath client and the InfoPath service applications on SharePoint Server 2013 and Office 365 SharePoint Online.

Just wait for the SharePoint Conference event coming next month, inchaEllah, to discover the next InfoPath technology.


Advanced Hyper-V replication configuration

August 9, 2013


In the last post, I presented the Capacity Planner for Hyper-V Replica, and in its documentation I discovered that this tool can suggest a value for the number of virtual machines to be transferred in parallel.

I quickly asked Google about this parameter and bingo! A very nice article from Microsoft describes more parameters to configure:

  • DisableCertRevocationCheck
  • MaximumActiveTransfers
  • ApplyVHDLimit
  • ApplyVMLimit
  • ApplyChangeReplicaDiskThrottle

All these parameters can be configured through the registry.
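As a sketch of what that configuration looks like, here is a .reg fragment setting two of the values. The registry path and the value data below are assumptions for illustration only; check the Microsoft article for the exact key and the values recommended for your environment before applying anything:

```
Windows Registry Editor Version 5.00

; Assumed Hyper-V Replica key; verify against the Microsoft article
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Virtualization\Replication]
; Illustrative values only
"MaximumActiveTransfers"=dword:00000003
"DisableCertRevocationCheck"=dword:00000001
```

The Hyper-V Virtual Machine Management service reads these values, so a service restart may be needed for a change to take effect.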


Capacity Planner for Hyper-V Replica; a long story from SCCM!

August 7, 2013

Hi Geeks,

For a customer with about 1500 users, I designed an SCCM 2012 platform using a single primary site, since there is no important subordinate site (to use as a secondary site or another primary site), with these elements:

  1. A site server on a DL 360 G7
  2. A site system server with duplicated roles on a DL 360 G7
  3. 2 SQL Servers configured using the Always On feature on 2 DL 360 G7

All right: for 1500 users, the proposed architecture is highly available. However, the customer changed his mind: SCCM is so critical for him that he wants it available on a secondary site.

My challenge was to find a solution with the same servers, since SCCM 2012 does not support disaster recovery capabilities.

So I thought about virtualization to offer:

  • High availability through a Hyper-V cluster
  • Disaster Recovery capabilities through Hyper-V Replica

The architecture has changed and the following schema describes the involved elements :


  • 2 servers used as Hyper-V cluster nodes. Each node can host two machines: SCCM (a primary site server) and SQL (also configured as a site server with some duplicated roles)
  • 1 server used as a SAN (yes!). The cluster was based on SMB 3!
  • 1 server as the Hyper-V Replica server

Very nice! The designed architecture was deployed successfully (ElhamdouliLLah). However, I encountered some issues with Hyper-V replication, which works fine locally but with big disruptions over the WAN.

My problem was that I was not able to estimate the necessary resources (especially the WAN bandwidth) for my workload.

Fortunately, Microsoft has released this great tool: Capacity Planner for Hyper-V Replica, which can be downloaded from this link.


After configuring and running the tool, it is possible to consult a rich report that covers (from the tool documentation) :

1)      Virtual Machine:

The table lists a set of VMs and VHDs which were considered for capacity planning guidance.

2)      Processor

The table captures the estimated CPU impact on the primary and replica servers, after enabling replication on the selected VMs.

3)      Memory

The table captures the estimated memory requirements on the primary and replica servers, by enabling replication on the selected VMs.

4)      IOPS

There are two tables in this section – one for the primary storage subsystem and the other for the replica storage subsystem.  The attributes for the primary storage subsystem are:

a)      Write IOPS before enabling replication – This captures the write IOPS observed across all the selected VMs for the duration of the run

b)      Estimated additional IOPS during initial replication – Once replication is enabled, the VHD is transferred to the replica server/cluster as part of the ‘Initial Replication’ (IR) operation which can be completed over the network. The IOPS required during this duration is captured in this row.

c)       Estimated additional IOPS during delta replication – Once IR completes, Hyper-V Replica attempts to send the tracked changes every 5 minutes. The additional IOPS required during this operation is captured in this row.

The attributes for the replica storage subsystem are:

a)      Estimated IOPS during IR – During the course of IR, the IOPS impacts on the replica storage subsystem is captured in this row

b)      Estimated IOPS when only the latest point is preserved – While enabling replication, customers will have an option to store only the recovery point or up to 15 additional recovery points (which are spaced at a 1 hour granularity). This row captures the IOPS impact when storing only the latest recovery point.

c)       Estimated IOPS impact when multiple recovery points are used – This row captures the IOPS impact when replication is configured to store multiple recovery points. Hyper-V recovery snapshots are used to store each recovery point. The IOPS impact is independent of the number of points.

5)      Storage

This section captures the disk space requirements on the primary and replica storage. The first table which captures the primary storage subsystem contains the following details:

a)      Additional space required on the primary storage: Hyper-V Replica tracks the changes to the virtual machine in a log file. The size of the log file is proportional to the workload “churn”. When the log file is being transferred (at the end of a replication interval) from the primary to the replica server, the next set of “writes” to the virtual machine are captured in another log file. This row captures the space required across all the ‘replicating’ VMs

b)      Total churn in 5 minutes: This row captures the workload “churn” (or the writes to the VM) across all the VMs on which replication will be enabled.

The following metrics are reported on the replica storage:

a)      Estimated storage to store the initial copy: Irrespective of the replication configuration around additional points (latest vs storing more than one point), this row, captures the storage required to store the initial copy.

b)      Additional storage required on the replica server when only the latest recovery point is preserved: Over and above the storage required to store the initial copy, when replication is enabled with only the latest point, the tracked changes from the primary server are written to the replica VM directly. Storage (which is equal to the churn seen in a replication interval) is required to store the log file before writing to the replica VM.

c)       Additional storage required per recovery point on the replica server when multiple recovery points are preserved: Over and above the storage required to store the initial copy, each additional recovery point (which is stored as Hyper-V snapshot on the replica server) requires additional space which is captured in this row. This is an estimate based on the total VHD size across all the VMs and the final size is dependent on parameters such as write pattern.

6)      Network

The network parameters are captured in the table. These are:

a)      Estimated WAN bandwidth between the primary and replica site: This is the input provided to the capacity planning tool.

b)      Average network bandwidth required: Based on the workload churn observed during the duration of the run, this row captures the average network bandwidth required to meet Hyper-V Replica’s attempt at sending the tracked changes every 5 minutes. This is a rough estimate as factors (which are not accounted by this tool) such as compression of the payload, latencies in the network pipe etc could impact the results.

c)       MaximumActiveTransfers: In a multi-VM-replication scenario, if the log file for each of the replicating VMs is transferred sequentially, this could starve or delay the transmission of the change log file of some other replicating VM. On the other hand, if the change log files for all the replicating VMs are transferred in parallel, it would affect the transfer time of all the VMs due to network resource contention. In either scenario, the Recovery Point Objective (RPO) of the replicating VMs is affected. An optimal value for the number of parallel transfers is obtained by dividing the available WAN bandwidth by the TCP throughput of your link. The tool calculates the TCP throughput by replicating the temporary VM which is created and makes a recommendation for a registry key which is taken into account by Hyper-V Replica. It is worth noting that the value captures the number of parallel network transfers and *not* the number of VMs which are enabled for replication.
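The division described above (available WAN bandwidth divided by the measured TCP throughput of the link) can be sketched in a few lines of JavaScript. The function name and the sample figures are illustrative, not something the tool exposes:

```javascript
// Sketch: estimate a MaximumActiveTransfers value from the WAN bandwidth
// and the measured TCP throughput of the link, both in Mbps.
function maximumActiveTransfers(wanBandwidthMbps, tcpThroughputMbps) {
    // Divide the available WAN bandwidth by the TCP throughput of the
    // link, rounding down, and keep at least one active transfer.
    return Math.max(1, Math.floor(wanBandwidthMbps / tcpThroughputMbps));
}

console.log(maximumActiveTransfers(100, 20)); // prints 5
```

For example, a 100 Mbps WAN link with a measured TCP throughput of 20 Mbps suggests 5 parallel transfers; on a link slower than a single TCP stream, the value stays at 1.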

A great tool really!