
Hello Connect members,

My name is Eric Harris, and many of you may have seen my name here on a post, in Nintex Hangout videos, or in other Nintex content. I'm excited to be reaching out to you on the community in 2019, and I want to bring you some great news as Nintex starts the new year with a bang.




While many of you sign up as members and visit here to post questions or find answers, there is a group who do more: they help moderate and, most importantly, help set the atmosphere of the community by making other members feel comfortable here. As a representative of Nintex and a former community member, Nintex Champion, and evangelist, I want to say thanks for all that you have done, and happy new year to each of you.


Over the past year, you may have heard about a small project Nintex took on to revamp this community: making enhancements and updates so we can serve you more easily, and migrating to a different platform so we can keep growing together.


Well, I'm pleased to announce that while a year is a long time, doing things right is always our aim, and we are getting really close to launch time. I hope you're as excited as I am about it; if not, get excited, because this is for and about you, and we want your help to make sure the community continues to grow. Because this is your community, I want you to be part of this launch and make it the biggest technical community launch in the history of Nintex, and of communities like it worldwide.


So what does this mean? This is the first of many updates I will be posting as we get ready for the launch, so definitely stay tuned. Also visit here often and do what you do best: post questions and answers right here. And if you have any suggestions, ideas, or complaints, list them below in the comments section and I'll be sure we take note as we finish up the final launch steps.

So I came across a problem with updating repeating section XML and was not able to find any conclusive answer in previous posts or answered questions, so I decided to share it in the hope that it will help someone else.


We are building a purchase order system to be the middleman between the user/requestor, the approver, and SAP. In this process we call budget data from SAP by line item, which works great, but the moment you use Update XML it replaces the values in all entries for the specified XPath item; i.e. the budget value is overwritten for all lines in your repeating section every time you loop through the next line item and check the budget.


In one question I saw a reply where the user was advised to use the Index value from the For Each loop to find which line item to update; this, however, failed. Why? Because my index default value is 0 and not 1, which means the first update is trying to update line 0, which does not exist.



I then created a simple number variable with its default set to 1, plus a Math Operation action to increment the counter by 1 at the end of each loop, and booyah! I could probably have changed the Index default value to 1 and still used that, but as a standard my indexes always have a default value of 0. It is also easier for the next person taking over maintenance of SharePoint and my workflows to figure out why I'm incrementing a counter; they might not realize my Index default is 1, and that could cause problems for them. We want to create solutions that are easy to hand over.


For those who do not know, you need to connect your repeating section to a multiple-line plain text field in your list; this makes working with the XML data in it much simpler.



Once you've done this, you have to write the updated XML (which is stored in a workflow variable of type Multiple Lines of Text) back into the list column that your repeating section is connected to. This must be done inside the For Each loop:
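To make the counter trick concrete, here is a minimal JavaScript sketch of the idea: replace the value of only the Nth matching element instead of every occurrence. The element name Budget and the XML shape are simplified stand-ins, not the real repeating-section schema.

```javascript
// Illustrative sketch: update only the Nth <Budget> element's value,
// instead of overwriting every match the way a plain XPath update does.
function updateNthValue(xml, tag, n, newValue) {
  var count = 0; // 1-based, like the workflow's counter variable
  var pattern = new RegExp('<' + tag + '>[^<]*</' + tag + '>', 'g');
  return xml.replace(pattern, function (match) {
    count += 1;
    return count === n ? '<' + tag + '>' + newValue + '</' + tag + '>' : match;
  });
}

var budgetXml = '<Items><Item><Budget>0</Budget></Item><Item><Budget>0</Budget></Item></Items>';
updateNthValue(budgetXml, 'Budget', 2, '500');
// only the second <Budget> is changed
```

The 1-based counter mirrors the workflow's number variable with a default of 1 incremented at the end of each loop.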


I recently had a request to get the Project Manager name that is linked to a project when the user selects a project number from a drop-down control.


I managed to get the Project Manager's name using this formula: lookup("Project List", "Project_x0020_Number", parseLookup(Project), "Project_x0020_Manager"). The only issue was that I was also getting the ID along with the name, e.g. 234;#Joe Soap.


So, to fix this, I used a regular expression to remove the ID from the front of the Project Manager's name.


Steps I took:

1. I created a Form Variable. The formula I used to get the Project Manager's name is: lookup("Project List", "Project_x0020_Number", parseLookup(Project), "Project_x0020_Manager")


"Project List" - the name of the list that contains the project numbers and Project Manager names.

"Project_x0020_Number" - the internal name of the Project Number column.

"Project" - the name of my drop-down control that displays the list of project numbers.

"Project_x0020_Manager" - the internal name of the column in the "Project List" list that contains the Project Manager names.



2. I added a Calculated Value control to my Form. I used the regular expression in the formula to remove the characters before the #. 

This is the formula : replace(GetPMName, ".*#", "")


"GetPMName" - the name of the form variable I created that holds the Project Manager's name with the ID attached to the front, e.g. 234;#Joe Soap.

".*#", "" - the expression and replacement I used. It removes everything up to and including the # and replaces it with an empty string.



End result : Joe Soap.
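For reference, the same clean-up expressed in plain JavaScript, using the sample value from the post:

```javascript
// Strip the "ID;#" prefix that a SharePoint lookup value carries.
// ".*" is greedy, so everything up to the last '#' is removed.
var raw = '234;#Joe Soap';
var pmName = raw.replace(/.*#/, '');
// pmName === 'Joe Soap'
```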


Hope this helps someone. 

Interested in free Nintex training? Check out our latest intro course made for complete beginners to Nintex. Even if you've never heard of a workflow or automation, you can dive right into this course to gain a fundamental understanding of the Nintex Platform.


By learning the basics of lists, forms, and workflows, you can build a foundation on which to progress your skills through additional courses or other resources. Easy to follow videos and instructions will guide you every step of the way.


Visit  to enroll in this course today. Happy learning, everyone! 





Counting Records

Posted by graham Oct 12, 2018

I wanted to be able to do a Group Count on a set of items in a library (I actually wanted to save the values and counts to a list so that I could attach a chart control – but that’s another story).

There is no way of doing a Group By in the Query List task, so I had to come up with this.

Retrieve all my key values into a collection, then create a list of those unique key values

Iterate through the unique values and remove each one from the original collection. The number of items matching the key value is the number of items removed from the collection, i.e. the difference between the counts before and after the deletion.

In pseudocode, the process is this

QueryList : get all ‘key values’ in range into collection AllKeys

Collection: Remove duplicates from AllKeys into collection UniqueKeys

ForEach string Key in UniqueKeys

    Collection: Count AllKeys into number Total

    Collection: Remove by value Key from AllKeys

    Collection: Count AllKeys into number Count

    Math: Total minus Count into number KeyCount

    {do something with Key and KeyCount}
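The pseudocode above can be sketched in JavaScript to show the counting trick; the key values here are illustrative:

```javascript
// Count items per key by measuring how much the collection shrinks
// when one key's items are removed (before/after count difference).
function groupCount(allKeys) {
  // "Remove duplicates from AllKeys into collection UniqueKeys"
  var uniqueKeys = allKeys.filter(function (k, i) { return allKeys.indexOf(k) === i; });
  var remaining = allKeys.slice();
  var counts = {};
  uniqueKeys.forEach(function (key) {
    var total = remaining.length;                                     // Count AllKeys into Total
    remaining = remaining.filter(function (k) { return k !== key; }); // Remove by value Key
    counts[key] = total - remaining.length;                           // Math: Total minus Count
  });
  return counts;
}

groupCount(['East', 'West', 'East', 'North', 'East']);
// { East: 3, West: 1, North: 1 }
```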


There are many requirements where approvers need to upload attachments on a task form which should be attached to the main item instead of the task item.

I have achieved this using REST and JavaScript.


Major Steps :

1. Create a Rich Text control on the task form to show the attachments and an add-attachment button.

2. Add JavaScript which uses the REST API to add attachments.


Requirement Result Screen :


On click of the Show Attachments button, it loads the attachments of the related item, with an add-attachment link, as shown below.

On Add Attachment it opens the screen below, and we can upload an attachment by selecting a file.



1. Drag a task action onto the workflow. Click Edit Task Form.

2. Add a Rich Text control to the form as shown in the screen below.

3. Edit the properties of the Rich Text control and add the HTML below:

<input id="btnAttach" onclick="return checkSPLoad('');" style="width:180px;" type="button" value="Show Attachments" />

<div id="divAttachs"></div>

<div id="addAttachmentDiv" style="display:none;">Select a file<br />
      <strong>Name </strong>

      <input class="attachmentButton" id="attachmentButton" multiple="multiple" name="attachmentButton" onchange="return attachFile(this);" type="file" />
</div>

<div class="nf-attachmentsLink" id="idAttachmentsLink" onclick="return showAttch();" style="height:17px;display:none;">

      <img  />

        <a class="ms-addnew" href="#">Add Attachment</a>
</div>



4. Click On Save.

5. Select Form Settings in the ribbon to add JavaScript.

6. Paste the script below into the Custom JavaScript section.

var hostweburl = '';
var appweburl = '';
var listname = 'ListName'; // Include list name
var itemid = ID;           // Include reference of ID of item
var file;
var contents;
var itmUrl = '';           // Base URL of the item's attachment folder

// Loads the SharePoint cross-domain libraries, then either uploads the
// selected file (callType 'add') or lists the existing attachments.
function checkSPLoad(callType) {
    hostweburl = decodeURIComponent(getQueryStringParameter("SPHostUrl"));
    appweburl = decodeURIComponent(getQueryStringParameter("SPAppWebUrl"));
    var layoutsPath = "/_layouts/15/";
    var scriptbase = appweburl + layoutsPath;
    if (callType === 'add') {
        NWF$.getScript(scriptbase + "SP.js", function () {
            NWF$.getScript(scriptbase + "SP.RequestExecutor.js", execCrossDomainRequest);
        });
    } else {
        NWF$.getScript(scriptbase + "SP.js", function () {
            NWF$.getScript(scriptbase + "SP.RequestExecutor.js", execCrossDomainGetRequest);
        });
    }
    return false;
}

// POSTs the file contents to the related item's AttachmentFiles collection.
function execCrossDomainRequest() {
    var contents2 = _arrayBufferToBase64(contents);
    var executor = new SP.RequestExecutor(appweburl);
    var digest = NWF$("#__REQUESTDIGEST").val();
    executor.executeAsync({
        url: appweburl + "/_api/SP.AppContextSite(@target)/web/lists/getbytitle('" + listname +
             "')/items(" + itemid + ")/AttachmentFiles/add(FileName='" + file.name +
             "')?@target='" + hostweburl + "'",
        method: "POST",
        body: contents2,
        binaryStringRequestBody: true,
        contentType: "application/json;odata=verbose",
        headers: { "X-RequestDigest": digest, "Accept": "application/json; odata=verbose" },
        success: function (data) {
            execCrossDomainGetRequest(); // refresh the attachment list
            NWF$('#idAttachmentsLink').show();
        },
        error: function (err) {
            var data = JSON.parse(err.body);
            console.error(data.error.message.value);
        }
    });
}

// GETs the item's attachments and renders them as links.
function execCrossDomainGetRequest() {
    var executor = new SP.RequestExecutor(appweburl);
    executor.executeAsync({
        url: appweburl + "/_api/SP.AppContextSite(@target)/web/lists/getbytitle('" + listname +
             "')/items(" + itemid + ")/AttachmentFiles?@target='" + hostweburl + "'",
        method: "GET",
        headers: { "Accept": "application/json; odata=verbose" },
        success: function (data) {
            parseAttachment(data);
            NWF$('#idAttachmentsLink').show();
        },
        error: function (err) {
            var data = JSON.parse(err.body);
            console.error(data.error.message.value);
        }
    });
    NWF$('#btnAttach').show();
}

function getQueryStringParameter(paramToRetrieve) {
    var params = document.URL.split("?")[1].split("&");
    for (var i = 0; i < params.length; i = i + 1) {
        var singleParam = params[i].split("=");
        if (singleParam[0] == paramToRetrieve)
            return singleParam[1];
    }
}

// Shows the file-picker section when "Add Attachment" is clicked.
function showAttch() {
    NWF$('#addAttachmentDiv').show();
    return false;
}

// Builds one link per attachment inside the divAttachs placeholder.
function parseAttachment(vdata) {
    var html = '';
    var data = JSON.parse(vdata.body);
    var items = data.d.results;
    for (var i = 0; i < items.length; i++) {
        html += '<a href="' + itmUrl + '/' + items[i].FileName + '" target="_blank">' + items[i].FileName + '</a><br>';
    }
    NWF$('#divAttachs').html(html);
}

// Reads the selected file into an ArrayBuffer, then starts the upload.
function attachFile(input) {
    var files = input.files;
    if (files.length > 0) {
        file = files[0];
        var reader = new window.FileReader();
        reader.onload = function (e) {
            contents = e.target.result;
            checkSPLoad('add');
        };
        reader.onerror = function (e) {
            console.error("File reading error " + e.target.error);
        };
        reader.readAsArrayBuffer(file);
    }
    return false;
}

// Despite its name, this returns a binary string (not base64), which is
// what binaryStringRequestBody: true expects.
function _arrayBufferToBase64(buffer) {
    var binary = '';
    var bytes = new window.Uint8Array(buffer);
    var len = bytes.byteLength;
    for (var i = 0; i < len; i++) {
        binary += String.fromCharCode(bytes[i]);
    }
    return binary;
}

Note: update the list name and item ID (add the reference from the Item section) at the top of the script.

7. Save and close form.


Publish workflow and test.


Happy Nintexing.

Hello Everybody,


In a calculated field I have the formula userProfileLookup(Current User, "PreferredName"). Is it possible to somehow transfer this value to a Single Line Textbox field?

I have already tried connecting both controls to the same field, and I also tried the solution in Can't populate text control with calculated value,

but it doesn't populate any data into the field.

Is there a different solution for this?

Dear all:

I have been very confused about this topic for months:

  • How do I navigate from a parent form to a child form and come back?
  • How do I navigate from a child form to a new child form while keeping the parent link?
  • How do I validate child form data before coming back, in both cases?

I think I now have a good approach to do it without JavaScript.


How buttons work

First of all, I would like to describe how the buttons work, as I understand them, in terms of validation and redirection:

Save and submit button / Save button:

Both buttons have a redirection parameter. You can build the URL dynamically using list field values and runtime functions. You can't use calculated fields or control values; this is a very important point.

In the case of validation, both buttons can validate data, including list required fields and validation rules in the form.


Save and continue button:

The Save and Continue button has no redirection parameter. After saving data, the page refreshes and you are in the same form again.

In the case of validation, only list required fields are validated, NOT the rules in the form.


Cancel Button

You can redirect as with the Save and Submit button. Data will not be saved and, of course, there is no validation.


Keeping in mind that only the Save & Submit button allows full validation, and the limitations on building the redirection URL (as far as I know), here is the solution:


Parent ID to new child form


To relate parent and child, you need to pass the parent ID when you redirect to the child form. I assume the parent element has been saved previously (this is mandatory for my strategy), so the parent ID exists in the ID field. The ID argument is included at the end of the URL string, something like this:



NewForm.aspx?: calls a new child form

ParentID: the parameter name passed in the query string

ID: the element property, in this case the ID field


I use a (Save and Submit) button called "Nuevo Hijo" ("New Child") to do this in my parent form, using the dynamic URL in the redirection (sorry, it is in Spanish):


(As shown in the picture, the child list is included in the parent form)


I would like to remark that:

  • Data in the parent form will be saved and validated before navigation.
  • The parent ID will be used to come back, and will be recorded in the child list element to make the relationship between parent and child.


Parent ID to an existing child form


In this case, you cannot use a button. Instead, you select "Edit" from the menu of the selected element in the embedded SharePoint list:


You can't pass the parent ID using this method, but don't worry, because you will already have recorded it in the child element.


Working into child form


Storing the Parent ID

The first thing to do in the new child form is to get the parent ID from the URL string, using a calculated field:



  • Formula: you get the parent ID value from the URL string using the fn-GetQueryString function.
  • Connected field: the field in the child list in which the parent ID is stored.
  • Formula calculation ONLY in new mode, because if you are editing the child, the parent ID is already recorded and will not be sent from the parent form again.


Save and Submit Button and return to the parent form


When you finish editing the new child, you probably want to come back to the parent form. To do this, we use a Save and Submit button with a dynamic redirection URL string such as:


fn-if(Is New Mode,Site URL/Lists/MyParent/EditForm.aspx?ID=fn-GetQueryString(ParentId),Site URL/Lists/MyParent/EditForm.aspx?ID=Parent_Id)


First of all, there is a conditional function that determines whether the form is in new mode or not, checking the value of the Is New Mode function.

  • In new mode, you build the URL taking the parent ID from the URL passed by the parent form: fn-GetQueryString(ParentId)
  • When not in new mode, you build the URL taking the parent ID from the Parent_Id field value (which you stored via the calculated field when you were in new mode)

As you are using the Save & Submit button, data validation happens first, then saving, and finally the redirect to the parent form.


Save and Submit Button and open a new child form (keeping the parent ID in the URL string)


If you want to open a new child form, you only need to change the destination URL in the previous formula:


fn-if(Is New Mode,Site URL/Lists/MyChild/NewForm.aspx?ID=fn-GetQueryString(ParentId),Site URL/Lists/MyChild/NewForm.aspx?ID=Parent_Id)


You can repeat this action as many times as you want in order to create multiple child elements, and finally come back to the parent form.


That's all. Any comments are welcome. Of course, this is only one approach; there are surely better solutions.


Have a nice workflow day! 

This works with on-prem Nintex Workflow 2013

Problem: upon archiving a document, I use a workflow to update a date/time field with when the archive happened. When a user wants to un-archive the document, I need to clear the date field so it displays as empty (or null).



  1. Create a variable vDateArchivedDate of type Date and Time with a blank default date
  2. Use a Convert Value action with input "1/1/0001" and store the result in vDateArchivedDate
  3. Use Set Field Value to update the date column with Workflow Data > vDateArchivedDate






LAR, doce LAR

Posted by technunes Jul 9, 2018

If you are wondering what "lar, doce lar" means, it is the exact Portuguese translation of "home, sweet home". Obviously, the context of this article has nothing to do with the pleasure of returning home, but the Lean, Automate and Robotize (LAR) approach can and should be just as sweet, and can also help you avoid some painful experiences in the future.

Before we get deeper into the topic, bear in mind that LAR is the result of my own experiences around automating business processes, and it is not a known term in the community. While I do not have the ambition to push one more acronym into our work routine, I do hope, for the sake of security, efficiency and cost reduction, the approach gets implicitly adopted as a best practice.

There are of course plenty of articles mentioning that processes should be optimized before being automated, but with the hype around Robotic Process Automation (RPA), we need to revisit the subject, as RPA can also pose a risk for your process automation initiative. After all, your competitive edge hinges not on whether you automate, but how you do it.

So, let’s get some definitions straight:

  • Lean (in a service context) is a process optimization methodology that focuses on improving the effectiveness and efficiency of a process by eliminating activities that do not add value to the customers and the product.

  • Automation of business processes is a technology-enabled approach performed to achieve digital transformation, increase service quality, improve service delivery, or contain costs.

  • Robotic Process Automation is computer software, or a "robot", that captures and interprets existing applications for processing a transaction, manipulating data, triggering responses and communicating with other digital systems.

An analogy that I like to use when it comes to business processes is that a business process is like a road through the mountains: due to people, systems and procedures, it is full of curves and ups and downs.

The goal of process optimization is to create a bridge that connects the start to the end of the process in the most efficient way.

In a complex scenario however, this business process can involve ERP systems, legacy systems, office tools and many more. Altogether, this can also offer some change management challenges.

In addition, the business will depend on IT to build this bridge, and what was supposed to be simple can turn out to be a very sophisticated project, leading to delays, huge implementation costs and frustration.

As Mike Fitzmaurice would say: in the best of all possible worlds, we would be able to give the requester and the developers a mind meld, but life is not like Star Trek, and mind melds are not real.

RPA can solve this problem by keeping the process as it is, but letting robots drive the "cars". Robots can, at some level, respond to events just as humans would.

This is all great, but more often than we think, a hybrid approach is the most appropriate. While we all know that integrations with an ERP system can raise the complexity of a project to high levels, and many legacy systems do not offer a proper API that could ease integration, there are parts of the process that can and should be optimized properly. Nintex technologies empower users to achieve just that, avoiding the IT bottleneck.

The RPA approach should not be considered the answer to everything; a more efficient approach is to combine the best of both worlds, for example by replacing the Excel spreadsheet with a proper (mobile-enabled) digital form and avoiding the back-and-forth of emails through a proper workflow around that form.

Lean and automation are closely connected, and they overlap on many levels, because automation often involves optimizing a process in a digital format. The initiative of automating a process, like introducing a digital workflow, will most likely reveal opportunities to enhance the process itself. In the same way, the initiative of enhancing a process will frequently result in some form of automation adoption. The robotizing part is connected with automation because it offers a mechanism for interacting with digital systems automatically, but it does not attempt to substantially change the as-is process. It is also important to note that LAR is part of the continuous improvement process defined in Lean.

The fact that RPA does not attempt to change the as-is process gives the idea that Robotic Process Automation is always easier, faster and more cost-efficient than any other automation approach. This is, however, not always the case.

What if building the proper bridge did not require IT muscle? That's where Nintex technologies come into play.

The scenario below shows the process around receiving an offer, creating a P.O. and receiving an invoice.

  • The requester asks the supplier to send an offer.

  • The supplier sends an offer (email text, PDF, Word, Excel or even PowerPoint).

  • The offer is validated by the requester.

  • The offer is sent to the internal responsible to create the P.O.

  • The P.O. is created in SAP and sent to the supplier.

  • The supplier then sends the invoice.

This could conceivably be addressed by EDI (electronic data interchange), B2B systems or API calls, but it is difficult to expect that all business partners, of different sizes, regions and segments, could easily adopt such an approach. Robotic Process Automation would then be a good candidate to automate this process.

However, the "read document / extract data" part of the process can be quite challenging, because RPA tools do not do magic. PDF is a business-standard document format, and it would be fair to demand that offers be sent in this format. With that in mind, a robot can be configured to read the offer from the offer repository and, using some intelligent OCR and text analysis, figure out who the supplier is and the total amount of the offer, basically transforming all unstructured or semi-structured data into structured data. The problem here is that the PDF containing the offer can be formatted in many different ways, and it is unlikely that the RPA tool would be able to extract the data reliably, unless it uses a third-party product like Abbyy to achieve that, which would raise the complexity and the operational cost.

So, let’s revisit the process and apply LAR into it.

At the very start of the process, there is an exchange of emails between the requester and the supplier in order to get the offer. The supplier sends an email with the offer attached. The offer document is in a semi-structured format and, as we have seen earlier, that implies some challenges for our automation with the RPA tool.

A way to optimize the process would be to eliminate the email exchange and establish a way to collect the structured data.

Using Nintex Workflow Cloud, a business user can easily create an online responsive form to handle the offer submission and the workflow to handle the approval.

The offer form can be divided into sections. While the Supplier and Offer sections collect the master data for the offer document, the Info and Signature sections support the overall process.

The workflow engine consumes this request and handles the approval of the offer. Additionally, it is possible to connect to cognitive services for text analysis and data classification.


In our example, we capture the sentiment of the comment provided in the Info section. If the comment is something positive, like "20% discount is already applied", the process continues along one path of the workflow. If a negative message is provided, like "the offer is only valid until the end of this week", the process follows another path.

Using Nintex DocGen, a well-formatted document is generated. This document can be the trigger for the RPA robot to start the P.O. process. Alternatively, the robot can be called directly from the workflow, as the orchestrator exposes a web API.

The overall architecture is represented in the image below:



A solution like the one described in this article is achieved with a no-code approach and can be delivered by business users in a matter of hours. RPA is an excellent technology but should not be considered the answer to every automation initiative.

LAR supports the creation of the proper bridges where feasible, and lets RPA do what RPA is good at: taking the robot out of people.

Even a short workflow can generate quite a few variables. When you assign a variable in a task, the names are presented in alphabetical order, but the variables list itself displays them in the order in which they were created. This makes it difficult to find a particular one, for example to delete it while refactoring, or just to come up with a name for a new variable.

It was such an exercise that prompted me to write the attached script, which sorts the variables into order, either by name or by type and name.

After you export the workflow to the filesystem, the script edits the workflow and always writes it to Sorted.NWF, so the original file is preserved should you need to revert to it.

Trying to use pure XML defeated me, so I used the brute-force method of extracting the WorkflowVariables section, converting it to an XML object, sorting it, and pasting it back into the workflow file.


Assume the script is in the same location as your downloaded .NWF file

.\Sort-WflowVariables.ps1 .\MyWorkflow.nwf

Will sort the variables in name order


.\Sort-WflowVariables.ps1 .\MyWorkflow.nwf -ByType

Will sort the variables first into their type and then into order of name

This can then be imported back into the designer where you will see the variables listed in the order requested.

As ever with scripts from the internet, please review its contents to ensure that you are happy to run it in your environment.


Ever wondered how to display a SQL table inside your Nintex form? Indeed, there is the "SQL Request" control, but it only allows you to show data from a database as a drop-down, a list of options, etc., and always just a single column.

However, there is an easy solution for that. The approach I am using relies on the "FOR XML" clause in a SELECT statement, available in SQL Server since version 2008. It returns the data from a query in XML format, concatenated into a single row and a single column: a perfect format to parse!

Step by step

1. First prepare your SELECT query. Mine is for example:

SELECT TOP(1) (SELECT name, lastname, email, role
FROM users
RIGHT JOIN roles ON roles.Id = users.roleId
ORDER BY lastname ASC
FOR XML PATH('')) as datatable FROM users

With such a statement I am sure that I will receive just a single row and column, containing the data in a proper XML format. Each row is built using the following structure:

<name>value</name><lastname>value</lastname><email>value</email><role>value</role><name>value</name><lastname>value</lastname><email>value</email><role>value</role>...

Put the query in the “SQL Request” control inside your form:

SQL Request action Nintex Forms

Set the control not to be visible; it is not going to be used directly.

2. Now add a “Calculated Value” control. It will be used to get the output from the “SQL Request” and parse it into a valid table. I am using the following formula to achieve it:

'<table class="dataTable"><thead><tr><th>Name</th><th>Lastname</th><th>Email</th><th>Role</th></tr></thead><tbody>'+replace(replace(replace(replace(SQL REQUEST CONTROL NAME, '</role><name>', '</td></tr><tr><td>'),'\<\/(?!td|tr)[a-zA-Z]+\>\<(?!td|tr)[a-zA-Z]+\>','</td><td>'), '</role>', '</td></tr>'), '<name>', '<tr><td>')+'</tbody></table>'

It simply creates a ready-to-use HTML table, replacing the closing and opening XML tags with closing and opening table tags to mark the start and end of each row and cell.
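As a rough JavaScript illustration of what that formula does (column names follow the example query), the same chain of replacements turns the FOR XML output into table rows:

```javascript
// Sketch of the calculated-value formula: convert the concatenated
// XML row data into HTML <tr><td> markup.
function xmlToRows(xml) {
  return '<tbody>' + xml
    .replace(/<\/role><name>/g, '</td></tr><tr><td>')                    // row boundary
    .replace(/<\/(?!td|tr)[a-zA-Z]+><(?!td|tr)[a-zA-Z]+>/g, '</td><td>') // cell boundaries
    .replace(/<\/role>/g, '</td></tr>')                                  // close the last row
    .replace(/<name>/g, '<tr><td>')                                      // open each row
    + '</tbody>';
}
```

The negative lookaheads keep the already-inserted td/tr tags from being matched by the generic tag-pair replacement.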

3. Next define a CSS styles for your table. I used the following page to create a set of CSS: 


The table is ready to be shown:

HTML table out of SQL table

I hope you find this useful.


In my job I do lots of bits and pieces: tweaks, enhancements, etc. to SharePoint solutions I've built using Nintex forms and workflows, among other things. I thought to myself that I should record the things I do in a central place, rather than letting them get lost in various emails, tickets, thoughts and phone calls! I started with a simple Excel spreadsheet but then thought: why not create a SharePoint list and a basic form that I can quickly add to?





Why would I use it? To log changes in case something I do breaks later down the line, for weekly meetings, to show my manager what I'm doing and have done, and perhaps for my appraisal. It also evolved a bit from my original idea. I added a status field allowing me to tag items as in progress, which I can complete or add comments to at a later date. The workflow updates the item with comments and a closing date. It has also been shared with everyone, and a simple page created to show to-do and completed items.



For reporting I've connected it to Excel, my favourite simple and quick reporting method for all things SharePoint.



So this is a really simple solution using a basic list with a Nintex form and workflow attached. The buttons on the form control what the workflow does.

List, Form and Workflow attached...





For a long time, we have been plagued by times in InfoPath forms jumping forward by an hour during daylight saving time after amending the metadata fields.

Investigation shows that the datetime value written back to the form contains a modifier telling SharePoint that the time is recorded as being GMT (and thus an hour needs to be added to make it ‘correct’ again).

Date field in InfoPath form


The solution was to use Nintex to create a workflow that rewrites the date fields after a change, removing the 'Z' character.

The workflow can either run on any change, or be fine-tuned to run only when one of the affected fields changes value.

The action is to:

  • Retrieve the field value from the XML document
  • Test whether it contains a 'Z'
    • If it does, write it back without the 'Z'

As it runs immediately after a change, it is worth starting the workflow with a ‘Commit pending changes’

The first step is to read the XML document associated with the current item, so insert a Query XML action. Ensure the source is the Current Item, and return the result as text to a temporary string variable.

In the Output section, set 'Process using' to XPath, click the XPath Builder button, and navigate through your form's XML to the field.

When you reach it, select it and click the Apply button and something like this will be returned


Now test strTest in a Run If

Within the Run If, add an Update XML task

You can copy/paste the field reference from the Query XML task

You then replace the node content with the item value for the field, replacing the ‘Z’ with a blank

Note that the FormatDate function returns the date/time in the same format as that stored within the xml.

Now repeat this for each affected field.
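What the Run If / Update XML pair accomplishes can be sketched in JavaScript; the sample timestamp below is illustrative:

```javascript
// Drop the trailing UTC marker so SharePoint stops adding an hour
// to the stored time during daylight saving.
function stripUtcMarker(value) {
  return value.charAt(value.length - 1) === 'Z' // Test if it contains a 'Z'
    ? value.slice(0, -1)                        // write it back without the 'Z'
    : value;                                    // leave other values alone
}

stripUtcMarker('2018-10-28T02:30:00Z');
// '2018-10-28T02:30:00'
```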
