NAVBaaS Git – Extension

Original URL…

NAVBaaS Git

NAVBaaS Git for Dynamics NAV & Dynamics 365 Business Central

NAVBaaS-Git for Microsoft Dynamics NAV

This extension provides the integration between your dockerized C/SIDE development environment and git.
The commands will take care of keeping your C/SIDE development environment in sync with your git repository so that you can focus on what’s really important: developing!

If you’re having trouble with the NAVBaaS.Git extension, please follow these instructions to file an issue on our GitHub repository:

Make sure to fill in as much information as possible and try to provide a reproduction scenario.

The extension has the following prerequisites (a quick verification snippet follows the list):

  • Git for Windows (download link).
  • Windows or Windows Server with Docker for Windows installed (follow these instructions).
  • Your Dynamics NAV solution must be based on NAV 2016 or later.
  • PowerShell Module: Navcontainerhelper (will be installed and managed by the extension)
  • PowerShell Module: SqlServer (will be installed and managed by the extension)
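
If you want to verify these prerequisites by hand, a quick PowerShell check such as the one below works; it is only a sketch – the two modules are installed and kept up to date by the extension itself:

# Check that Git and Docker are available on this machine.
git --version
docker version

# Check whether the PowerShell modules are already present (the extension manages them for you).
Get-Module -ListAvailable -Name navcontainerhelper, SqlServer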

This extension will add a number of settings to your Visual Studio Code User Settings, all prefixed with NAVBaaS.
When following the normal flows as described below there’s no need to adjust these settings manually.

  1. Install the extension and press reload so that the extension can be used.
  2. Run Visual Studio Code with administrator privileges.
  3. Execute the NAVBaaS: Go! command and follow the steps.
  4. Move your split objects (text files) to the modified folder. You must use the default file naming conventions, for example: TAB1, COD1, PAG1.
  5. Stage and commit your files.
  6. Execute the NAVBaaS: Create Container command to create your dockerized C/SIDE development environment.
  7. Execute the NAVBaaS: Sync. command to initially import all the objects from your git repository.

If you already have your solution in git, please be aware that by default, the extension expects the following two folders in the root:

  • modified
  • original

You can change this by adjusting the following user settings:

  • NAVBaaS.ModifiedFolder
  • NAVBaaS.OriginalFolder

Below you can find a list of all the commands together with a brief explanation.

NAVBaaS: Go!
Command used to initially set up the extension on your machine. It performs the following actions:

  • Select your git repository.
  • Initialize the folder structure in your git repository.
  • Select your Dynamics NAV development license.
  • Install the required PowerShell modules.

NAVBaaS: Open Git Repository Folder
Opens your configured git repository folder.

NAVBaaS: Create Container
Command used to create your C/SIDE development environment in a Docker container. You will be prompted to enter the following information:

  • Container Name.
  • Docker Image Name on which your solution is based, for example microsoft/dynamics-nav:2018-cu5-nl.
  • Authentication Type: either NavUserPassword (recommended) or Windows.
  • Which database to use: Cronus or your own .bak file.

*Only when a container is created successfully will it be stored in your user settings under NAVBaaS.Containers.
*Only containers created through this command can be used with the Sync. command.
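
Under the hood the extension relies on the navcontainerhelper module. For reference, a manually created C/SIDE development container would look roughly like the sketch below – the image name, credentials and license path are examples, not what the extension actually passes:

# Sketch: roughly what a C/SIDE dev container creation looks like with navcontainerhelper.
$credential = Get-Credential   # NavUserPassword credentials for the container
New-NavContainer -accept_eula `
    -containerName "mysolution-dev" `
    -imageName "microsoft/dynamics-nav:2018-cu5-nl" `
    -auth NavUserPassword `
    -Credential $credential `
    -licenseFile "C:\temp\dev.flf" `
    -includeCSide `
    -updateHosts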

NAVBaaS: Remove Container
Command to remove your container when you’re done with it.
Please always remove your containers through this command so that related settings, folders, etc. can also be cleaned up.

NAVBaaS: Sync.
Command used to keep your git repository and Dynamics NAV development container in sync.
The command handles the following tasks in one go:

  • Determine if there are any conflicts. If that’s the case, the synchronization stops – conflicts must be handled manually with a compare tool.
  • Import changed objects from your git repository to your NAV dev. container.
  • Export and commit changed objects (based on modified flag) from your NAV dev. container to your git repository.
  • Remove objects which are deleted from your git repository from your NAV dev. container.
  • Compile uncompiled objects.
  • Synchronize schema changes.

Please note that when modified objects are exported from your NAV dev. container, a fixed DateTime property will be set on the object.
This is done to prevent conflicts when multiple developers are working on the same file(s).

NAVBaaS: Sync. Preview
Command used to preview the synchronization of the container without performing the import, export or delete.

NAVBaaS: Sync. Schema Changes
Command used to perform the schema synchronization inside the container. This can be handy when you make changes to the schema through C/SIDE and saving the table with validation does not work.

NAVBaaS: Compile Objects
Can be used to compile objects inside the container.
This can be handy when you cannot compile an object through C/SIDE because of server side dll dependencies.

NAVBaaS: Compare
Can be used to make object comparisons; the following scenarios are supported:

  • Compare an object from the original folder with an object from the modified folder.
  • Compare an object from the modified folder with an object from your database (inside the container).

If you need functionality before or after creating the container, you can use the following two settings in your user settings.

OnBeforeCreateContainerScriptPath – will be executed before the container is created; the following variables can be used:

  • $AdditionalParameters – will be passed to the New-NavContainer command (navcontainerhelper module).
  • $IncludeTestToolkit – true by default.
  • $DoNotExportObjectsToText – true by default.
  • $EnableSymbolLoading – true by default.

OnAfterCreateContainerScriptPath – will be executed after the container is created, the following variables can be used:

  • $ContainerName – name of the created container.
  • $DockerImageName – name of the Docker image used.
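
For example, an OnAfterCreateContainerScriptPath script could import an extra object file into the freshly created container. This is only a sketch: it assumes navcontainerhelper’s Import-ObjectsToNavContainer is available, and the .fob path is made up.

# Example OnAfterCreateContainer script (sketch). $ContainerName and $DockerImageName are provided by the extension.
Write-Host "Container $ContainerName created from image $DockerImageName"

# Hypothetical: import an additional object file into the new container.
Import-ObjectsToNavContainer -containerName $ContainerName -objectsFile "C:\temp\Helpers.fob"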

Extension: Show invoice/cr. memo from shipments and receipts

Original URL…

This extension helps you quickly open the related invoice or credit memo with a new action button from the following pages:

  • Posted Sales Shipment
  • Posted Sales Return Receipt
  • Posted Purchase Receipt
  • Posted Purchase Return Shipment
  • Item Ledger Entries

The invoice/credit memo is retrieved through the value entry linked to the item ledger entry. If multiple invoices/credit memos are linked, a filtered list of value entries will be shown instead (in this case you should use the standard line function/item invoice button).

This extension is translated into en-US and fr-FR.

Download the extension app (not signed): NavCraft_Accounting Doc. From Item Movment_1.0.0.0.app.zip

Download the source code: Accounting Doc. From Item Movment.zip

Developing Business Central Extensions/Apps in Team

Original URL…

I picked up a new challenge these days: for one of our (quite big) customers, we need to develop a solution based on extensions. In short: ready for the future, easy to upgrade. If I explained the case in one paragraph, you’d say “this is not NAV”, although if you really looked deep into it, it’s ideal for an extension scenario, obviously in combination with Business Central.

The project

In short, let’s define the project like this:

  • Multiple extensions to be made:
    • A W1 base extension with shared functionality and frameworks
    • For each country (like Belgium) a separate extension, specifically for local functionality
  • The timespan is 2 years
  • We are working with 5 developers at the same time on this project, spread over 3 countries. Good developers, which means not really infrastructural-minded.
  • Test Driven development in an agile approach

So, one project, multiple extensions, multiple developers spread over multiple countries (not time zones, luckily), dependencies, different symbols (W1 and BE), … .

Why this blogpost?

Well, we came to the conclusion that “developing an extension” is one thing. One app, usually one developer, some local development environment – it’s all very manageable. But when you try to do development in team, it’s a whole different story. And we have faced (and are still facing) quite some points that need attention. You will see this is a train of thoughts – not a complete list – and I don’t have answers to all the points. More than enough reasons to write some follow-up posts in the future ;-).

So why this post? Just to share my experience in this story, and maybe also to get feedback from this great community we have ;-). Any feedback is always appreciated – the community is here to make everyone and the product better!

The focus of this post is not on “how to develop in team”, but rather “what I think we need to take into consideration when you do”.

CI/CD

If you have never heard about “Continuous Integration”, “Continuous Delivery” and/or “Continuous Deployment” – now it’s time to dive into it :-). It’s a mature way of handling software these days – and very new for our precious NAV world. Let there be no doubt: I too realize there is no way to work in team without a decent implementation of CI/CD, so we have set up a system that automates quite a lot.

  • Git with a remote in VSTS takes care of the “continuous integration” of the code of team members.
  • A branch for each “unit of work”.
  • We have a master-branch policy, which means we work with pull requests to pull our team members’ changes into the master branch through build definitions – which also forces us to review the code.
  • These build definitions do quite a lot (analyze, compile, test-deploy, manage the manifest, …) – see the sketch after this list.
  • We have release pipelines to get our stuff to test environments (continuous delivery/deployment).
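
To give an idea of what the compile part of such a build step can look like: below is a hedged sketch using navcontainerhelper, where the container name, credentials and folder variables are placeholders – our real build definitions do quite a bit more than this.

# Sketch: compile the app inside a build container (placeholder names and VSTS variables).
$credential = New-Object pscredential "admin", (ConvertTo-SecureString "P@ssw0rd" -AsPlainText -Force)

Compile-AppInNavContainer -containerName "build" `
    -credential $credential `
    -appProjectFolder "$(Build.SourcesDirectory)\app" `
    -appOutputFolder "$(Build.ArtifactStagingDirectory)" `
    -appSymbolsFolder "$(Build.SourcesDirectory)\app\.alpackages"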

And a lot more – part of which I will address in the upcoming points. Again, this is not about CI/CD, but you’ll see that the concept solves a lot of points, but also introduces some other challenges.

Object Numbers

A challenge you’ll face quite soon is how to manage object numbers. As a team, each member has his own branch, and within that branch he will start creating tables, codeunits, … . But if you just use that awesome “auto numbering” that comes out of the box with the AL language, you’ll end up with the same object numbers when you start merging branches – and the resulting compile errors will cause many builds to fail!

You can’t just change the app.json to influence the auto numbering, because then things will not compile for a while.

So, in a way, you need to abandon the nice autonumbering, and go into assigning object numbers to people/branches/workitems.

We created a dashboard app to manage all the things that (we think) can’t be managed by VSTS – which also includes object numbers and field numbers.

“I need that field you are creating – can you save your object?”

I was a big fan of a central development database. Besides the fact that you couldn’t develop in one object with multiple developers at the same time, it mostly had advantages. So easy. At all times, you had an overview of all the developments everyone was creating.

Well, I guess I need to grow up now, because we are no longer developing in a database; we are developing in files, distributed over local systems/branches, integrated by VSTS.

So, what if you need a field (or any piece of development, for that matter) that was created in another branch but not pushed to master just yet? Well, there are two ways: you start merging the two branches (which I wouldn’t do in a million years), or you pull the change into master, so the other branch can merge with master and continue development with those needed fields.

Does the field already exist?

An extra challenge, quite similar to the above, is that you don’t have an actual overview of all the fields that are being created – basically a list of all fields, for all tables, across all branches in development.

As said, we are being agile, so for functional and technical people it’s very nice to be able to check which fields have already been created that you can use in the next sprint. You can’t just open a table and see what fields are there – there might be some in extension objects that aren’t even pushed to master yet.

We will create dedicated functionality to “push” our fields from all branches to our dashboard, so that we always have a nice up-to-date list to filter and analyse the fields we have at hand ;-).

Breaking schema changes during development phase

You’d say that during development, schema is not important: if you want to delete a field, just delete it. In the world of extensions these days, that means you will delete all data that comes with your extension. Any breaking change requires the schema to be recreated. And I can tell you, you’ll end up with breaking changes quite soon:

  • Renumber a field
  • Rename a field
  • Delete a field
  • Change datatype

For a development database, I don’t care, but for a test-database, for User Acceptance Testing, or even just functionality testing, it can be devastating to lose the data.

We realized quite soon in our development cycle that data becomes important .. and we are not “just” in the development phase anymore. When the app ends up in the test system (during the release pipeline), it should be a matter of upgrading, not a matter of “delete data and recreate the schema”.

So the only thing I think we can do is to handle the app as being a release/live app from the very first moment users start testing/data is important. That means: the schema shouldn’t change anymore, and if you have to change schema, it’s a matter of using obsolete fields and creating upgrade scripts – just like you would do in live!

Well, we will be “in development” for about a full year, and during that year, people need to test, users need to perform UATs, … and basically this means that if we make wrong assumptions in the analysis of our design and architecture (and in an agile world, that’s not uncommon) – we might end up with lots of obsolete tables and/or fields.

As you might have noticed – I don’t really have a comfortable way of handling this just yet… working on it!

Dependencies

Dependencies are a nice capability we have with extensions. But I can tell you, when you are developing the base app and a dependent app at the same time, in team, it does bring some challenges as well.

In a way, we are all dependent on different symbols – as we all want to run our app in multiple countries. In my view, it’s a good idea to include in your build process a step that tests whether your app deploys on these countries as well. That’s why our build looks something like this (a deployment sketch follows the list):

  • Compile
  • Run code analysis
  • Create app
  • Deploy on W1
  • Deploy on BE
  • Deploy on CountryX
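
A hedged sketch of the “deploy on each country” part: it assumes one running container per country version and a placeholder path for the freshly built app – the point is simply that the same .app file gets published against every localization.

# Sketch: publish the built app to one container per country to catch localization conflicts early.
$appFile = ".\output\OurBaseApp.app"   # placeholder path to the freshly built app

foreach ($country in @("w1", "be", "countryx")) {
    Publish-NavContainerApp -containerName "build-$country" `
        -appFile $appFile `
        -skipVerification `
        -sync `
        -install
}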

Thanks to this, we already caught an error in development where we added a field that already existed in the BE database: it compiled fine against W1, but it didn’t against BE.

On top of that, you might create two apps where one is dependent on the other. In that case, you also need your build process to test that dependency at all times. A simple compile of the dependent app is easy, but when you change the base app, you should also check whether the dependent app still compiles. In our scenario, a change to the base app results in a build of all apps that depend on it.

Distributed development environment

With C/SIDE, lots of partners implemented a “centralized development environment”, which is quite unorthodox, but C/SIDE allowed it, and it was super easy. At all times, the developments were in one database: one overview, one “thing” to maintain.

With AL, we are “forced” to do it the right way. This is positive, for sure, but it’s different. Now, code has to be “continuously integrated” (which is indeed the “CI” part) – which means merged. You don’t “just” check out (and reserve) an object; you commit and merge, work with branches, … . All good, but different.

We use Git on VSTS and work with pull requests to pull new code into the master branch, which means we have introduced code review along with this. Good stuff!

Docker

But this distributed environment also brings some challenges, which Docker can solve. Everyone needs to work on his own isolated development environment – you can’t be deploying your apps to the same NST as your colleague.

Docker solves this – it’s easy to create a local container so you can start development – but it also comes with the assumption that people are able to work with Docker.

We have seen this is a difficult one – lots of developers (including me) are not infrastructural-minded. And in that case, “docker” becomes a lot more difficult.

We decided to go for a managed approach: if a developer creates a branch, we will spin up a new environment for him to do his deployments. The thing is, we wanted to make this as easy as possible, with the possibility to work remotely, with only one entry point, like project.ifacto.be/branchname. With Docker, that gets more difficult, as we can no longer just depend on a server instance name (which in Docker is always “NAV”); we need to do some routing for every container we spin up. It’s not just 1 IP with multiple services, but always a different IP with 1 service.
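
To illustrate the routing problem: every container gets its own IP address while the server instance name stays “NAV”, so the reverse proxy rule for project.ifacto.be/branchname has to be pointed at that IP. A small sketch with navcontainerhelper – the branch/container name is just an example:

# Sketch: look up the IP of a branch container so the reverse proxy rule can point at it.
$branch = "feature-objectnumbers"   # example branch/container name
$ip = Get-NavContainerIpAddress -containerName $branch
Write-Host "Route project.ifacto.be/$branch to http://$ip/NAV/"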

The alternative is that every developer manages his own Docker container on his own VM on his own laptop. All of a sudden, we’d have to support the local development environments of developers on their laptops – I don’t see that as feasible, actually.. . At least not yet ;-).

Tests

With tests, we have one big “challenge”: it’s not possible to run the complete default test suite just yet. So for the moment, in our setup, running a complete test is not implemented.

But that doesn’t mean we can’t implement our own tests – and that’s something we do. We also execute them with every single build, triggered by the pull requests.

On top of that, we should test all the dependencies as well – if you change something in the base app, it’s obvious that a dependent app might have failing tests. That’s another reason to always rebuild the dependent apps as well. Keep that in mind ;-).
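
For reference, current versions of navcontainerhelper can run test codeunits inside a container and write the results to an XUnit file that the build can publish – a sketch, assuming such a version is available; the names below are placeholders:

# Sketch: run our own test suite inside the build container and export the results for the build.
$credential = Get-Credential   # container credentials
Run-TestsInNavContainer -containerName "build" `
    -credential $credential `
    -testSuite "DEFAULT" `
    -XUnitResultFileName "C:\ProgramData\NavContainerHelper\TestResults.xml"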

Translations

As you probably know, the way we do translations has grown up as well. Developers don’t need to be linguists anymore ;-). Translation is done through .xlf files, “outside” the code.

But this also means we need to manage that in our process.

All team members will be creating their own branched .xlf file – which will conflict every single time you try to merge branches. So the best thing is to handle translations completely outside the build scope. At this point, I put the .xlf files in .gitignore. What I haven’t done yet is implement a workflow for handling translations, because we don’t need it yet.

That’s it for now .. I hope this post at least opened some eyes, confirmed some concerns, or even helped you solve some of these points … . In any case, Belgium just lost their semi-final at the World Cup – so I’m signing out and going to be grumpy now ..

How to get the new Dynamics 365 Business Central and extend it

Original URL…

What is it?

Dynamics 365 Business Central is the new and, I hope, the last official name of Dynamics NAV.

No more Dynamics 365 Finance and Operations Business Edition, no more Dynamics 365 “Tenerife”, and even no more Dynamics NAV (well, basically there still is, until Q4 2018, when D365BC on-premise will be released)!

The official announcement was made 4 days ago and can be found here.

Two days later, at DirectionsASIA, we got much more information about the product from the Microsoft team.

The official hashtag is #MSDyn365BC, and if you search for it on Twitter you will find a huge amount of images and info about the new web client, the new roadmap, new possibilities and so on.

But this blog is not about the What’s New stuff. It is about how to get it.

Step 1. Register on collaborate portal

If you are already registered – skip this step. If not, go to https://aka.ms/collaborate and register.

The registration process is very nicely described here: https://docs.microsoft.com/en-us/collaborate/registration

Just go step by step, and you should be able to see this:

 

Step 2. Register on Ready to Go program (Updated)

After publishing the first version of this blog, I got many questions about why D365BC was not visible on the Collaborate portal – because this step was missing. Sorry. Updated.

If you are already registered – skip this step. If not, you should sign up for the Ready to Go program: http://aka.ms/ReadyToGo

To do so, after step 1, please write an e-mail to Dyn365BEP@microsoft.com

When contacting, please provide the following information:

Publisher display name: chosen during registration; it should be the same for all users.

  Name    Email    Role
  User 1  Email 1  Power user (can access content and add new users to engagements)
  User 2  Email 2  Participant (can access content)
  Etc.    Etc.     Power user

The registration process should take 1-2 business days.

After successful registration, you will be able to see this

 

YES! This is the pre-release version of the NEW Dynamics 365 Business Central.

If you click on it you will find something interesting, guess what?

.Zip, DVD?  No =)

You will find a 2 KB txt file with instructions on … how to get it via Docker.

Download it.

Advantages of Ready to Go program

The Ready to Go program is not only about the possibility to download and play with pre-release versions of Dynamics 365 Business Central.

The idea is to prepare every partner for the new SaaS world. It consists of training materials on http://aka.ms/ReadyToGoOnlineLearning and also, potentially, coaching through the Development Centers.

BTW, if you want to prepare yourself for the new, modern SaaS world, you can also contact me for individual workshops and coaching.

Step 3. Create new Docker container with Dynamics 365 Business Central

But first, install Docker.

If you don’t have Docker, download and install it. You can choose to download the full Docker client or only a module.

You can download and install Docker as a module by executing this code in PowerShell:

# Download the Docker engine zip.
Invoke-WebRequest -UseBasicParsing -OutFile docker-17.09.0-ce.zip https://download.docker.com/win/static/stable/x86_64/docker-17.09.0-ce.zip

# Extract the archive.
Expand-Archive docker-17.09.0-ce.zip -DestinationPath $Env:ProgramFiles

# Clean up the zip file.
Remove-Item -Force docker-17.09.0-ce.zip

# Install Docker. This requires rebooting.
$null = Install-WindowsFeature containers

# Add Docker to the path for the current session.
$env:path += ";$env:ProgramFiles\docker"

# Optionally, modify PATH to persist across sessions.
$newPath = "$env:ProgramFiles\docker;" +
[Environment]::GetEnvironmentVariable("PATH",
[EnvironmentVariableTarget]::Machine)

[Environment]::SetEnvironmentVariable("PATH", $newPath,
[EnvironmentVariableTarget]::Machine)

# Register the Docker daemon as a service.
dockerd --register-service

# Start the Docker service.
Start-Service docker

#Next steps are optional!

#Run test container, to check that Docker is alive
docker container run hello-world:nanoserver

# Check which containers you have
docker ps

Next, install Navcontainerhelper

I personally love what Freddy has done for us, so I will use navcontainerhelper to simplify my work.

Run the following script in PowerShell:

Install-Module -Name navcontainerhelper -Verbose

If you are running it on Windows 10, check that your execution policy allows you to install new modules.

If it is restricted, change it:
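
Checking and, if needed, relaxing the policy can be done like this:

# Check the current execution policy.
Get-ExecutionPolicy

# If it is Restricted, allow remote-signed scripts for the current user.
Set-ExecutionPolicy RemoteSigned -Scope CurrentUser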

Create new D365 Business Central container

1)    Log in to the Azure container registry to be able to pull (download) the D365BC image (also in PowerShell).

docker login "navinsider.azurecr.io" -u 
"insert-user-id-here-from-Build-21063.txt-file" -p " 
insert-password-here-from-Build-21063.txt-file "

2)    Choose which version of Business Central you want.

Currently 14 country versions and W1 are available:

dynamics-nav:11.0.21063.0
dynamics-nav:11.0.21063.0-finat
dynamics-nav:11.0.21063.0-finbe
dynamics-nav:11.0.21063.0-finca
dynamics-nav:11.0.21063.0-finch
dynamics-nav:11.0.21063.0-finde
dynamics-nav:11.0.21063.0-findk
dynamics-nav:11.0.21063.0-fines
dynamics-nav:11.0.21063.0-finfi
dynamics-nav:11.0.21063.0-finfr
dynamics-nav:11.0.21063.0-fingb
dynamics-nav:11.0.21063.0-finit
dynamics-nav:11.0.21063.0-finnl
dynamics-nav:11.0.21063.0-finse
dynamics-nav:11.0.21063.0-finus

3)    Create the new container

I will use the W1 version, so I run the following command in PowerShell:

New-NavContainer -accept_eula -alwaysPull -imageName "navinsider.azurecr.io/dynamics-nav:11.0.21063.0" -containerName D365BC-W1

and after a while – about 5 minutes of pulling (depending on your internet speed) and 2 minutes of initialization – we have it!

This wonderful, modern look and feel web UI

Hmm…. Not really what I expected =)

Let’s try an old trick =) We will add ?aid=fin to the end of our URL.

So the URL will be http://d365bc-w1/nav/?aid=fin

 

Much better now!

By the way, if you create a container from the US image (dynamics-nav:11.0.21063.0-finus), you will have the new web UI by default.

 

Step 4. Extend it!

Install Visual Studio Code AL extension

First, copy the .vsix file to your host.

To do so, copy the link to the .vsix file from the container creation log.

Open it in a browser.

Click Save.
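
If you prefer PowerShell over the browser, you can also download the file directly. The URL below is only a placeholder – use the exact .vsix link printed in your own container creation log:

# Download the AL extension (.vsix) from the container; replace the URL with the one from your log.
Invoke-WebRequest -UseBasicParsing -Uri "http://d365bc-w1:8080/al-from-your-container-log.vsix" -OutFile "C:\temp\al.vsix"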

Open Visual Studio Code. Go to Extensions -> … -> Install from VSIX.

Create new AL project

As usual, press Ctrl + Shift + P -> AL: Go!

Change the server in launch.json (take it from the container creation log) and the authentication to Windows.

Press Ctrl + Shift + P -> AL: Download Symbols

Then we will create some new code.

We will add one more insight to the Role Center. This is a really wow feature of the new UI.

After publishing (Ctrl+F5) we will see the new insight!

And we’re done!

I encourage you to take all advantages of Collaborate and start exploring Dynamics 365 Business Central right now!

Quickly publish an extension from NAV 2018 RTC

Original URL…

If you already work with NAV 2018 extensions, you know you have to go through a PowerShell command to import a new .app into NAV: How to: Publish and Install an Extension

I developed a small function to allow you to do this quickly through the RTC, on the Extension Management page.

A new “Import” button asks for an app file with a file browser, then executes the PowerShell function with admin rights to publish the app. The “Remove” button can be used as the Unpublish-NAVApp function for the extension selected on the page.

Page 2500 Extention Management.zip

Import - OnAction()
TxtLAppFile := CduLFileMgt.OpenFileDialog('NAV Extension','*.app','');
IF TxtLAppFile = '' THEN
  EXIT;
TxtLPowerShell := 'Import-Module ''' + APPLICATIONPATH + 'NavAdminTool.ps1'';';
TxtLPowerShell += 'Publish-NavApp -ServerInstance ' + GetInstanceService + ' -SkipVerification -Path ''' + TxtLAppFile + ''';';
CREATE(Shell, TRUE, TRUE);
Shell.Run('powershell.exe "Start-Process powershell \"-NoProfile -ExecutionPolicy Bypass -Command ' + TxtLPowerShell + '\" -Verb RunAs;"');
CLEAR(Shell);
CurrPage.UPDATE;

Remove - OnAction()
CurrPage.SETSELECTIONFILTER(RecLExtention);
IF RecLExtention.FINDFIRST THEN BEGIN
  IF RecLExtention.Installed THEN
    ERROR(CstLInstalled);
  TxtLPowerShell := 'Import-Module ''' + APPLICATIONPATH + 'NavAdminTool.ps1'';';
  TxtLPowerShell += 'Unpublish-NavApp -ServerInstance ' + GetInstanceService + ' -Name ''' + RecLExtention.Name + ''';';
  CREATE(Shell, TRUE, TRUE);
  Shell.Run('powershell.exe "Start-Process powershell \"-NoProfile -ExecutionPolicy Bypass -Command ' + TxtLPowerShell + '\" -Verb RunAs;"');
  CLEAR(Shell);
  CurrPage.UPDATE;
END;

 

Accessing a control add-in in a dependency extension

Original URL…

Long time no see, eh? Time flies, what do you know…

I am thrilled to still find you here. Honestly, I wasn’t sure this morning if I was about to even find this blog where I left it seven months ago. Cool to find both my blog and you in good shape, patiently waiting for my contribution.

This morning I had a call with a partner asking if it was possible to deploy a control add-in in such a way that other partners could use its functionality from their own extensions. My answer was, and it still is – well, it should be possible, but I don’t know for a fact because I never tried it.

So let’s try it and find the answer together.

(It goes without saying, but I’ve learned that things that “go without saying” often don’t, so let me go ahead and say it: this is about Extensions V2, NAV 2018, and Business Central; no NAV 2017 stuff here. And no animals were harmed during the writing of this blog, yet…)

To keep my partner safe and anonymous, and stay GDPR compliant in and out, let’s imagine this scenario: you are building a cool horizontal feature that does “things” in the back end, but also exposes a little bit of front-end sugar for other NAV partners to consume. So you want to make your control add-in accessible to them.

If this were all about pure AL, it would be a no-brainer. Your workflow is as follows:

  1. Create and build your extension
  2. Ship your .app file together with your app.json manifest file to your partners.

Your partners, who want to tap into your functionality from their extension, need to do this:

  1. Create their extension
  2. Make your .app file available in their package cache path
  3. Use the information from your app.json to declare a dependency on your extension from their app.json

If your extension uses publicly accessible stuff, such as a table or a codeunit, or events, your partners can now tap into this functionality from their extension (by reading your table, or calling your functions, or subscribing to your event publishers).

But what if you also make a control add-in a part of your extension? Let’s try it out together.

Creating the “horizontal” extension (the “dependency”)

Let’s get the first part done. The “your” part where you are creating your extension that includes a control add-in to expose to your partners. Your amazing new control add-in will expose a button, that will have a caption of your choosing, and will allow your partners to respond to its click event. Crazy stuff, right?

I’ve got myself a nice and fresh VM this morning from aka.ms/getnav to have access to latest CU (being CU6 from June 6). You may want to get one for yourself, too.

Once it’s up and running, start VS Code and run the “AL: Go!” command from the command palette. Then choose “BaseExtension” as the name, select “Your own server” as the server, enter your username (mine was “admin”), and enter your password (mine was, whoops, I am not telling you what it was!).

When it’s done, which takes about half a femtosecond, you have your launch.json file open. Go back to your landing page in IE (or access it from the desktop if you closed it), copy the last four lines from it (under the “launch.json settings” subsection) and paste them inside your launch.json (make sure to overwrite those same settings in there, which is all settings after “name” and before “startupObjectId”). Also, set the “startupObjectId” to 50100 to run your first page.

If you did it correctly, you’ll get something like this:

[screenshot: launch.json settings]

A few more housekeeping steps:

  • Delete the HelloWorld.al file
  • Edit the app.json file to declare your extension. Mine changed the name, the publisher, and the idRange sections, and it now looks like this:
    [screenshot: app.json]

Now, time for the real stuff. Create a new file and name it “ControlAddIn Base Control.al”

In it, request some real estate from the app, declare a startup script and a “normal” script, then declare a procedure to set a caption on the button, and two events (one to indicate the control is ready, and one to include the implementation for your declared methods). If you care, declare a stylesheet file to make it look nice, too.

If you are as good as I am, yours will also look more or less like this:

[screenshot: the control add-in object]

Rats! I am not that good – there’s red stuff in here. Let’s fix it.

Create two folders, call one “Scripts”, and another one “Styles”, and in them, create the files as declared in your control .al file. Mine are “startup.js”, “baseControl.js”, and “baseControl.css”.

This takes care of the “red stuff” in the control add-in object (it may require you to close and re-open the editor tab for the control add-in object, though).

Now put beef in these files as indicated in the screenshots below.

[screenshot: startup.js]

This one was tough! It calls the OnControlReady event when the control add-in starts. Now, let’s get the easier ones done, too.

[screenshot: baseControl.js]

This is your implementation. It contains one function to set the caption, as declared in your control add-in object file. It’s not the smartest piece of JavaScript code ever written, but it gets the job done. If there is no button, it creates one, sets its caption, and saves the reference for future use. The button, when clicked, invokes the OnClick event, duh!

And last, but certainly the least and entirely optional, add some CSS juice to get rid of Times New Roman:

[screenshot: baseControl.css]

(Times New Roman makes my toenails curl up)

Finally, let’s test whether this works as expected. Create a “Page 50100 Test Control.al” file and populate it with the bare minimum of AL to try this out:

[screenshot: the Test Control page object]

Ctrl+F5, sign in once to deploy your control add-in, then sign in once more to, well, sign in to your NAV, and then perform this complex set of steps:

[screenshot: the page with the control add-in button]

So, we did it, or so it seems.

Now, you are good to deploy this to your partners.

Deploying it to your partners

To deploy it to your partners, just do this:

  1. Take your app file (mine is “Vjeko.com_Control Add-in Base_1.0.0.0.app”) and your “app.json” file
  2. Send them to your partners.
  3. Done.

Using your partner’s control add-in

Time to put your partner’s shoes on. You are now not you anymore; you are now your partner, the one who uses your extension. Yes, I confused myself, too, I tend to do this.

First thing, create your extension, the one that will (try to) consume your partner’s shiny button control add-in.

It’s easy, “AL: Go!” once again, follow the same first bunch of steps as you did earlier, up to deleting the HelloWorld.al file.

First step: sort out the app.json manifest file. This time (apart from using a different control add-in range, which should go without saying, but doesn’t, just in case) you need to declare a dependency on the Base Extension you (when you were your partner) created earlier. To do that, use the info from the “app.json” file you received from your partner. This is what I’ve got:

[screenshot: app.json with the dependency declared]

Now, to make it simpler, run the “AL: Download symbols” command to get the symbol files stored in your package cache path. If you didn’t do anything fancy, it should be right inside your workspace, under .alpackages. Now select any of the .app files inside your .alpackages folder, press Alt+Shift+R to reveal the .alpackages folder in Explorer, and then paste the .app file you received from your partner right in there, together with the two files already in place. If you are doing this on the same machine, that file will already be there, because VS Code was smart enough to download it together with the base NAV files automatically.

Good, now let’s try to see if we can use the control add-in your partner extension exposes.

Create a “Page 50110 Test Partner Control.al” file and add AL code to define the page that, well, at this stage, attempts to use your partner’s control add-in. For all I care, it can be an exact copy of the original page 50100 from the previous workspace, save for the object ID and name, which should be 50110 “Test Partner Control”. If you didn’t care more than I did, this is what you have at this stage:

[screenshot: the Test Partner Control page object]

Last step, change the startupObjectId from 22 to 50110 inside your launch.json file, cross your fingers, close your eyes, and press Ctrl+F5. Okay, if you can’t do it with your eyes closed, open your eyes, position your left pinky on Ctrl, your left middle finger on F5, close your eyes, and click.

(you may open your eyes now…)

[screenshot: the page showing the partner’s button]

Yaay! It works!

Really, I didn’t expect anything less, but now I know for a fact. And so do you. You’re welcome!

Publishing and Installing an Extension v2.0

Original URL…

  1. Publish and synchronize an extension
  2. Install an extension
  3. See Also

The AL developer environment is evolving with frequent updates. To stay up to date on the latest information and announcements, follow us on the Dynamics NAV Team Blog.

Making your extension available to tenant users requires three basic tasks: publish the extension package to the Dynamics 365 Business Central server instance, synchronize the extension with the tenant database, and install the extension on the tenant.

Note

This article describes how to publish and install the first version of a V2 extension. If you want to publish and install a newer version of an extension, see Upgrading Extensions V2.

Publish and synchronize an extension

Publishing an extension to a Dynamics 365 Business Central server instance adds the extension to the application database that is mounted on the server instance, making it available for installation on tenants of the server instance. Publishing updates internal tables, compiles the components of the extension behind-the-scenes, and builds the necessary metadata objects that are used at runtime.

Synchronizing an extension updates the database schema of the tenant database with the database schema that is defined by the extension objects. For example, if a table or table extension is included in the extension, then the respective full or companion table is created in the tenant database.

To publish and synchronize an extension

  1. Start the Microsoft Dynamics NAV Administration Shell.
  2. To publish the extension, run the Publish-NAVApp cmdlet. The cmdlet takes as parameters the Dynamics 365 Business Central server instance that you want to install to and the .app package file that contains the extension. The following example publishes the extension MyExtension.app to the YourDynamicsNAVServer instance.

     Publish-NAVApp -ServerInstance YourDynamicsNAVServer -Path ".\MyExtension.app"

  3. To synchronize the schema of a tenant database to the extension, run the Sync-NAVApp cmdlet. The following example synchronizes the extension MyExtension with the tenant:

     Sync-NAVApp -ServerInstance YourDynamicsNAVServer -Name ExtensionName -Path ".\MyExtension.app" -Tenant TenantID

     Replace TenantID with the tenant ID of the database. If you do not have a multitenant server instance, use default or omit this parameter.

The extension can now be installed on tenants.

Install an extension

After you publish and synchronize an extension, you can install it on tenants to enable the extension and make it available to users in the client. Installing an extension can be done from the Dynamics 365 client or Microsoft Dynamics NAV Administration Shell.

Note

Installing an extension will run any installation code that is built-in to the extension. Installation code could, for example, perform operations like populating empty records with data, service callbacks and telemetry, version checks, and messages to users. For more information, see Writing Extension Install Code.

To install an extension by using Microsoft Dynamics NAV Administration Shell

  1. Start the Microsoft Dynamics NAV Administration Shell.
  2. To install the extension on one or more tenants, use the Install-NAVApp cmdlet. The following example installs the extension My Extension for Tenant1 and Tenant3. In single-tenant deployments, you either specify default as the tenant ID, or you omit the –Tenant parameter.

     Install-NAVApp -ServerInstance YourDynamicsNAVServer -Name "My Extension" -Tenant Tenant1, Tenant3

To install an extension by using the client

  1. In Dynamics 365 Business Central, use search to open the Extension Management page. In the Extension Management window, you can view the extensions that are published to your server. For each extension, you can see the current installation status.
  2. Choose an extension to see additional information and to install the extension.
  3. Review and accept the license agreement.
  4. Choose the Install button to install the extension.

See Also

Unpublishing and Uninstalling Extensions
Developing Extensions

THE QUESTION? – UPGRADE VS EXTENSIONS – NAV Extensions after the release of Dynamics NAV 2018

Original URL…


Dynamics NAV solutions can be customized by partners, value-added resellers (VARs), and even some customers. This is an important benefit of the product and the service continues to be available. However, it has traditionally been carried out by overlayering the application code. The move to the cloud with more agile servicing and frequent updates requires a less intrusive customization model that makes updates less likely to impact custom solutions. This new model is called Dynamics NAV Extensions and will probably replace customization.

Dynamics NAV Extensions are a way for Microsoft Dynamics NAV developers and ISVs to extend the functionality of NAV without modifying Microsoft’s original source code. With the new model, when you come to upgrade Dynamics NAV with a cumulative update, you no longer need to merge all the customized objects. That means fewer upgrade issues.

With NAV Extensions, you can add functionality without changing the standard solution from Microsoft. This has the obvious advantage that major NAV upgrade projects are no longer necessary. Once you are using Extensions, the customizations no longer represent a problem when upgrading to the latest version of the solution.

If you want to prepare for this new model, you should start to work with Dynamics NAV Extensions today.

Barriers to editing Dynamics NAV extensions

Dynamics NAV Extensions are packages that contain additional functionality, report layouts (at least starting in NAV 2017), permissions, and more. The packages can be easily installed, uninstalled and upgraded without affecting the Dynamics NAV source code.

Once the Dynamics NAV extension package has been created, it is no longer easy for others to view the code of the extension, which means that your code is protected. You can view the source code of an extension through the debugger, but you cannot access the code through the development environment and you can’t modify an extension unless you have the source code.

Technically, there is no real problem. The new solution will work and the customer will be able to use the custom functionality. However, if another party wanted to further modify the functionality, it would not be possible because the functionality can only be modified if the developer has the source code.

Evolution of Dynamics NAV Extensions

Dynamics NAV Extensions Version 1

Microsoft Dynamics NAV 2016 incorporated Extensions V1. However, at that stage it was more of a concept than a practical working tool. In reality, it was quite a pain to develop customizations with Dynamics NAV Extensions because you had to restructure code, and not all object types could be extended – only pages, tables and codeunits.

Dynamics NAV Extensions Version 1 did not include:

  • Add-Ins
  • Web services
  • Job Queues
  • Reports
  • Translations and much more

These customizations are usually easy to create using events. So, you can raise an event in your extension and have another extension subscribe to this event, perform certain actions and pass data back. In Extensions V1, this was difficult to do, because you actually needed to provide the source code of your extension to everyone who wanted to work with your events or with your data structure.

Dynamics NAV Extensions Version 2

Dynamics NAV Extensions V2 helps create a more modern development environment that supports better functionality. You can actually add your events, raise them and then subscribe to them in another extension. You can also use their tables or other functionality without requiring the source code of the base extension.

Microsoft Visual Studio Code downloads the symbols for all dependent extensions. This means that, as long as you have the base extension installed in your development system and declare it as a dependency, you can subscribe to its events.

How Dynamics NAV Extensions can make Dynamics NAV upgrades easier

We believe that in most cases, NAV Extensions can help developers upgrade without any problems. Why in most cases? Because during any Dynamics NAV upgrade we currently carry out (a classic upgrade, nothing to do with Extensions), we can upgrade most of the code without conflicts.

What about the exceptions? There are just a few scenarios where you may need to rethink your solution. Here are some examples:

  • When new functionality in an upgraded version of NAV can replace your extension. However, the two can probably co-exist.
  • When Microsoft redesigns part of the solution. Even this doesn’t need to be a major problem. If, for example, you are upgrading to NAV 2017, the major redesign of Codeunit 80 isn’t that much of a problem thanks to the hooks pattern. Decent code design always helps. If you have been using extensions, you would have been using events anyway, so that’s even less of a problem.

Visual Studio Code and Dynamics NAV Extensions

Visual Studio Code (VS Code) is a lightweight source code editor, running on Windows, Mac and Linux. Developing for NAV on your Linux box, doesn’t that sound cool? It has built-in support for JavaScript, TypeScript and Node.js. And it supports extensions (but not NAV Extensions) for other languages, like C#, Python and PHP.

The Microsoft NAV team has created a new VS Code extension that enables the creation of objects in the AL language. But beware: creating a VS Code extension for a completely new language is not straightforward! It requires the creation of a model to support IntelliSense and the creation of a new compiler. That sounds easy, but it is not.

NAV Extensions after the release of Dynamics NAV 2018

The built-in Visual Designer in Dynamics NAV 2018 and Visual Studio Code are expected to replace the familiar C/SIDE environment as a development tool. However, that’s not just a matter of clicking on a button. It means learning a new way of development, a new language (with old components, though) and a new way of deploying customizations.

All customizations, including customizations on top of customizations are going to become Dynamics NAV Extensions; and that’s a good thing for the industry.

.Net users might say, okay that’s how we do things anyway. But, for NAV developers, this is a big change because we are used to changing any existing piece of code.

So why not work towards a model that is as flexible as .Net and solves NAV upgrade issues as well? Let’s not stick with the traditional “copy-paste-modify” but just “reference-and-extend”.

The downside of customizations is that they often introduce challenges when upgrading Dynamics NAV. It’s exponentially harder to upgrade a solution from one version to the next when changes have been made to the underlying solution. Dynamics NAV Extensions, particularly the latest version – Extensions 2.0 – solves this problem.

Instead of defining customizations in the original source code, Dynamics NAV Extensions are written in parallel with the solution source. The integration with the source code is handled with events.

An extension can add new objects and extend existing objects that are present in the solution. The extension code is packaged in a file, which you can easily deploy to your solution. This allows you to upgrade the underlying solution and, as long as the events remain, the extension will behave in the same way from version to version.

With Dynamics NAV 2018, you can have multiple extensions installed. It’s also possible to make a dependency reference from one extension to another. The question is “how?” When downloading symbols, you’re only getting system and application data.

Please be aware that several of the extensions installed in NAV 2018 are still V1 extensions; you can reference them and download symbols, but the symbol file is empty. So currently it’s not possible to reference a V1 extension from a V2 extension.

Microsoft tools for automatic migration to Dynamics NAV Extensions

To help develop solutions for this new programming environment, you can use a set of new Microsoft developer tools to build, test, and deploy NAV Extensions.

Microsoft In-App Designer

In the client, you can switch to In-App Designer mode. This enables you to change the look and feel of the client quickly and easily. Using this tool, you can define the elements (such as fields or groups) that appear on a page and change how they are displayed. You can also use In-App Designer as an interactive tool to create extensions based on changes you make in the client.

Microsoft In-App Designer includes a wide range of important features, such as:

  • Adding a field from the source table to a page
  • Moving a field to another position on a page
  • Removing a field from a page
  • Previewing your design in desktop, tablet, and phone clients
  • Saving the changes for the tenant or saving them as an extension package file in Visual Studio Code

Microsoft Dynamics NAV Developer Preview 3 (coming soon) will add:

  • Changing the caption of a field on the page
  • Adding, moving, renaming, and removing an action
  • Adding, moving, and removing page parts
  • Adding new pages

Conclusion

Erik Ernst, a Microsoft Dynamics NAV MVP, commented, “Personally, I’m thrilled about these new extensions. Not so much because they are special or fantastic. But because they allow me easily to remove them again, or simply not install them. The future direction is clear. Some years down the road, then we will only have the World-Wide version. Localizations are just Extensions, either coming from Microsoft or ISVs.”

A statement from Microsoft added, “There are no plans to stop partners from modifying NAV. But you can see it as an option to transform gradually and get your developers used to using Dynamics NAV Extensions when possible, which will also have a side effect of making upgrades easier. So it should be more of a journey than a hard point in time when you must move from one way of doing things to another.”

At Simplanova, we truly believe that Dynamics NAV Extensions is the solution for many of the pain points we have today. Maybe they are not completely fixed yet, but in the near future they will be! That applies to on-premise, for customizations, and for your private and public cloud. And guess what, for Dynamics365, they are already there!

Add-on upgrade to extensions has become much more relevant with Microsoft Dynamics NAV 2018 version and Extensions 2.0. If you choose Simplanova services to upgrade your Add-on to NAV Extensions, we will move Dynamics NAV code customization to events and redesign the code for a successful migration from Add-on to Extensions.

Simplanova has long experience in Microsoft Dynamics NAV services for ISV and VAR partners. If you would like to discuss how our Dynamics NAV Extensions service can help your business, just fill in the form below.


On-Premise Extensions & Customer Licenses

Original URL…

On my task list for one of my customers was a nicely isolated module that I could make into an extension.

I’m a huge fan of making many small extensions rather than trying to put all of one customer’s modifications in one project.

In this case it is a side-by-side project with C/SIDE, so I have created my own app file for the packages. I’ll see if I can blog a bit about that later.

Extensions are not just for AppSource

Some people seem to think that on-premise we can just as well continue to use C/SIDE, and even though I am a huge C/SIDE fan, I have to disagree.

On-Premise extensions have a lot of value, especially because extensions enforce discipline.

The Caveat

The biggest challenge that I face when programming bespoke extensions on-premise is the deployment. Microsoft has made the testing of the license very strict. In fact, it is stricter than the runtime check, which in my opinion is a bug. Microsoft, however, has a different opinion. Business Central in the fall, with the new license model, solves it, because then there is no more license.

Temporary Tables

Everyone who has attended my Programming Master Class (800 of you) knows that using temporary tables as containers of code is one of the most powerful assets for clean code and reducing code cloning.

The table behaves as a class with methods and properties and actually replaces the need for codeunits completely. Well, almost.

Using these tables in C/Side is free. It has always been free. End Users only have to pay for tables in their licenses if they write data to the database.

When you ship an extension with a table object that is outside of the customer’s license, you’ll get an error message. The publishing process does not check your code for actual inserts, and it probably couldn’t do that even if it wanted to.

In C/SIDE, end users can import objects that are outside of their license with a .fob file.

PowerShell to the Rescue

I’ve created a small PowerShell script that temporarily changes the license at the end user’s site and later switches it back. There is a lot of cleanup to do, but for me it was a huge time saver.

My plan is to somehow make this PowerShell script run directly from Visual Studio Code and launch the Windows client instead of the web client.

Please note that I don’t dislike the web client, but we have some pages in our solution that do not yet render perfectly and have to be replaced first with another solution (probably Angular with DevExpress).

Here is the script.

Don’t expect rocket science. I try to keep my PowerShell understandable.

Set-ExecutionPolicy unrestricted

import-module "C:\Program Files (x86)\Microsoft Dynamics NAV\110\RoleTailored Client\NavModelTools.ps1"
import-module "C:\Program Files\Microsoft Dynamics NAV\110\Service\NavAdminTool.ps1"

$ServiceTier = "2018DEV"
$Version = "1.0.0.1"
$xVersion = "1.0.0.0"
$AppFolder = "\\DynamicsNAV\Extensions\Performance\Version\"
$AppSource = "\\DynamicsNAV\Extensions\Performance\Source\"
$AppFile = "Mark Brummel_Performance_" + $Version + ".app"
$AppName = "Performance"

#Get-NAVAppInfo -ServerInstance $ServiceTier

Move-Item -Path $AppSource$AppFile -Destination $AppFolder$AppFile -Force -ErrorAction Ignore

Import-NAVServerLicense -LicenseFile "\\License\Development.flf" -ServerInstance $ServiceTier
Restart-NAVServerInstance -ServerInstance $ServiceTier

Uninstall-NAVApp -ServerInstance $ServiceTier -Name $AppName -Version $xVersion
Unpublish-NAVApp -ServerInstance $ServiceTier -Name $AppName
Publish-NAVApp -ServerInstance $ServiceTier -Path "$AppFolder$AppFile" -SkipVerification
Install-NAVApp -ServerInstance $ServiceTier -Name $AppName -Version $Version

Import-NAVServerLicense -LicenseFile "\\License\Customer.flf" -ServerInstance $ServiceTier
Restart-NAVServerInstance -ServerInstance $ServiceTier

#Sync-NAVApp -ServerInstance $ServiceTier -Name $AppName -Mode Clean -Force
#Sync-NAVApp -ServerInstance $ServiceTier -Name $AppName -Mode Add

The bottom two commands are commented out. I use them when I make schema changes to avoid having to create upgrade codeunits during the development process.

The Version and xVersion variables are there because I like to keep some old versions of the .app file while I do the development. The Uninstall and Unpublish steps are not required if you increase the build number with each build.
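
As for the plan to launch the Windows client straight from Visual Studio Code: a minimal sketch of the idea, assuming the dynamicsnav:// protocol handler is registered on the machine – the server, port, instance and page number below are examples, not values from my environment.

# Sketch: open the Windows client on a specific page after deployment (all values are examples).
Start-Process "dynamicsnav://localhost:7046/2018DEV//RunPage?Page=50100"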

A tip to Microsoft would be to implement some of the old C/SIDE behavior in the extensions module, so that data/columns are only deleted for the tables/columns that really changed.