AEM-DTM-Target Integration. Simplified! – Part 1

Let’s start with an introduction to the three gems of the Adobe Experience Cloud crown:

  1. AEM – Adobe Experience Manager (AEM) is a web content management system with a lot of powerful features to manage, organise and deliver content with a best-in-class experience. AEM came to Adobe with the acquisition of Day Software in 2010; earlier it was known as Day CQ (Communique). For this article, we are using AEM 6.3.
  2. DTM – Dynamic Tag Management (DTM) is a tag management tool which integrates Adobe Experience Cloud solutions as well as third-party solutions like Google Analytics. DTM was known as ‘Satellite’ before Adobe acquired it in 2013.
  3. Target – Adobe Target is the personalisation solution of the Adobe Experience Cloud suite, which delivers personalised content based on different types of rules. Target came to Adobe with the acquisition of Omniture in 2009. In this article, we are going to demonstrate the Target VEC capabilities.

AEM can be integrated with Target directly, without using DTM. However, it’s recommended to use DTM in between for the following benefits:

  1. Efficiency – DTM allows you to optimise the loading time of multiple tools and marketing tags.
  2. Reduced Cost – Development cost is minimised because, once the Data Layer is in place, multiple solutions can leverage it.
  3. Flexibility – DTM allows you to quickly test and optimise tags without being bound by release cycles.
  4. Multiple Sites – Using DTM you can manage multiple sites from a single console.

AEM is usually connected to DTM using AEM Cloud Services, and Target is connected to DTM using the Experience Cloud configuration. Here is the relevant diagram:


Request Flow:
How a request flows from AEM to Target via DTM:

  1. AEM starts rendering a site page.
  2. The site has the DTM script attached, along with a Data Layer. The DTM script is associated with the relevant property. The Data Layer is discussed below.
  3. The DTM property has Target added to it, which raises a request to Target and shares the required data.
  4. Target pushes its response back to the AEM instance using mbox.js. Note: mbox.js now has a newer, updated alternative known as at.js.
  5. The AEM page gets the Target response, which is rendered on the page.

Data Layer:
As you might have noticed in the request flow, the AEM site has the DTM script and a Data Layer associated with it. A data layer follows a W3C community specification for storing values in a JavaScript object. The data layer is created in AEM and consumed by DTM to get the required data from the AEM instance and push it to any solution like Analytics or Target. We are going to create a sample data layer below in the DTM Rules section.

DTM Setup:
To get access to DTM, one has to raise a request with Adobe Client Care with the proper information. Once access is granted, go to the DTM console and log in with the credentials provisioned by Client Care.

To start with DTM, create a new property with the URL of the website. Thereafter, add the required tools to the property, like Adobe Analytics or Adobe Target. In the AEM configuration, we are going to use the same DTM property in order to integrate AEM with Analytics, Target, etc.

For this article we are going to add Adobe Target. For this we need the Client Code, which will be provided by Client Care; alternatively, you can try adding it using your Experience Cloud membership.


DTM Property

Adding Tool in DTM Property

Adding Target as a Tool in DTM Property

DTM Rules:
In the above steps you have created a new DTM property and added Target as a tool in the same property. Now you need to know how to work with data: DTM uses the data layer to get the data from the AEM instance. For example, using Target you may need to render the homepage of the website based on browser, location, device, etc., or you may need visitor-specific details like age, gender and interests. Hence you need to get all such information from AEM into the data layer. For example, here is the data layer:
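A minimal sketch of such a data layer (the object and field names here are illustrative, not mandated by DTM):

```javascript
// Illustrative data layer: a plain JavaScript object, set on the page
// before the DTM header script loads so data elements can read from it.
var digitalData = {
  pageInfo: {
    pageName: "homepage",      // read later by the "Page Title" data element
    siteSection: "home",
    language: "en"
  },
  userInfo: {
    gender: "unknown",
    interests: ["photography", "travel"]
  },
  deviceInfo: {
    deviceType: "desktop"
  }
};
```

DTM data elements can then map onto fields such as `digitalData.pageInfo.pageName`.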

Once the data layer is all set, fetch the values from the data layer and set them in DTM data elements. We use these data elements in different types of rules. DTM has three types of rules:
1. Event Based Rule – Executed when a visitor interacts with on-page elements.
2. Page Load Rule – Executed when a visitor loads the page.
3. Direct Call Rule – Executed when you explicitly tell DTM to do something.



DTM Rules

Creating a data element from the data layer is quite straightforward. In the screenshot below, we have created a new data element, Page Title, which is mapped to the pageName field of the data layer created above.

Data Elements

DTM Workflow:
Every property in DTM follows a standard workflow. The workflow has three steps:

  1. Creation of property and rules.
  2. Approve the newly added and modified rules.
  3. Publish the property to make the changes live.

AEM-DTM Integration:
There are two ways to integrate DTM with AEM:

  1. Using Cloud Services
  2. Embed JS
Using Cloud Services:

Using the AEM admin console, go to Tools > Deployment > Cloud Services. Find ‘Dynamic Tag Management’ and click ‘Configure now’. Provide a Title and Name for the configuration, keeping the Parent Configuration field intact. Enter the details as below:

Property – Description
API Token – The value of the API Token property of your Dynamic Tag Management user account. AEM uses this property to authenticate with Dynamic Tag Management.
Company – The company with which your login ID is associated.
Property – The name of the Web Property that you created for managing the tags for your AEM site.
Include Production Code on Author – Select this option to cause the AEM author and publish instances to use the production version of the Dynamic Tag Management libraries.

When this option is not selected, the Staging Settings apply to the author instance, and the Production Settings apply to the publish instance.

There is also an option to self-host the DTM libraries: if this option is checked, AEM will host the DTM libraries; if we want to use the libraries from the cloud, we should leave it unchecked. Here we are using the latter.

Embed JS:

Go to the DTM console and click on the relevant property.
Go to the Embed tab and copy the staging header and footer scripts; on the AEM author instance, paste the scripts into the header and footer of the website respectively. Similarly, copy the production header and footer scripts and put them on the AEM publish instance.

Once you have integrated DTM with AEM using one of the above-mentioned methods, assign the DTM config to the root of your site using the AEM Sites console.

DTM-AEM Connection Test:
Now you need to test whether the DTM script is being loaded with your website. In a browser, open any page of your website where you have assigned the DTM config, and open the browser console. If the “_satellite” object exists, the DTM script is being loaded. You can also try retrieving the data elements that you created in DTM using “_satellite.dataElements”. If you are able to find all the details, you are good to go.

If the _satellite object doesn’t exist, the page doesn’t have the DTM header/footer script. You can view the source of the page to confirm.


Satellite Object


Satellite Data Elements



DTM Debugging Tools:
In DTM we deal with different types of scenarios and scripts, so we might need to check whether the scripts are working correctly and in what order they are executed.
There are a couple of browser plugins available which have proved quite useful for debugging.

Launch Switch:
It enables logging in the console, and also lets us test changes using the staging script before making them live.

There are two buttons available on the plugin console. Toggling them does the following:
Debug: Enables logging of the rules; you can also add your own logs using _satellite.notify(“info”).
Staging: Enables the changes you have made to the property but have not yet published. This is used for testing changes before making them live.

Download URL for Chrome:


It shows the detailed information about all the DTM Rules. 
Download URL Chrome  

Adobe DTM Documentation:

This is Part 1 of “AEM-DTM-Target Integration. Simplified!”, where you have learned how to set up DTM, create a DTM property and data elements, integrate DTM with AEM and debug DTM. In Part 2, you will learn how to create different types of rules in DTM, use these rules to share information with Target, work with Target activities and audiences, and use the Target VEC (Visual Experience Composer), along with some Target debugging tweaks.

Migration of SCR Annotations to DS Annotations

Change is a rule of life, and if you are not keeping up with current trends you are constantly falling behind on updates that can be essential to your system. Adobe releases a new version of AEM every year. Earlier versions relied on SCR annotations; now DS annotations are supported. DS annotations are supported from AEM 6.2 onward, and it is highly recommended to use them in newer versions of AEM.

Let’s see a brief introduction to SCR annotations and DS annotations.

SCR Annotations:
SCR stands for Service Component Runtime. The “maven-scr-plugin” processes SCR annotations from the corresponding subproject at Apache Felix. All annotations are in the “org.apache.felix.scr.annotations” package. The OSGi Declarative Services specification is implemented by Apache Felix in the “org.apache.felix.scr” bundle.

SCR annotations do not support the new features from OSGi R6 and above. It is highly recommended to use the OSGi annotations for Declarative Services instead.

DS Annotations:
DS annotations are the official implementation from the OSGi R6 (Release 6) specification. They are also known as OSGi Declarative Services annotations.
Remember that Declarative Services are a compile-time process: for DS annotations to be effective, they must be handled during the build process.
The migration is fairly easy, and both annotation styles work side by side while you complete the switch-over.
So here we will discuss how to migrate from SCR annotations to DS annotations.

For DS annotations we have to use the “maven-bundle-plugin” (version 3.2.0 or greater) instead of the “maven-scr-plugin”.

For DS annotations we need the artifacts “org.osgi.service.component.annotations” and “org.osgi.service.metatype.annotations” instead of “org.osgi.core” and “org.osgi.compendium”.
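In pom.xml terms, the switch might look like this (the version numbers below are indicative, not prescriptive):

```xml
<!-- Replaces maven-scr-plugin -->
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <version>3.5.0</version>
  <extensions>true</extensions>
</plugin>

<!-- Replace org.osgi.core / org.osgi.compendium with: -->
<dependency>
  <groupId>org.osgi</groupId>
  <artifactId>org.osgi.service.component.annotations</artifactId>
  <version>1.3.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.osgi</groupId>
  <artifactId>org.osgi.service.metatype.annotations</artifactId>
  <version>1.3.0</version>
  <scope>provided</scope>
</dependency>
```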

Java Package:
With DS annotations, the package “org.apache.felix.scr.annotations.*” is replaced with “org.osgi.service.component.annotations.*” and “org.osgi.service.metatype.annotations.*”.

Migration of Component and Services:
With SCR annotations we used the @Component and @Service annotations; with DS annotations, a single @Component annotation covers the job of both.

SCR annotation Implementation:
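As a minimal sketch of a typical SCR-annotated service (GreetingService is a hypothetical interface, not from the original post):

```java
import org.apache.felix.scr.annotations.Component;
import org.apache.felix.scr.annotations.Service;

@Component(immediate = true)
@Service(GreetingService.class)
public class GreetingServiceImpl implements GreetingService {

    @Override
    public String greet(String name) {
        return "Hello, " + name;
    }
}
```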

DS annotation Implementation:
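A sketch of the same hypothetical service with DS annotations, where a single @Component declares the service interface:

```java
import org.osgi.service.component.annotations.Component;

@Component(service = GreetingService.class, immediate = true)
public class GreetingServiceImpl implements GreetingService {

    @Override
    public String greet(String name) {
        return "Hello, " + name;
    }
}
```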

Migration of Sling Servlet:
With SCR annotations we used @Component, @Service, @SlingServlet and @Properties; with DS annotations, a single @Component annotation carries all of this.

SCR annotation Implementation:
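A minimal sketch of an SCR-style servlet (the path /bin/demo/hello is a made-up example):

```java
import java.io.IOException;

import org.apache.felix.scr.annotations.sling.SlingServlet;
import org.apache.sling.api.SlingHttpServletRequest;
import org.apache.sling.api.SlingHttpServletResponse;
import org.apache.sling.api.servlets.SlingSafeMethodsServlet;

@SlingServlet(paths = "/bin/demo/hello", methods = "GET")
public class HelloServlet extends SlingSafeMethodsServlet {

    @Override
    protected void doGet(SlingHttpServletRequest request,
                         SlingHttpServletResponse response) throws IOException {
        response.getWriter().write("Hello from SCR");
    }
}
```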

DS annotation Implementation:
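The same hypothetical servlet with DS annotations: the servlet is registered as a Servlet service, and the former @SlingServlet attributes become service properties:

```java
import java.io.IOException;
import javax.servlet.Servlet;

import org.apache.sling.api.SlingHttpServletRequest;
import org.apache.sling.api.SlingHttpServletResponse;
import org.apache.sling.api.servlets.SlingSafeMethodsServlet;
import org.osgi.service.component.annotations.Component;

@Component(service = Servlet.class,
           property = {
               "sling.servlet.paths=/bin/demo/hello",
               "sling.servlet.methods=GET"
           })
public class HelloServlet extends SlingSafeMethodsServlet {

    @Override
    protected void doGet(SlingHttpServletRequest request,
                         SlingHttpServletResponse response) throws IOException {
        response.getWriter().write("Hello from DS");
    }
}
```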

Migration of Custom Workflow process:
SCR annotations Implementation:
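A minimal sketch of an SCR-style workflow process (the class and label are hypothetical):

```java
import org.apache.felix.scr.annotations.Component;
import org.apache.felix.scr.annotations.Property;
import org.apache.felix.scr.annotations.Service;

import com.day.cq.workflow.WorkflowException;
import com.day.cq.workflow.WorkflowSession;
import com.day.cq.workflow.exec.WorkItem;
import com.day.cq.workflow.exec.WorkflowProcess;
import com.day.cq.workflow.metadata.MetaDataMap;

@Component
@Service(WorkflowProcess.class)
@Property(name = "process.label", value = "Demo Workflow Process")
public class DemoWorkflowProcess implements WorkflowProcess {

    @Override
    public void execute(WorkItem item, WorkflowSession session, MetaDataMap args)
            throws WorkflowException {
        String payload = item.getWorkflowData().getPayload().toString();
        // ... process the payload ...
    }
}
```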

DS annotation Implementation:
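The same hypothetical process with DS annotations, where process.label moves into the @Component properties:

```java
import org.osgi.service.component.annotations.Component;

import com.day.cq.workflow.WorkflowException;
import com.day.cq.workflow.WorkflowSession;
import com.day.cq.workflow.exec.WorkItem;
import com.day.cq.workflow.exec.WorkflowProcess;
import com.day.cq.workflow.metadata.MetaDataMap;

@Component(service = WorkflowProcess.class,
           property = {"process.label=Demo Workflow Process"})
public class DemoWorkflowProcess implements WorkflowProcess {

    @Override
    public void execute(WorkItem item, WorkflowSession session, MetaDataMap args)
            throws WorkflowException {
        String payload = item.getWorkflowData().getPayload().toString();
        // ... process the payload ...
    }
}
```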

Migration of Custom OSGi Configuration:
The biggest changes come in custom OSGi configuration. OSGi annotations provide the flexibility to define the configuration in a separate interface, which we can then use anywhere.

SCR annotations Implementation:
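A sketch of a typical SCR-style configurable service (the service name and property are hypothetical):

```java
import java.util.Map;

import org.apache.felix.scr.annotations.Activate;
import org.apache.felix.scr.annotations.Component;
import org.apache.felix.scr.annotations.Property;
import org.apache.felix.scr.annotations.Service;
import org.apache.sling.commons.osgi.PropertiesUtil;

@Component(metatype = true, label = "Demo Configuration")
@Service(DemoConfigService.class)
public class DemoConfigService {

    @Property(label = "Service URL", value = "http://localhost:4502")
    private static final String SERVICE_URL = "service.url";

    private String serviceUrl;

    @Activate
    protected void activate(Map<String, Object> properties) {
        serviceUrl = PropertiesUtil.toString(properties.get(SERVICE_URL), "");
    }

    public String getServiceUrl() {
        return serviceUrl;
    }
}
```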

DS annotation Implementation:
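With DS annotations, the configuration moves into its own annotation interface (an OCD). A hypothetical equivalent of the configuration above; note that an underscore in a method name maps to a dot in the property name (service_url → service.url):

```java
import org.osgi.service.metatype.annotations.AttributeDefinition;
import org.osgi.service.metatype.annotations.ObjectClassDefinition;

@ObjectClassDefinition(name = "Demo Configuration")
public @interface DemoConfig {

    @AttributeDefinition(name = "Service URL")
    String service_url() default "http://localhost:4502";
}
```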

Actual Class:
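The component class then binds the OCD via @Designate and receives the typed configuration directly in activate(…). A sketch, assuming the hypothetical DemoConfig interface:

```java
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.metatype.annotations.Designate;

@Component(service = DemoConfigService.class)
@Designate(ocd = DemoConfig.class)
public class DemoConfigService {

    private String serviceUrl;

    @Activate
    protected void activate(DemoConfig config) {
        // Typed access: no PropertiesUtil, no string keys.
        serviceUrl = config.service_url();
    }

    public String getServiceUrl() {
        return serviceUrl;
    }
}
```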

Fig- OSGi Configuration in Felix Console

Migration of OSGi Config:
Migration of the OSGi config is quite tricky. There are two scenarios:

  1. You have created a new configuration using DS annotations and you want a config file for its default values.
  2. You are migrating an old custom config to the new style (DS annotations) and you already have a config file.

Scenario 1:
Suppose you have created your own custom config, e.g.:
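As a hypothetical example, suppose your new DS configuration interface is (an underscore in a method name maps to a dot in the property name):

```java
import org.osgi.service.metatype.annotations.AttributeDefinition;
import org.osgi.service.metatype.annotations.ObjectClassDefinition;

@ObjectClassDefinition(name = "Demo Mail Configuration")
public @interface MailConfig {

    @AttributeDefinition(name = "SMTP Host")
    String smtp_host() default "localhost";   // property name: smtp.host

    @AttributeDefinition(name = "SMTP Port")
    int smtp_port() default 25;               // property name: smtp.port
}
```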


So, your default configuration will be:
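For a component whose OCD defines, say, smtp_host() and smtp_port() methods (hypothetical names), the default config file is named after the component's PID and carries the dotted property names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- e.g. /apps/myproject/config/com.example.core.impl.MailServiceImpl.xml
     (a hypothetical component PID) -->
<jcr:root xmlns:sling="http://sling.apache.org/jcr/sling/1.0"
          xmlns:jcr="http://www.jcp.org/jcr/1.0"
          jcr:primaryType="sling:OsgiConfig"
          smtp.host="mail.example.com"
          smtp.port="{Long}587"/>
```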

Scenario 2:
Suppose you already have a default configuration and you are migrating to DS annotations, e.g.:

Config file:

So, your interface will be:
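Assuming the existing config file defines hypothetical properties smtp.host and smtp.port, the matching DS interface replaces each dot with an underscore so the old values keep binding:

```java
import org.osgi.service.metatype.annotations.AttributeDefinition;
import org.osgi.service.metatype.annotations.ObjectClassDefinition;

@ObjectClassDefinition(name = "Demo Mail Configuration")
public @interface MailConfig {

    // binds to the existing "smtp.host" property
    @AttributeDefinition(name = "SMTP Host")
    String smtp_host();

    // binds to the existing "smtp.port" property
    @AttributeDefinition(name = "SMTP Port")
    int smtp_port();
}
```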

Hope this blog helps you migrate from SCR to DS annotations.

Thank you!! Happy Learning!!



AEM Version: 6.2
Target Audience: AEM Developers


In AEM 6.2 workflows, we can trigger a workflow when a DAM asset is created, modified, or deleted within a given path. In this article, we will explore triggering workflows from our code based on events in the JCR.

Suppose you have a workflow that creates custom renditions of assets, in addition to the default AEM renditions, when the asset is under “/content/dam/ProjectName/images/”. You would set up two launchers to trigger this workflow: one with event type “Node Created” and one with “Node Modified”. We can also achieve the same functionality through code, without touching the GUI.


When assets are moved into a certain folder structure in DAM, trigger a workflow that creates a 100px X 100px thumbnail of our image.


Fig 1: Before Moving the asset, no custom thumbnail. Fig 2: Desired result after moving the asset, the new thumbnail.


The intuitive thought is that when an asset is moved, a new node is created in the new location and the old one is deleted. However, experience shows that AEM does not create a new node in the destination folder on a node move. We know this because the ‘jcr:created’ property does not change; AEM does not even change the last-modified date.


Fig 3: Creation Timestamp Before Moving the Asset. Fig 4: Creation Timestamp is the same after moving


Fig 5: Modification Timestamp Before Moving the Asset. Fig 6: Modification Timestamp is the same after moving.

What if we copy the asset?

On copying the asset, a new node (and a version of it) is created. This triggers the node-creation launcher.


Fig. 7: No versions before copying the asset. Fig. 8: Version created after copy-pasting the asset.


Event Listeners

AEM supports observation, which enables us to receive notifications of persistent changes to the workspace. A persisted change to the workspace is represented by a set of one or more events. Each event reports a single simple change to the structure of the persistent workspace in terms of an item added, changed, moved or removed. There are thus 7 possible event types at the JCR level:

  1. Node Added
  2. Node Moved
  3. Node Removed
  4. Property Added
  5. Property Changed
  6. Property Removed
  7. Persist

We connect with the observation mechanism by registering an event listener with the workspace. An event listener is a class implementing the javax.jcr.observation.EventListener interface that responds to the stream of events to which it has been subscribed. An event listener is added to a workspace with:
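Per the JCR 2.0 API, the registration call takes the listener plus a set of filters. A sketch of the call for this example (the session and listener variables are assumed to be obtained elsewhere):

```java
import javax.jcr.observation.Event;
import javax.jcr.observation.ObservationManager;

// Inside the registering component (sketch):
ObservationManager observationManager =
        session.getWorkspace().getObservationManager();

observationManager.addEventListener(
        listener,                 // our EventListener implementation
        Event.NODE_MOVED,         // bitmask of event types to listen for
        "/content/dam/images",    // only events under this path
        true,                     // isDeep: include the whole subtree
        null,                     // uuid: no filtering by node UUID
        null,                     // nodeTypeName: no node-type filter
        false);                   // noLocal: also receive this session's own events
```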

(A detailed explanation of each parameter is given with the code example in the package, as well as at the end of this article.) As defined by the EventListener interface, the listener must provide an implementation of the onEvent method:
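A sketch of an onEvent implementation that starts a workflow for each moved node (the workflow model path is a placeholder for your own model, and obtaining the WorkflowSession is omitted for brevity):

```java
import javax.jcr.observation.Event;
import javax.jcr.observation.EventIterator;
import javax.jcr.observation.EventListener;

import com.adobe.granite.workflow.WorkflowSession;
import com.adobe.granite.workflow.exec.WorkflowData;
import com.adobe.granite.workflow.model.WorkflowModel;

public class AssetMovedListener implements EventListener {

    // Obtained elsewhere, e.g. by adapting a service ResourceResolver.
    private WorkflowSession workflowSession;

    @Override
    public void onEvent(EventIterator events) {
        while (events.hasNext()) {
            Event event = events.nextEvent();
            try {
                // Path of the node that was moved into /content/dam/images
                String path = event.getPath();

                WorkflowModel model = workflowSession.getModel(
                        "/var/workflow/models/custom-thumbnail"); // placeholder
                WorkflowData data =
                        workflowSession.newWorkflowData("JCR_PATH", path);
                workflowSession.startWorkflow(model, data);
            } catch (Exception e) {
                // log and continue with the next event
            }
        }
    }
}
```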

When an event occurs that falls within the scope of the listener, the repository calls the onEvent method, invoking our logic which processes and responds to the event. In our case, we will register an event listener to listen for “Node Moved” events under “/content/dam/images”, so that when an asset is moved to that folder, our workflow is triggered.


When the component is activated, the activate(…) method is called. It contains a call to ObservationManager.addEventListener(…) to register the event listener. The deactivate(…) method contains the logic for deregistering the event listener and is triggered when the bundle is stopped.

When the relevant event occurs, the onEvent(…) method is called, which contains logic for processing the event. In our case, we trigger a workflow.

The following is the relevant code from

Download this code (including the workflow):

Build it using

N.B.: Creating a workflow is not part of this tutorial, and therefore a ready workflow has been provided in the code package. However, if you want to learn to create workflows, here is an excellent resource:



Adobe Consulting Services. (2018, March 20). acs-aem-samples/ at master · Adobe-Consulting-Services/acs-aem-samples. Retrieved from Github:

Day Software AG. (2018, March 20). JCR 2.0: 12 Observation (Content Repository for Java Technology API v2.0). Retrieved from Adobe Docs:

How to Change Data Type (Typecast) in AEM, Use @TypeHint

Problem Statement:

How to convert a variable from one data type to another data type in AEM.

Consider a use case where a number field is used in a dialog and its data is later used in Sightly (HTL) for numeric comparison operations. A problem comes up because the data is stored in String format, and comparisons can only be made between elements of the same data type.


@TypeHint: It is used to explicitly define the data type of a property. It can be used as follows:


I have a number field in my dialog with the name ‘sponsoredPosition’. By default its value is stored in String format, and I want it to be stored as a Long instead.


In the component dialog, add a node of type nt:unstructured parallel to the ‘sponsoredPosition’ node (the one whose data type needs to be changed).

In the new node, add the following properties:

  1. ignoreData {Boolean} = true
  2. value {String} = Long
  3. name {String} = ./sponsoredPosition@TypeHint
  4. sling:resourceType {String} = granite/ui/components/foundation/form/hidden


(a) ignoreData, as the name suggests, tells AEM that the value of this field should not be stored.

(b) In the value field, define the data type in which you want your data to be stored.

(c) In the name field, add the ‘@TypeHint’ suffix to the property name of the original node whose value was stored as a String (by default).

(d) The hidden resource type is used to hide this field in the dialog.
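In a Touch UI dialog’s .content.xml, the pair of nodes described above might look like this (the node names are illustrative):

```xml
<sponsoredPosition
    jcr:primaryType="nt:unstructured"
    sling:resourceType="granite/ui/components/foundation/form/numberfield"
    fieldLabel="Sponsored Position"
    name="./sponsoredPosition"/>
<sponsoredPositionTypeHint
    jcr:primaryType="nt:unstructured"
    sling:resourceType="granite/ui/components/foundation/form/hidden"
    name="./sponsoredPosition@TypeHint"
    value="Long"
    ignoreData="{Boolean}true"/>
```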



This blog is intended to provide technical AEM users with a solution to a frequently asked question and tactical training on the topic: how to typecast in AEM.


Adding @TypeHint solved the issue; the value is now stored in Long format instead of String.

Interested in more training and support for your organisation using AEM CMS? Request a consultation to discuss Argil DX managed services.

Performance Troubleshooting

During the initial development phase of a website, not much attention is given to the performance of the website, i.e. how the website will respond to millions of requests. Performance means the time your website takes to respond to a visitor’s request. In the later stages of implementation, you need to optimise the website to meet your performance goals. In this blog, we will discuss some measures to reduce page load time so as to provide a good experience to users.

Performance Optimization Methodology

Five rules should be followed to avoid performance issues.

  1. Planning for Optimization

    A project should first be soft-launched to a limited audience in order to gather real-life experience. Only when the website is live do you experience real load on your system.

  2. Simulate Reality

    If there are performance issues after the launch of the website, it means the load and performance tests did not simulate reality closely enough. “Real” means real traffic, real content size and real code.

  3.  Establish Solid Goals

    Establishing good performance goals is a tough task. It is often best to collect real life logs and benchmarks from a comparable website.

  4. Stay Relevant

    Only optimise one thing at a time. If you try to do things in parallel without validating the impact of each optimisation, it will later be difficult to figure out which optimisation actually helped.

  5. Agile Iteration Cycles

    Performance tuning is an iterative process that involves measuring, analysis, optimisation and validation, until the goal is reached.

Page Loading Time
  1. 70% of the requests for pages should be responded to in less than 100ms.
  2. 25% of the requests for pages should get a response within 100ms-300ms.
  3. 4% of the requests for pages should get a response within 300ms-500ms.
  4. 1% of the requests for pages should get a response within 500ms-1000ms.
  5. No pages should respond slower than 1 second.
Performance Guidelines
  1. Mostly, dispatcher caching inefficiency and the use of queries in normal display templates lead to performance issues.
  2. JVM- and OS-level tuning does not impact performance much; therefore, it should be performed at the very end of the optimisation cycle.
Performance Monitoring

To optimise performance, you need to monitor various attributes of the instance and its behaviour.

  1. A backup plan and a disaster recovery plan should be in place.
  2. An error tracking system (like Bugzilla or JIRA) should be available for reporting problems.
  3. File systems, log files and replication agents should be monitored.
  4. Regularly purge workflow instances.
  5. To analyse a bigger request.log, it is recommended to use rlog.jar, which allows you to sort and filter by response time.
  6. AEM includes various helper tools located in <cq-installation-dir>/crx-quickstart/opt/helpers.
  7. One of these, rlog.jar, can be used to quickly sort request.log so that requests are displayed by duration, from longest to shortest.
Basic Commands
  1. To open request.log in a terminal: less request.log or more request.log
  2. java -jar ../opt/helpers/rlog.jar -xdev request.log | less

  • This command tells you the number of requests parsed. Requests are sorted from longest to shortest.
  • By looking at the result, we can figure out that “/cust1.cust.html?page text/html” is not cached,
    because its request is coming to the server again and again.
  • “/companyservice/contact.html text/html” is a cacheable page. Even though it is cacheable, it is taking 44s.
  • Non-cacheable pages are taking a lot of time: either the publisher is unresponsive, or these pages are requesting something from a backend server that is down.

3. Now we will see how many times the “/companyservice/contact.html text/html” page is rendered.

  • java -jar ../opt/helpers/rlog.jar -xdev request.log | grep “/companyservice/contact.html ” | wc -l
  • This cacheable page is rendered 122 times a day, which is a huge number; it is also impossible to invalidate the cache 122 times a day. Therefore, there is some issue with the caching configuration.

4. Here we pipe the result into a text file, ‘demo.txt’. Now we can sort the data by date and get to know what is actually happening on the publisher.

  • java -jar ../opt/helpers/rlog.jar -xdev request.log > demo.txt
Calculating Cache Ratio

1. Cache ratio means the share of requests coming to your system that are handled by the cache.

2. Dispatcher Cache Ratio

  • (Total Number of Requests – Number of Requests on Publisher )/ Total Number of Requests
  • Remember, if you don’t have a 1:1 publisher/dispatcher pairing, you will need to add the requests from all dispatchers and publishers together to get an accurate measurement.

3. Publisher Cache Ratio

  • The numerator is the total number of cacheable page requests coming to the publisher:
  • (Total Publisher Requests – Total Non-cacheable Requests) / Total Publisher Requests

4. Adobe recommends a cache ratio of 90-95% for best performance.

5. The Dispatcher always requests the document directly from the AEM instance in the following cases:

  • If the HTTP method is not GET. Other common methods are POST for form data and HEAD for the HTTP header.
  • If the request URI contains a question mark “?”. This usually indicates a dynamic page, such as a search result, which does not need to be cached.
  • The file extension is missing. The web server needs the extension to determine the document type (the MIME-type).

6. Calculating Dispatcher Cache Ratio Using Access log and Request log

  • wc -l access.log (to get the total number of requests to the web server). Let this number be 129491.
  • java -jar ../opt/helpers/rlog.jar -xdev request.log | less (to get the number of requests on the publisher). Let this number be 58959.
  • Therefore, dispatcher cache ratio = (129491 – 58959) / 129491 = 54.5%

7. Calculating Publisher Cache Ratio

  • java -jar ../opt/helpers/rlog.jar -xdev request.log | less (to get the number of requests on the publisher). Let this number be 58959.
  • From this we will pull out the requests which are not cacheable, using:

    java -jar ../opt/helpers/rlog.jar -xdev request.log | grep -Ev “\?|login|POST” | wc -l

    Let this number be 26855.

  • Therefore, publisher cache ratio = (58959 – 26855) / 58959 = 54.5%
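The two cache-ratio calculations above can be sketched in a few lines of Java (the counts are the sample figures from this article):

```java
public class CacheRatio {

    /** Percentage of requests served from cache. */
    static double ratio(long totalRequests, long uncachedRequests) {
        return 100.0 * (totalRequests - uncachedRequests) / totalRequests;
    }

    public static void main(String[] args) {
        long webServerRequests = 129491; // wc -l access.log
        long publisherRequests = 58959;  // rlog.jar on request.log
        long nonCacheable      = 26855;  // grep -Ev "\?|login|POST"

        System.out.printf("Dispatcher cache ratio: %.1f%%%n",
                ratio(webServerRequests, publisherRequests)); // ~54.5%
        System.out.printf("Publisher cache ratio:  %.1f%%%n",
                ratio(publisherRequests, nonCacheable));      // ~54.5%
    }
}
```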

Thus, you can apply performance-enhancing mechanisms to your website, like caching the content and measuring page load time before its launch, to reduce the response time of your website and provide a good user experience.

Thank you for reading this blog!

Adobe Marketing Cloud

Marketing professionals have to speak the language of their customers. In order to reach them, they have to deliver the right content through the right channels to the right devices and provide a personalised experience. To achieve this, they need to quantify the facts with huge amounts of data.

Online presence must be consistent across devices, channels and languages to ensure a personalised experience. Marketers also need to efficiently manage and measure their social marketing activities and optimise their marketing spend on display advertising.

So is there a solution that can address these challenges?

Fortunately, there is… The Adobe Marketing Cloud

Leveraging the 8 solutions offered by Adobe Marketing Cloud, one can comfortably master digital marketing.

Let’s elaborate on these solutions one by one…

Experience Manager 

It’s quite difficult to understand the demands and meet the expectations of every single customer. Customers expect companies to provide relevant content on all the devices they use, and they are constantly switching from one digital device to another. They may be shopping on the website, looking at things on the mobile app, and when they go into a store they want all of those digital experiences to connect together.

So it’s important for brands to think about how to deliver that consistent continuous experience across all these devices.

Marketers have to manage several websites, which is quite a challenge, because a website should be adaptive to multiple devices and languages.

Content becomes inconsistent when assets aren’t organised. It is also difficult to properly optimise websites and tie in social media, which makes content less relevant to the customer’s web experience.

That’s where Adobe Experience Manager comes in.

Adobe Experience Manager helps marketers get everything under control. The Adobe solution provides a web content management system featuring a multilingual, what-you-see-is-what-you-get mode, so it becomes very easy to manage all the websites, apps and digital publications.

AEM has 5 components:

  1. Sites
  2. Assets
  3. Apps
  4. Forms
  5. Communities

Each of these is critical to the brand journey and to building out a comprehensive digital strategy.


People visiting your site have different intentions: some are there to buy, others are simply doing research, and some might be trying to get a job with you. Then you have your first-time visitors and returning ones. Some don’t know why they’re there and are looking for a way out.

Despite this, most websites and landing pages display exactly the same content to all visitors – no matter what they’re searching for or how they got there in the first place. These websites try to appeal to a wide range of visitors simultaneously, and thus excite very few.

Wouldn’t it be awesome to display different content and calls-to-action for different types of visitors – essentially making your visitors feel like the page speaks their language?
Well, there is a well-grounded, leading solution named ADOBE ANALYTICS,

which provides digital marketers with one place to measure, analyse, and optimise integrated data from all online initiatives across multiple marketing channels.

It provides marketers with actionable, real-time web analytics intelligence about digital strategies and marketing initiatives.

It works by looking at the data that Analytics provides about the visitor – things like location, keywords they searched for, ads they clicked, whether they’ve been to the site before, and buying history – and comparing that against a set of variables that you have put in place. It lets you target relevant content to different types or groups of visitors based on their behaviour or other variables.

Those variables could include:

  • Location – city, country, region
  • Device – iPhone, iPad, Android phone/tablet, Windows, Mac, Linux
  • Search keywords 
  • Visitor frequency – First, second, third, fifth time visitor?
  • Date and time of day, proximity to payday
  • Referring URL – where did they come from?
  • Customer history – have bought before, what, how much did it cost?
  • Session behaviour – navigation clicks, page views

There are many more variables which can be taken into account, like age, gender, etc. – the possibilities are pretty much endless, and it depends a lot on how much info you already have about that customer.


In this competitive era of digital marketing, not every visitor to the website will become a customer. Every marketing professional wants to communicate well with their visitors by understanding their needs and personal interests. Every customer expects a unique experience and personalised offers rather than one-sided communication from a website. In order to fulfil those expectations, a marketer requires an effective plan to optimise marketing expenditure and target a limited number of offers where they matter.

Adobe Target analyses customer wishes and demands using the data provided by Adobe Analytics, helps deliver relevant content, and integrates tightly with Adobe Social for enhanced social marketing. Now one can deliver the right products to the right targeted customers. This kind of automated behavioural targeting imparts exponential growth in the number of customers.

It also helps to dynamically create content based on the visitor’s profile and to suggest targeted recommendations to the user even before they finish typing their search queries. External data sources like order history can also be integrated to analyse the visitor’s profile, which further helps in optimised merchandising and targeted discounts. This dynamic navigation guarantees a perfect on-site experience.


It is very important to reach out to your customers in meaningful ways.

With Adobe Campaign you are able to integrate and deliver one-to-one campaigns, so all customers get a relevant and consistent experience across online and offline channels.

We come to understand each customer through an integrated profile, which is made up of purchasing history, social interactions, where they are from, and other information gathered by Analytics and pulled into Adobe Campaign.

Adobe Campaign uses this analytics data to create personalised campaigns according to the customer’s interests.

Reports in Adobe Campaign show you how things are going: you can see who is responding to your offers and how much revenue your campaigns are generating.

With Adobe Campaign you have a deep, real-time understanding of your customers, so you can design, integrate and automate cross-channel experiences and measure their success – all from one solution.


Social media managers are responsible for increasing brand reach and audience engagement through social, and for creating a consistent brand experience across channels and devices.

They must also ensure the right message cuts through the noise of social traffic to reach the right customers, respond to customer issues raised on social platforms, and connect all these social efforts to measurable business results.

How can one possibly monitor, manage, and measure so many social media channels while also quantifying the business value of social marketing?

Easy. All we need is Adobe Social.

Adobe Social enables marketers to scale social across the organisation; create and deliver relevant, engaging content across all social properties; capture and analyse all key sources of meaningful social data alongside integrated conversion metrics; leverage data to demonstrate social value and optimise interactions; and manage and strengthen customer relationships across social channels, all from one integrated platform.

Audience Manager

What problem are we trying to solve with Audience Manager?

Marketers have had multiple databases and technologies that they use to solve personalisation on each channel.

There are a lot of technologies involved: customers can be on web, mobile, email, or social. The challenge is that each database touches only its own channel, so marketers really didn't have a coherent or relevant conversation as a consumer moved across multiple touchpoints. This is the cause of disjointed experiences.

And this is something that can be addressed by a DMP.

A Data Management Platform (DMP) such as Audience Manager consolidates audience information from all available sources. It identifies, quantifies, and optimises high-value target audiences, which can then be offered to advertisers via an integrated, secure, privacy-friendly management system that works across all advertising distribution platforms.

A DMP consists of 4 key parts:

  1. Collect data – Gather data from all sources and let it sit in a single repository.
  2. Unify data into a profile – The DMP takes data from the different sources and unifies it back into a single profile.
  3. Create audiences – Using the various profile attributes, you can create audiences and audience segments.
  4. Push to marketing platforms – The value of creating these segments is the ability to reach the people who fall into them.
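The four steps above can be sketched in a few lines. This is a toy illustration; the event fields, segment rule, and platform stub are assumptions for the sketch, not Audience Manager's actual data model or API:

```python
# Toy sketch of the four DMP steps; field names and the segment rule
# are illustrative assumptions, not Audience Manager's real API.
from collections import defaultdict

# 1. Collect data: events from several sources land in one repository.
events = [
    {"source": "web",   "user_id": "u1", "page": "shoes"},
    {"source": "email", "user_id": "u1", "clicked": True},
    {"source": "web",   "user_id": "u2", "page": "home"},
]

# 2. Unify data into a profile keyed by a common user id.
profiles = defaultdict(dict)
for event in events:
    user = event["user_id"]
    profiles[user].update({k: v for k, v in event.items() if k != "user_id"})

# 3. Create an audience segment from profile attributes.
shoe_shoppers = [u for u, p in profiles.items() if p.get("page") == "shoes"]

# 4. Push the segment to a (stubbed) marketing platform.
def push_segment(platform, users):
    return f"pushed {len(users)} users to {platform}"

print(shoe_shoppers)                          # ['u1']
print(push_segment("ad-network", shoe_shoppers))
```

In practice, step 2 is the hard part: identities must be matched across devices and channels before profiles can be merged.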
Adobe Media Optimiser

Once building a great website is done, it is time to think about MONEY.

Marketers really need a larger return on their ad spend to engage more relevant audiences, drive brand awareness, and generate revenue.

But how should one distribute a media budget between display, search, and social?

What is the most profitable way to run a cross-channel advertising campaign?

Advertisers are trying to simplify their lives: they want more control over their campaigns and budgets, and they want to be more efficient with their ad buys.

With Adobe Media Optimiser we can answer all these questions and get the maximum return on our investments.

Using mathematical models and industry-leading algorithms, Adobe Media Optimiser precisely evaluates the situation, which enables us to adjust our strategies, goals, and budget in real time.

It also provides precise forecasts based on our goals and predicts the best channel mix for the budget.

With Adobe Media Optimiser, marketers can manage all of their campaigns on every platform:

search engines like Google and Bing, and social platforms like Facebook and Twitter.
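The budget question above can be sketched with a simple proportional model: split the total spend across channels according to each channel's expected return. The numbers and the allocation rule are made up for illustration and are not Adobe Media Optimiser's actual model:

```python
# Toy sketch of allocating a media budget in proportion to each
# channel's expected return; the figures are illustrative only,
# not Adobe Media Optimiser's actual optimisation model.

def allocate_budget(total, expected_roi):
    """Split `total` across channels proportionally to expected ROI."""
    weight_sum = sum(expected_roi.values())
    return {channel: round(total * roi / weight_sum, 2)
            for channel, roi in expected_roi.items()}

# Assumed expected return per unit spend for each channel.
roi = {"search": 3.0, "display": 1.0, "social": 2.0}
plan = allocate_budget(12000, roi)
print(plan)  # {'search': 6000.0, 'display': 2000.0, 'social': 4000.0}
```

A real optimiser would also account for diminishing returns per channel and forecast uncertainty, rather than a fixed linear ROI.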

So marketers can always rely on peak conversions and the right investments:

the right messages to the right people at the right time.

So in the end, not only will traffic on the website increase, but revenue will also grow profitably.


The way people watch TV and video has evolved. 

You need to personalize it.

Users want to jump in and be fully immersed in the experience. This means making it easy for each viewer to watch the TV shows, sports, and films they want, streamed securely and in high definition across different screens, wherever and whenever they choose.

Adobe Primetime, part of Adobe Marketing Cloud, helps deliver engaging experiences for viewers and subscribers, and more revenue from advertising and subscriptions.

Now it is possible to create engaging experiences on every device using Primetime.

Primetime allows you to deliver the right ads to the right audience, improving the effectiveness of your ad sales.


We all know the key to mastering customer experience is knowing the customer, and for this Adobe Marketing Cloud is the perfect fit. My vote goes to Adobe Marketing Cloud.

What about you?

Credits: Adobe Sites