Datadog dd-java-agent Integration


Has anyone done anything with Lucee and Datadog at all?

We've installed the dd-java-agent.jar file in the lucee/lib directory and updated the JVM arguments so that it's loaded as a javaagent: -javaagent:lib/dd-java-agent.jar.
On startup, the console output shows the agent loading and running:

[main] INFO datadog.trace.agent.ot.DDTraceOTInfo - dd-trace - version: 0.12.0~8bed6011
[main] INFO datadog.trace.agent.ot.DDTracer - New instance: DDTracer-13d9b21f{ serviceName=jedi-dd-test, writer=DDAgentWriter { api=DDApi { tracesEndpoint=http://localhost:8126/v0.3/traces } }, sampler=AllSampler { sample=true }, defaultSpanTags={}}
[main] INFO datadog.trace.agent.tooling.VersionLogger - dd-trace-ot - version: 0.12.0~8bed6011
[main] INFO datadog.trace.agent.tooling.VersionLogger - dd-trace-api - version: 0.12.0~8bed6011
[main] INFO datadog.trace.agent.tooling.VersionLogger - dd-java-agent - version: 0.12.0~8bed6011
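For anyone reproducing this: a common way to wire the flag into a Tomcat-based Lucee install is via setenv.sh. The file location and jar path below are assumptions about this particular setup, not something from the post above:

```shell
# Hypothetical tomcat/bin/setenv.sh -- adjust the jar path to your install.
# Tomcat sources this file at startup and passes CATALINA_OPTS to the JVM.
CATALINA_OPTS="$CATALINA_OPTS -javaagent:/opt/lucee/lib/dd-java-agent.jar"
export CATALINA_OPTS
```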

However, whenever we attempt to run any MS SQL calls (using the Microsoft vendor driver), we always get:

lucee.runtime.exp.NativeException: io/opentracing/Scope
	at lucee.runtime.type.QueryImpl.execute(...)
	at lucee.runtime.type.QueryImpl.<init>(...)
	at lucee.runtime.tag.Query.executeDatasoure(...)
	at lucee.runtime.tag.Query._doEndTag(...)
	at lucee.runtime.tag.Query.doEndTag(...)
	at jedi.index_cfm$...
	at lucee.runtime.PageContextImpl._doInclude(...)
	at lucee.runtime.PageContextImpl._doInclude(...)
	at lucee.runtime.listener.ModernAppListener._onRequest(...)
	at lucee.runtime.listener.MixedAppListener.onRequest(...)
	at lucee.runtime.PageContextImpl.execute(...)
	at lucee.runtime.PageContextImpl._execute(...)
	at lucee.runtime.PageContextImpl.executeCFML(...)
	at lucee.runtime.engine.Request.exe(...)
	at lucee.runtime.engine.CFMLEngineImpl._service(...)
	at lucee.runtime.engine.CFMLEngineImpl.serviceCFML(...)
	at lucee.loader.engine.CFMLEngineWrapper.serviceCFML(...)
	at lucee.loader.servlet.CFMLServlet.service(...)
	at javax.servlet.http.HttpServlet.service(...)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(...)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(...)
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(...)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(...)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(...)
	at org.apache.catalina.core.StandardWrapperValve.invoke(...)
	at org.apache.catalina.core.StandardContextValve.invoke(...)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(...)
	at org.apache.catalina.core.StandardHostValve.invoke(...)
	at org.apache.catalina.valves.ErrorReportValve.invoke(...)
	at org.apache.catalina.valves.AbstractAccessLogValve.invoke(...)
	at org.apache.catalina.core.StandardEngineValve.invoke(...)
	at org.apache.catalina.connector.CoyoteAdapter.service(...)
	at org.apache.coyote.http11.AbstractHttp11Processor.process(...)
	at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(...)
	at$SocketProcessor.doRun(...)
	at$
	at java.util.concurrent.ThreadPoolExecutor.runWorker(...)
	at java.util.concurrent.ThreadPoolExecutor$
	at org.apache.tomcat.util.threads.TaskThread$
Caused by: java.lang.NoClassDefFoundError: io/opentracing/Scope
	... 44 more
Caused by: java.lang.ClassNotFoundException: io.opentracing.Scope not found by [50]
	at org.apache.felix.framework.BundleWiringImpl.findClassOrResourceByDelegation(...)
	at org.apache.felix.framework.BundleWiringImpl.access$200(...)
	at org.apache.felix.framework.BundleWiringImpl$BundleClassLoader.loadClass(...)
	at java.lang.ClassLoader.loadClass(...)
	... 44 more

It's complaining about a ClassNotFoundException for io.opentracing.Scope.

I can manually call createObject("java", "io.opentracing.Scope") and it returns an object, but the SQL calls still fail.
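A side note on why createObject() can succeed while the driver fails: the agent injects io.opentracing onto the bootstrap classpath, so any classloader that delegates to the bootstrap sees it, while a loader that resolves only through its own wiring (like a Felix bundle with no import for the package) does not. A minimal standalone Java sketch of that difference, unrelated to Lucee itself:

```java
import java.net.URL;
import java.net.URLClassLoader;

public class LoaderDemo {
    // Returns true if `name` is resolvable through a classloader whose only
    // parent is the bootstrap loader -- roughly what a bundle sees for a
    // package it neither contains nor imports.
    static boolean visibleViaBootstrapOnly(String name) {
        try (URLClassLoader isolated = new URLClassLoader(new URL[0], null)) {
            Class.forName(name, false, isolated);
            return true;
        } catch (ClassNotFoundException | java.io.IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Bootstrap-classpath classes are visible from any delegating loader:
        System.out.println(visibleViaBootstrapOnly("java.util.ArrayList")); // true
        // Classes known only to the application classloader are not:
        System.out.println(visibleViaBootstrapOnly("LoaderDemo")); // false
    }
}
```

So the question is why Felix isn't delegating the io.opentracing package down to the bootstrap when the driver bundle asks for it.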

Has anyone got any ideas?


I’d be interested to hear about your progress with this. We use Datadog for server / container metrics but haven’t yet tried the Java agent.

This may not be particularly helpful, but out of curiosity have you tried the jTDS MSSQL driver?

The MS SQL driver lives in an extension (OSGi bundle), so I guess it's possible that it can't see the opentracing packages. Can you also try copying the JAR into /usr/local/tomcat/lib/ to see if that helps? Otherwise I'm not sure; it might need some input from @micstriit 🙂

I've basically put it on the back-burner at the moment. I believe it's OSGi-related.

I can install and run it on Lucee 4 Express, but not on Lucee 5 Express.

I've tried the MS vendor, jTDS, and Postgres drivers; each reports the same issue.

On a Datadog channel it’s been suggested that the issue is:

The problem is likely that felix is not delegating to the bootstrap classpath correctly. Maybe there’s a way to configure what packages should be on the bootstrap classpath?
The agent adds those classes to the bootstrap to ensure they’re accessible on any classpath that delegates to the bootstrap.
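One standard knob for that is the OSGi boot delegation property, which tells Felix to resolve the listed packages through the boot classloader instead of bundle wiring. Where Lucee exposes Felix configuration varies, so treat the file location as an assumption; the package list is illustrative:

```properties
# Felix/OSGi framework property (e.g. in a file):
# delegate these packages straight to the boot classloader.
org.osgi.framework.bootdelegation=io.opentracing.*,datadog.*
```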

However, I'm unsure how to experiment with that.

I attempted to convert dd-java-agent.jar to an OSGi bundle and include it in Lucee's bundles directory, but still had the same issue.
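For context, "converting to an OSGi bundle" here means adding OSGi headers to the jar's MANIFEST.MF so Felix can resolve and wire it; a rough sketch of the kind of headers involved (symbolic name, version, and exports are illustrative, not the actual values used):

```
Bundle-ManifestVersion: 2
Bundle-SymbolicName: dd.java.agent
Bundle-Version: 0.12.0
Export-Package: io.opentracing
```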

@micstriit Do you have any advice on how to resolve this?

I understand that Lucee is supposed to have a datadog integration built in now. @Zackster has it been pushed to the current release and, if so, do you have any more information on how to set this up?

I’m chasing up the info.

There's a custom logging layout for Datadog (no admin UI yet), available in 5.3.9 and in later 5.3.8 SNAPSHOTs:

<logger appender="resource" appender-arguments="path:{lucee-config}/logs/mapping.log" layout="datadog" level="error" name="mapping"/>

The agent gets installed via JVM args and has the following options
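For illustration, a typical invocation sets the tracer's system properties alongside the -javaagent flag (or the matching DD_* environment variables). The service/env values below are placeholders, and the property names follow the current Datadog Java tracer docs; very old agents (the 0.12.x era above) used different names such as

```shell
# Sketch only: adjust paths and values for your environment.
JAVA_OPTS="$JAVA_OPTS -javaagent:/path/to/dd-java-agent.jar"
JAVA_OPTS="$JAVA_OPTS -Ddd.service=my-lucee-app -Ddd.env=staging"
JAVA_OPTS="$JAVA_OPTS -Ddd.agent.host=localhost -Ddd.trace.agent.port=8126"
export JAVA_OPTS
```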


Tracking deets here

I’m not sure what to do with this info. Hopefully, it will be worked into the admin UI soon. I could really use this right now…

The only aspect which will be added (soon) to the admin UI is assigning the datadog layout.

As the DD agent has to be configured at the JRE level using the above parameters, Lucee doesn’t provide an admin interface for such parameters.

The JIRA task has further links explaining that configuration

Thank you. I’m denser than most - a good real-world example is worth 1000 pictures for folks like me if you can spare a few extra minutes.

I have literally provided you with a real world example of the config required!

If you can’t spare a few minutes to try it out yourself…

From where do these values come? To what do they pertain? How are they populated?

Sorry, see the task in JIRA for updated info.