How We Instrumented Tracing for Vercel AI SDK In Langtrace

Rohit Kadhe

Software Engineer

Aug 6, 2024

Introduction

We faced an interesting challenge while adding support for tracing the Vercel AI SDK to Langtrace: the Vercel AI SDK module is immutable, so traditional monkey patching fails. This blog post walks you through our journey, from initial attempts using proxy objects to our final, elegant solution leveraging Webpack.

Understanding the Challenge

What is Monkey Patching?

Monkey patching refers to modifying or extending code at runtime without altering the original source code. It's a powerful technique for adding or changing functionality dynamically. In JavaScript and TypeScript, this typically means reassigning or wrapping functions exported by a module.
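As a quick illustration, here is a minimal sketch of monkey patching a function on a mutable object (the mathUtils object is hypothetical, purely for illustration):

const mathUtils = {
  add: (a: number, b: number): number => a + b
};

// Keep a reference to the original, then replace it with a wrapped version.
const originalAdd = mathUtils.add;
mathUtils.add = (a: number, b: number): number => {
  console.log(`add called with ${a}, ${b}`); // added behaviour
  return originalAdd(a, b); // delegate to the original
};

console.log(mathUtils.add(2, 3)); // logs the call, then prints 5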

Initial Attempt: Proxy Objects

Proxy objects are a JavaScript feature, fully available in TypeScript, that lets you create an intermediary to control interactions with another object. This can be useful for adding behaviour to existing objects without altering their structure.

Here’s a simplified example:

const target = {
  message: "Hello, World!"
};

const handler = {
  get: (obj, prop) => {
    if (prop === "message") {
      return `${obj[prop]} - intercepted by proxy`;
    }
    return obj[prop];
  }
};

const proxy = new Proxy(target, handler);

console.log(proxy.message); // Output: Hello, World! - intercepted by proxy

Initially, we attempted to use proxy objects to instrument the Vercel AI SDK. The idea was to return a proxy in the patch function, which would then be used by OpenTelemetry. Here’s a snippet of the code:

init (): Array<InstrumentationModuleDefinition<any>> {
  const module = new InstrumentationNodeModuleDefinition<any>(
    'ai',
    ['*'],
    (moduleExports, moduleVersion) => {
      diag.debug(`Patching Vercel AI SDK version ${moduleVersion}`);
      const proxy = this._patch(moduleExports, moduleVersion);
      return proxy;
    },
    (moduleExports, moduleVersion) => {
      diag.debug(`Unpatching Vercel AI SDK version ${moduleVersion}`);
      if (moduleExports !== undefined) {
        this._unpatch(moduleExports);
      }
    }
  );
  return [module];
}
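For context, a proxy-based _patch might wrap every exported function in an OpenTelemetry span. The sketch below is illustrative only, not Langtrace's actual implementation:

import { trace } from '@opentelemetry/api';

// Illustrative sketch: return a Proxy whose function properties are wrapped in spans.
function patchWithProxy (moduleExports: any): any {
  const tracer = trace.getTracer('langtrace-vercel-ai');
  return new Proxy(moduleExports, {
    get (target, prop, receiver) {
      const original = Reflect.get(target, prop, receiver);
      if (typeof original !== 'function') return original;
      return function (this: unknown, ...args: unknown[]) {
        // Start a span named after the SDK function being invoked.
        return tracer.startActiveSpan(String(prop), (span) => {
          try {
            return original.apply(this, args);
          } finally {
            span.end(); // async calls would instead end the span when the promise settles
          }
        });
      };
    }
  });
}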

The Problem with Next.js

While the proxy approach worked well in a Node.js environment, it broke down in Next.js. The root cause? The difference between CommonJS modules and ES modules.

CommonJS vs. ES Modules

  • CommonJS: Used primarily in Node.js, where modules are loaded synchronously using require.

  • ES Modules: Used in modern JavaScript and TypeScript, including Next.js, where modules are loaded asynchronously using import.

OpenTelemetry doesn’t automatically wrap ES modules during import, necessitating manual patching. However, manual patching only works if the module is mutable, which brought us back to our initial problem.
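The mutability difference is easy to demonstrate. Assigning to a property of a CommonJS exports object works, while ES module namespace objects are immutable by specification. A sketch (some-module, someExport, and patchedFunction are hypothetical, and the two styles would normally live in separate files):

const patchedFunction = (): void => { /* instrumented replacement */ };

// CommonJS: the exports object is a plain, mutable object.
const cjsModule = require('some-module');
cjsModule.someExport = patchedFunction; // works: the property can be reassigned

// ES modules: namespace objects are immutable by specification.
import * as esmModule from 'some-module';
(esmModule as any).someExport = patchedFunction; // throws TypeError at runtime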

A New Approach: Manual Patching with a Wrapper

Taking a step back, we realized that a simple yet inelegant solution was to wrap the original module and have users import the wrapper instead. This looked something like this:

import ai from '@langtrace-module-wrappers/ai';

While this worked, it required users to remember the alias, complicating their existing codebase. There had to be a better way!

Webpack to the Rescue

Building on the idea of a wrapper, we turned to Webpack, which allows writing plugins that plug into different stages of the compilation process. Here’s how we did it.

Webpack Configuration

Users add the following to their Next.js Webpack config:

import { ModuleAlias } from '@langtrase/typescript-sdk/dist/webpack/plugins/ModuleAlias.js';

const nextConfig = {
  webpack: (config, { isServer }) => {
    if (isServer) {
      config.resolve.plugins = [
        ...(config.resolve.plugins || []),
        new ModuleAlias(process.cwd())
      ];
      config.module.rules.push({
        loader: "node-loader",
        test: /\.node$/,
      });
      config.ignoreWarnings = [{ module: /opentelemetry/ }];
    }
    return config;
  },
};

export default nextConfig;
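Note that the alias is only registered for the server build (the isServer check): the OpenTelemetry instrumentation runs in Node.js, so client bundles are left untouched.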

ModuleAlias Plugin

Here’s the code for the ModuleAlias plugin. It hooks into Webpack’s module resolution: imports of ai from user code are redirected to the SDK’s wrapper module, while the wrapper’s own import of ai resolves to the real package, avoiding a circular redirect:

import { Vendors } from '@langtrase/trace-attributes';
import { resolve } from 'path';
import { Resolver } from 'webpack';

export class ModuleAlias {
  supportedModuleAliases: string[];
  cwd: string;

  constructor(cwd: string) {
    this.supportedModuleAliases = ['ai'];
    this.cwd = cwd;
  }

  apply(resolver: Resolver): void {
    // Intercept module requests before Webpack resolves them.
    resolver.getHook('before-resolve').tapAsync(this.constructor.name, (request, _context, callback) => {
      if (this.supportedModuleAliases.includes(request.request as string)) {
        const modulePath = request.path;
        if (typeof modulePath === 'string' && modulePath.includes('node_modules/@langtrase/typescript-sdk')) {
          // The request originates from inside the SDK (i.e. from the wrapper itself),
          // so resolve 'ai' to the real package to avoid a circular redirect.
          if (request.request === Vendors.VERCEL) {
            request.request = resolve(this.cwd, `node_modules/${Vendors.VERCEL}`);
          }
        } else {
          // The request comes from user code: redirect 'ai' to the SDK's wrapper module.
          if (request.request === Vendors.VERCEL) {
            request.request = resolve(this.cwd, `node_modules/@langtrase/typescript-sdk/dist/module-wrappers/${Vendors.VERCEL}.js`);
          }
        }
      }
      callback();
    });
  }
}

Wrapping the Module

The magic happens when the wrapper module is imported:

const originalModule = require('ai'); // load the real Vercel AI SDK
const ai = Object.assign({}, originalModule); // shallow-copy it into a plain, mutable object

module.exports = ai; // CommonJS consumers
export default ai; // ES module consumers

Because the wrapper is a plain object created with Object.assign, its properties can be freely reassigned. The module is now effectively mutable and can be patched, with no change in functionality from the original module.
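With the wrapper in place, the instrumentation can simply overwrite its properties. A hedged sketch (generateText is a real Vercel AI SDK export, but the patching shown here is illustrative, not Langtrace's actual instrumentation):

// Illustrative only: overwrite a function on the now-mutable wrapper object.
const ai = require('ai'); // resolves to the wrapper, thanks to the ModuleAlias plugin
const originalGenerateText = ai.generateText;

ai.generateText = async (...args: any[]) => {
  // start a span and record request attributes here
  try {
    return await originalGenerateText(...args);
  } finally {
    // end the span here
  }
};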

Conclusion

With this setup, every time a user imports the Vercel AI SDK, they get the wrapper module instead, seamlessly instrumented for tracing. This solution ensures minimal disruption to the existing codebase while providing the desired functionality.

For a full example and detailed documentation, visit our Langtrace Documentation.

Final Thoughts

Instrumenting the Vercel AI SDK for tracing with OpenTelemetry was a challenging yet rewarding experience. By leveraging Webpack and creating a custom wrapper, we achieved a seamless solution that integrates effortlessly with existing codebases. We hope this journey and solution inspire you in your tracing endeavours.

Feel free to reach out with any questions or feedback. Happy tracing!
