Long-lived token support

I’m trying to integrate the simplest possible flow into our CRM. I want to update a Contact when a certain event happens in the system.

Infusionsoft is making this so terribly difficult.

Is it really that hard to support long-lived access tokens, or at least refresh tokens, in your OAuth implementation? It is insane that I have to build out a whole token storage and refresh mechanism just to integrate your service into ours. It’s like you forgot that some people build custom integrations that won’t be shared on your Marketplace.

Judging by the number of other people who have already complained about this, I just can’t understand why such a simple feature does not already exist.


Good afternoon, @Igor_Rinkovec!

I’m going to refer back to an earlier post, as this question comes up occasionally:

Hello @TomScott,
my request, however, is not against OAuth2. OAuth2 is awesome. What is not awesome is not having the ability to generate a long-lived refresh or access token.

Imagine the scenario:
I have an integration that does something with your API. To make that work I need to implement token storage and a refresh chain: the chain is started via the OAuth authorization flow, then continued using refresh tokens as the tokens expire.

To start the authorization flow on a backend system without a UI, I have to manually go to the authorization URL, get the code and somehow insert it into my system so it can continue the refresh token chain.

Now, every time the refresh token expires, I have to manually enter a new code into the system?! I might as well never have automated the process if it is still blocked by human interaction.

What happens if, because of a race condition, the token gets refreshed twice, but the first token ends up persisted? Now I also have to worry about thread safety for something as simple as updating a freaking contact in a CRM.

What OAuth2 service providers usually (always) do is give administrators a way of generating long-lived tokens for apps that are not public. That way we don’t have to worry about all of the above for something as simple as a cronjob.


@TomScott, you and your team need to have a discussion to work out how to resolve these ongoing OAuth issues.

You mention that the Legacy API Key has many failings. You do realize that your development team built the original API key mechanism, so you could easily fix all of those problems. As far as I know the API Key was introduced in 2006, and it was not until 2014 (8 years later) that OAuth authorization was introduced. Now it’s 2018, which means the Legacy API Key has been in use for 12 years and OAuth for 4 years.

The solution to most of the Legacy API Key issues that you reported could be achieved if you gave clients the ability to create multiple API Keys. If a client generates a new API Key, they can give it to a developer or use it in a particular piece of software. If they need more, they can go back to their account and repeat the process. If they need to deauthorize an API Key, they can just delete that key. In regards to the other issues, those are things you can resolve, but I do not see how they have been an issue after all these years.

If you have issues, you do not let them carry on for years and years, and then introduce something new hoping it will resolve things. Actually, you have gone from one set of issues (which could be resolved) to another set of issues with OAuth (which you cannot easily resolve now that you use a 3rd-party service).

In regards to OAuth, you think it’s perfect, reliable and easier to use. Sorry, but I disagree with that. Using OAuth makes things more complicated to develop and less reliable as well. I can understand that from your point of view you want to increase security, but did it have to take 8 years to come to that point? If it was an issue, it should have been resolved a long time ago, before 1000s of API scripts had been developed.

I need to point out that the Legacy API Key works in ALL software environments, be that a Web Script, Desktop Application, WordPress Plugin, Spreadsheet Macros, etc. But using OAuth generally requires the use of a Web Browser to authorise the connection, unless you go down the manual route. OAuth adds more complexity because you have to deal with the Tokens, which generally require a Cron Job to keep refreshing the Access Token. There are developers who make Desktop Applications and WordPress Plugins who would have to resort to developing Proxy Servers to deal with the Tokens. As you can see, those sorts of complications do not help things, and can actually limit creativity.

You do realise that OAuth is a framework and not a protocol. Quoting Wikipedia: “Because OAuth 2.0 is more of a framework than a defined protocol, one OAuth 2.0 implementation is less likely to be naturally interoperable with another OAuth 2.0 implementation. Further deployment profiling and specification is required for any interoperability.”
Unfortunately this has led to companies creating dozens of variations of OAuth, which range from the Good to the Bad to the Ugly. I have seen other OAuth implementations whose Access Token lasts forever, or 12 months, or 2 weeks, but usually the SDK has been designed to deal with refreshing the token. Unfortunately the Infusionsoft SDK does none of that, and the developer has to deal with it.

As Igor pointed out, what happens when a Race Condition occurs, which can happen? What happens if the server went offline or was rebooted when the Cron Job was meant to run to refresh the Access Token? Then the script will stop working and would need to be re-authorised again. How would clients feel if that occurred on a Friday night, and no one was able to fix it until after the weekend? In comparison, using the Legacy API Key, that scenario would not be an issue.

In regards to moving forward, please explain why the New Infusionsoft interface still supports the Legacy API Key. If you were moving forward, then that section would have been removed when the new interface was introduced.


@Igor_Rinkovec and @Pav, there is a lot of stuff in this post, and I will try to address most of it. I want to first say we are all on the same team. We understand some of the frustration. We are trying to address some of these issues, and some will not be addressed. No one on our team has claimed that our implementation is perfect. There are always trade-offs. Usually the two levers are security and convenience. In our case we have constraints on what we can and can’t do. There are several reasons why certain decisions have been made and certain actions taken. These include various things like PCI compliance, internal architectural constraints, staffing, budget, internal priorities, etc.

With that being said, here is what I have to say about a few of the issues in this thread. I am going to try and be thorough for anyone else that might come by this thread later.

Refresh Tokens

  • Refresh Tokens are single use but long lived (6 months)
  • When a refresh token is used the response includes a new refresh token that is good for an additional 6 months
  • This allows for indefinite refreshes with zero user interaction after the initial authorization.
  • Threading. This is part of life as a developer. There are a few options to handle the scenario of refreshing tokens in a multi-threaded environment.
    • Synchronize the call to get the token from your own storage. If it is expired then refresh the token inline. Then persist it. (This is my preferred approach)
    • If you don’t want to worry about threading you can build in some error handling. If two threads attempt to refresh the token one will succeed and one will fail. In the case of failure just refetch the token from storage (it should be the new one).
    • Cron - Refresh tokens in the background. If you want to protect against a server shutdown then you need to make your cron more resilient. We suggest that if you go the cron route then that cron needs to run often and refresh tokens ahead of expiration. For example run the cron every hour and refresh all tokens that will expire within 6 hours. This will give you 6 tries before the token actually expires.
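The first option above (synchronize the call, refresh inline, then persist) can be sketched roughly as follows. This is a hedged illustration, not Infusionsoft code: the in-memory `stored` object and the `providerRefresh` function are stand-ins for a real database and the real token-endpoint request.

```javascript
// In-memory stand-ins for persistent storage and the provider call.
let stored = { accessToken: 'old', refreshToken: 'r0', expiresAt: 0 };
let refreshCount = 0; // for demonstration only

async function providerRefresh(refreshToken) {
  // Stand-in for POSTing grant_type=refresh_token to the provider.
  refreshCount += 1;
  return {
    accessToken: 'access-' + refreshCount,
    refreshToken: 'refresh-' + refreshCount,
    expiresAt: Date.now() + 24 * 60 * 60 * 1000, // 24-hour access token
  };
}

let refreshInFlight = null;

async function getAccessToken() {
  if (stored.expiresAt > Date.now() + 60_000) {
    return stored.accessToken; // still valid, with a 60-second margin
  }
  // Serialize refreshes: concurrent callers share one promise, so the
  // single-use refresh token is only spent once.
  if (!refreshInFlight) {
    refreshInFlight = providerRefresh(stored.refreshToken)
      .then((fresh) => {
        stored = fresh; // persist the new pair before releasing
        return fresh.accessToken;
      })
      .finally(() => { refreshInFlight = null; });
  }
  return refreshInFlight;
}
```

With this pattern, any number of concurrent callers trigger exactly one refresh and the rest await the same promise. In a multi-process deployment you would need a shared lock (a database row lock, for example) instead of an in-process promise.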

User Interaction

  • The Authorization Code Grant is intended for Infusionsoft users authorizing 3rd parties.
  • We understand that the use case of an app owner wanting API access to their own application is not ideal for Authorization Code Grants.
    • In these scenarios there are many ways of generating the initial tokens.
      • Put yourself through the authorization
      • Use of 3rd party tools, such as Postman, can aid in running this generation. It is what we use in-house.
      • Generate a token using Account Central via API Access (please note that there is an open known defect with Partners generating tokens using this method).
      • Personal Access Tokens will be shipping very shortly. These will still be standard OAuth tokens that need to be refreshed.
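For the "put yourself through the authorization" route, the one-time bootstrap is a single form-encoded POST that exchanges the code from the redirect URL for the first access/refresh token pair. This is a sketch of the standard OAuth 2.0 authorization_code exchange, not official SDK code; the token URL and field names should be verified against the current Infusionsoft developer docs before use.

```javascript
// Assumed token endpoint; confirm against the current documentation.
const TOKEN_URL = 'https://api.infusionsoft.com/token';

// Build the form-encoded body for the authorization_code grant.
function buildTokenRequestBody({ clientId, clientSecret, code, redirectUri }) {
  return new URLSearchParams({
    grant_type: 'authorization_code',
    client_id: clientId,
    client_secret: clientSecret,
    code,
    redirect_uri: redirectUri,
  }).toString();
}

// Exchange the one-time code for tokens (requires Node 18+ for fetch).
async function exchangeCode(opts) {
  const res = await fetch(TOKEN_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: buildTokenRequestBody(opts),
  });
  if (!res.ok) throw new Error('Token exchange failed: ' + res.status);
  return res.json(); // { access_token, refresh_token, expires_in, ... }
}
```

The resulting refresh token is what you persist; from then on the refresh chain described above keeps the integration alive without further user interaction.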


  • We have been working on autogenerating SDKs based on our OpenAPI spec.
  • We are hoping some sort of token management will be rolled into that.
  • It will also include additional languages. We have a Java version that is in alpha.

New Infusionsoft

  • Legacy API keys will be going away sooner rather than later. We expect to have announcements on this in the near future.
  • New Infusionsoft is basically a new front end to the existing backend. Since API calls go to the backend, we didn’t want existing integrations to fail for New Infusionsoft apps, so we made the decision to keep it for now. We didn’t want to slow down delivery of New Infusionsoft while we migrate existing integrations off of legacy keys.
  • We are working on a new authentication mechanism based on OAuth 2.0 to alleviate issues with moving away from legacy keys.

Proxies for Wordpress (and similar applications)

  • This is a unique case in which embedding your client_id and client_secret is not secure. The source is usually available for anyone to see. Browser extensions fall into this category as well.
  • We hope Personal Access Tokens will solve this problem. We are in the process of putting on the final touches and building out documentation.
  • There will be a specification that will allow this to be pretty seamless for users.

Hopefully I have answered a lot of the concerns. Feel free to ask me to clarify anything and I will do the best I can.



Hi @bradb, thanks for coming back with a more detailed response.

As a developer who has been using the API for 8+ years, you can see from our point of view how awkward / messy / difficult the OAuth implementation has made things. I run all my API scripts using XML-RPC with the Legacy API Key, because I know it works, it’s stable, and fewer things can go wrong. But moving to OAuth adds a whole level of complication, and the number of possible failure scenarios jumps up considerably.

I need a bit more clarity about the Tokens here, because either something changed from the original implementation, or maybe the documentation was not clear enough. I understand that the Access Token works for 24 hours. In the past, if you asked to get a new Access Token, you would get the new one plus a new Refresh Token, and the previous Access Token would last around 1-2 minutes before it was terminated. You are saying that the Cron Job gets 6 tries before the Access Token expires, which implies that the previous Access Token now expires at its designated time. If that is the case, then that definitely improves things. When did that start occurring? It was not like that 2 years ago.

I think we need to see the Personal Access Tokens in operation to see how they improve things. Hopefully, if they are what they are expected to be, then developers will not need to build proxy servers to handle the tokens. If they hold up, more legacy scripts will be converted across to OAuth.

I know you also want developers to move away from XML-RPC to REST. But REST is far from complete, and nowhere near on an equal level with the XML-RPC functionality. You do realise that there are literally 1000s of scripts using XML-RPC, and they are not something you can easily convert across. There are scripts that still use the old iSDK, which is no longer supported, so those would be problematic to convert. I am thinking it is going to take at least 5 years before you could close that down completely.



I’d like to also chime in here to 100% echo @Pav’s comments.

One additional point is that implementing 2-legged OAuth (instead of forcing 3-legged OAuth) would greatly alleviate the pains of working with the current implementation. Google Firebase, for example, implements “service accounts” for back-end 2-legged environments that are thread-safe (each server instance uses its own auto-refreshing tokens to interact with the Firebase API).

Right now, to get this working, I have to create 3 separate Infusionsoft developer apps (for dev, stage, and production), manually authorize each of them with my Infusionsoft account, and each of them needs to store the access tokens and refresh tokens in its respective DB, with retry logic to try to avoid race-condition token failures when I have numerous server instances running.
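That retry logic roughly amounts to: attempt the refresh, and if it fails because another instance already spent the single-use refresh token, fall back to whatever that instance persisted in shared storage. A minimal sketch, with `tokenStore` and `providerRefresh` as hypothetical stand-ins for a shared database and the provider's token endpoint:

```javascript
// If the refresh fails, assume another instance may have won the race
// and re-read shared storage before giving up.
async function refreshWithRetry(tokenStore, providerRefresh) {
  const current = await tokenStore.load();
  try {
    const fresh = await providerRefresh(current.refreshToken);
    await tokenStore.save(fresh);
    return fresh.accessToken;
  } catch (err) {
    // Another instance probably spent the single-use refresh token and
    // persisted a new pair. Fall back to whatever is now in storage.
    const latest = await tokenStore.load();
    if (latest.accessToken !== current.accessToken) {
      return latest.accessToken;
    }
    throw err; // genuine failure, not a lost race
  }
}
```

This is the "build in some error handling" option from earlier in the thread; it trades a wasted request for not needing a cross-instance lock.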

Compare this to the firebase-admin npm package, which simply requires me to do the following at app start (it happens in each instance), with no additional configuration, no extraneous DB records, no manual token refresh management, etc.:

  const firebaseAdmin = require('firebase-admin');
  firebaseAdmin.initializeApp({
    credential: firebaseAdmin.credential.cert({
      projectId: config.firebase.projectId,
      clientEmail: config.firebase.clientEmail,
      privateKey: config.firebase.privateKey.replace(/\\n/g, '\n'),
    }),
  });

InfusionSoft has really dropped the ball on this, and I would really appreciate continued official support for the old API token authentication scheme until 2-legged OAuth with thread-safe token refreshing is implemented. To do otherwise is extremely punitive to developers.

Hi @bradb.

It’s been over two years since you mentioned Personal Access Tokens.

Do you have an update on this?

Brad responded in this thread and said it’s coming this quarter!