How to use Workbox Background Sync in a Web App for offline POST requests - javascript

I am trying to create a web app which supports offline POST requests. I used Workbox to precache my files, but the Background Sync plugin doesn't work: I don't see the queued requests in IndexedDB with Chrome DevTools. What is my mistake?
I successfully installed a service worker with Workbox which caches the app shell and makes the app run offline. Then I followed the instructions of the Background Sync plugin and tested it following the testing instructions:
1. Load up a page and register your service worker. -> This worked.
2. Turn off your computer's network or turn off your web server (do NOT use the Chrome DevTools offline checkbox; it only affects requests from the page, not service worker requests).
3. Make network requests that should be queued with Workbox Background Sync. -> This is where it fails.
• You can check that the requests have been queued by looking in Chrome DevTools > Application > IndexedDB > workbox-background-sync > requests.
4. Now turn on your network or web server.
5. Force an early sync event by going to Chrome DevTools > Application > Service Workers, entering the tag name workbox-background-sync:, where "" should be the name of the queue you set, and then clicking the 'Sync' button.
6. You should see network requests go through for the failed requests, and the IndexedDB data should now be empty since the requests have been successfully replayed.
I first tried to send the posts to a Firebase server. Then I tried with a second server where you can check for received posts (I disabled CORS in my request to make it work on this second server).
In service worker file:
workbox.precaching.precacheAndRoute([
    { url: 'index.html', revision: '0000' },
    { url: 'scripts/app.js', revision: '0000' },
    { url: 'manifest.json', revision: '0000' },
    { url: 'images/icons/icon-48x48.png', revision: '0000' },
]);

// On you can check for received posts
const bgSyncPlugin = new workbox.backgroundSync.Plugin('queue', {
    maxRetentionTime: 24 * 60 // Retry for a max of 24 hours (specified in minutes)
});

workbox.routing.registerRoute(
    /* matcher for the POST endpoint (URL elided in the question) */,
    new workbox.strategies.NetworkOnly({
        plugins: [bgSyncPlugin]
    }),
    'POST'
);
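As a side note, maxRetentionTime is given in minutes. Here is a tiny sketch of the retention rule (my own illustration, not Workbox internals): a queued request older than that is dropped instead of replayed.

```javascript
// Illustration of the maxRetentionTime rule: requests queued longer than
// maxRetentionMinutes ago are no longer eligible for replay.
function shouldReplay(queuedAtMs, nowMs, maxRetentionMinutes) {
  return (nowMs - queuedAtMs) <= maxRetentionMinutes * 60 * 1000;
}

const now = Date.now();
console.log(shouldReplay(now - 1 * 60 * 60 * 1000, now, 24 * 60));  // true: 1 hour old
console.log(shouldReplay(now - 25 * 60 * 60 * 1000, now, 24 * 60)); // false: past 24 hours
```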
In the JS file where I create the POST requests:
function sendPost() {
    console.log("Send post...");
    fetch('', {
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            'Accept': 'application/json'
        },
        body: JSON.stringify({
            message: 'hello world'
        })
    }).then(function (res) {
        console.log('Sent data', res);
    }).catch(function (error) {
        console.log('Error while sending data', error);
    });
}
The project (it's not much, just the minimum to make the code work) can be found here: !AlqUUpItItUPgZhNmLkm84KrInRqSQ


ServiceWorker handling fetch events to URLs outside scope

My site registers a ServiceWorker which is scoped to only URLs beginning with /sw/....
// Register the Service Worker.
if ('serviceWorker' in navigator) {
    navigator.serviceWorker
        .register('{{ URL::asset('sw/serviceworker.js') }}', {scope: './sw/'})
        .then(registration => {
            console.log("SW registered. Scope: ", registration.scope);
        }).catch(err => { console.error("SW Register failed: ", err); });
}
One of the pages inside the /sw/... path performs a fetch to the server to see if a connection to the server is available. The address it fetches is /ping, a simple page that returns some JSON. (Note the address /ping is not inside the /sw/... path.)
// Sample of the bit inside my promise that checks for the server
// this is the request that is being cached
fetch('/ping')
    .then(function(response) {
        if (response.status == 200) {
            console.log('%c Server available! ', 'background: #20c997; color: #000');
        }
    })
    .catch(function(err) {
        console.log('fetch failed! ', err);
    });
Yet the browser clearly shows the serviceWorker intercepting the request to /ping.
From the Google Chrome Dev Console:
▶ Fetching Request {method: "GET", url: "", headers: Headers, referrer: "", referrerPolicy: "no-referrer-when-downgrade", …} serviceworker.js:105
▶ Fetched over network Response {type: "basic", url: "", redirected: false, status: 200, ok: true, …}
This is not doing what I expect because I only want the ServiceWorker to intercept requests to addresses starting with /sw/...
Is there somewhere in the spec or intended behaviour of ServiceWorkers that says it can cache the responses to fetch events made by pages in-scope, even if the address it is hitting is out of scope?
Answering my own question...
Yes, this is intended behaviour. Scope is not determined by the address that the fetch is being made to, but the address of the page making the request.
The key word is "from" in this quote from Google's guide: "...will handle fetch and message events that occur when a network request or message is made from your page."
So if the JavaScript on /sw/page1.html makes a fetch() request to /ping then this is considered "in scope" by the ServiceWorker because the page sending the request starts with /sw/....
Scope also includes the original request made by the browser to fetch the page. So if the browser attempts to navigate to pages starting with /sw/... and there is a ServiceWorker registered then it will handle that request.
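To avoid handling those out-of-scope addresses, a common pattern is to filter inside the fetch handler by the request URL. A minimal sketch (shouldHandle is a hypothetical helper of mine; the /sw/ prefix comes from the question):

```javascript
// Decide whether the worker should intercept a request, based on the
// request URL's path rather than on the page that issued it.
function shouldHandle(requestUrl, scopePrefix = '/sw/') {
  return new URL(requestUrl).pathname.startsWith(scopePrefix);
}

// Inside serviceworker.js you would use it like this:
//   self.addEventListener('fetch', event => {
//     if (!shouldHandle(event.request.url)) return; // /ping falls through to the network
//     event.respondWith(fetch(event.request));
//   });

console.log(shouldHandle('https://example.com/sw/page1.html')); // true
console.log(shouldHandle('https://example.com/ping'));          // false
```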

React Native - Unable to Upload Image Via Axios Post (Android)

I'm unable to upload an image from my React Native application in Android (iOS works fine). On attempting to upload, I'm receiving the following error:
Error: Network Error
at createError (buildURL.js:33)
at XMLHttpRequest.handleError (xhr.js:144)
at XMLHttpRequest.dispatchEvent (event-target.js:172)
at XMLHttpRequest.setReadyState (XMLHttpRequest.js:576)
at XMLHttpRequest.__didCompleteResponse (XMLHttpRequest.js:392)
at XMLHttpRequest.js:505
at RCTDeviceEventEmitter.emit (EventEmitter.js:190)
at MessageQueue.__callFunction (MessageQueue.js:344)
at MessageQueue.js:107
at MessageQueue.__guard (MessageQueue.js:291)
All other requests work, connection is configured to my local static IP address, as I'm testing on a physical device, rather than in a simulator.
I've looked at a number of solutions already:
Suggests adding a type field to the data, which I had already (detailed below)
Suggests to use IP instead of localhost, which I have been doing since the start
Here is the code that handles this:
ProfileImage.js (wrapper for react-native-image-picker):
async onHandleResizedImageUri(image) {
    var data = new FormData();
    data.append("profile_image", {
        uri: image.path,
        type: "image/jpeg"
    });
    let imageUploadSuccess = await global.api.callPostImage("/api/profile-image", data);
}
async callPostImage(url, data) {
    try {
        const settings = {
            baseURL: config.api_host,
            timeout: 1000,
            headers: {
                "content-type": "multipart/form-data"
            }
        };
        const response = await axios.post(url, data, settings);
        return response;
    } catch (error) {
        return error;
    }
}
Also, this issue is happening in both debug and release mode, and for multiple servers (local, staging and production). All other network requests work fine, but this one will not complete. Anyone have any suggestions?
Device information:
Samsung Galaxy A5 (2017)
Model SMA520W
Version 8.0.0
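For what it's worth, two Android-specific quirks often reported with exactly this error are a missing name field in the FormData file object and a missing file:// prefix on the uri. A hedged sketch of a payload builder covering both (the helper and the filename are my own, not from the question):

```javascript
// Build the file descriptor React Native's FormData expects for uploads.
// Android is stricter than iOS: it commonly needs a "name" field and a
// file:// prefix on the uri.
function buildProfileImageFile(path) {
  return {
    uri: path.startsWith('file://') ? path : 'file://' + path,
    type: 'image/jpeg',
    name: 'profile_image.jpg' // hypothetical filename
  };
}

const file = buildProfileImageFile('/storage/emulated/0/pic.jpg');
console.log(file.uri); // 'file:///storage/emulated/0/pic.jpg'
// usage: data.append("profile_image", buildProfileImageFile(image.path));
```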

How can I make http call to DialogFlow V2 using simple ajax jQuery?

I was using Dialogflow V1 before with plain jQuery, and it was pretty straightforward and working!
Now that I have to switch to V2, I am stuck on how to keep roughly the same code and just modify it for V2!
I have been looking at this client library for V2:
But I don't want to use Node.js. I just don't want to have to run something like node server.js to start the app, and I am also not sure if I can mix jQuery with Node.js.
My previous V1 code looked like this:
fetch(url, {
    body: JSON.stringify(data),
    // cache: 'no-cache',
    // credentials: 'same-origin',
    headers: {
        'content-type': 'application/json',
        "Authorization": "Bearer " + configs.accessToken,
    },
    method: 'POST',
    mode: 'cors',
    redirect: 'follow',
    referrer: 'no-referrer',
})
.then(response => response.json()) // parses the response as JSON
Well, I switched to ES6 fetch for making HTTP requests to Dialogflow, but I would like to use the same code for V2 - is this possible? Also, I can no longer see an access token for V2; how are we supposed to handle auth for HTTP calls?
I am really confused by the new V2, and since we switched to an Enterprise Edition account it is a must for us to use V2, and it kinda sucks!
I am checking this example from documentation:
Authorization: Bearer $(gcloud auth print-access-token)
Content-Type: application/json
POST body:
{
    'displayName': 'StartStopwatch',
    'priority': 500000,
    'mlEnabled': true,
    'trainingPhrases': [
        {
            'type': 'EXAMPLE',
            'parts': [
                { 'text': 'start stopwatch' }
            ]
        }
    ],
    'action': 'start',
    'messages': [
        {
            'text': {
                'text': [
                    'Stopwatch started'
                ]
            }
        }
    ]
}
But I am somewhat confused by this part: Authorization: Bearer $(gcloud auth print-access-token) - where do I get the access token?
I have already done this part: gcloud auth activate-service-account --key-file= and I have no idea what it does after activating! I was hoping to get back some access token from this, but there seems to be nothing, just a message that says Activated Service...
First of all, the Dialogflow V1 API is not going away anytime soon. They do not have a definitive timeline to stop the API; if they decide on one, developers will be notified of the deadline (confirmed by their support team). I guess you should be OK to use it till then.
However, if you want to use the Dialogflow V2 API from browser AJAX just like V1, there is no simple way unless you have the access token. I've run into the same issue and figured out it can't be done without using their client libraries (SDK) or "google-oauth-jwt". In my example I used the nodejs "google-oauth-jwt" package, which provides the access token for my application, which was then used for browser AJAX calls. You don't have to use their nodejs SDK library if you're handling the logic on the client side.
Setup Instructions:
1. Configure the V2 API from V1 on the Dialogflow account, following the migration guide. Download the JSON file, which has unique email and key values. You might want to grant access to your application by registering the domains.
2. Create a nodejs application and use "google-oauth-jwt" to get the access token. Also, make this a service and call it beforehand so the access token is ready before making any AJAX calls. Here is sample code:
const googleAuth = require('google-oauth-jwt');

app.get("/your_sample_web_service_to_get_access_token", (req, res, next) => {
    new Promise((resolve) => {
        googleAuth.authenticate({
            // find this email value from the downloaded json
            email: '',
            // find this key value from the downloaded json
            key: '-----BEGIN PRIVATE KEY-----xxx',
            // specify the scopes you wish to access: as mentioned in dialogflow documentation
            scopes: ['']
        }, (err, token) => {
            resolve(token);
        });
    }).then((token) => {
        // rest api response
        res.json({ "access_token": token });
    });
});
3. From your client JavaScript, make an AJAX call using the access token you get from the above nodejs application. Here is sample code:
app.service('chatbot', function ($http, $rootScope) {
    this.callAPI = function (user_entered_query) {
        // I used the detectIntent REST API endpoint: find the project name from your account.
        var endpoint = "";
        var data = JSON.stringify({queryParams: {}, query_input: {text: {text: user_entered_query, language_code: "en-US"}}, outputAudioConfig: {}, inputAudio: ""});
        var headers = {
            // use the token from the nodejs service
            "Authorization": "Bearer " + $rootScope.token
        };
        return $http.post(endpoint, data, {"headers": headers});
    };
});
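For reference, here is a hedged sketch of how the detectIntent request the answer uses is put together. The project id, session id and helper name are placeholders of mine; the endpoint shape follows the Dialogflow V2 REST documentation:

```javascript
// Build the URL and fetch options for a Dialogflow V2 detectIntent call.
// projectId and sessionId are placeholders you must supply yourself.
function buildDetectIntentRequest(projectId, sessionId, token, query) {
  return {
    url: 'https://dialogflow.googleapis.com/v2/projects/' + projectId +
         '/agent/sessions/' + sessionId + ':detectIntent',
    options: {
      method: 'POST',
      headers: {
        'Authorization': 'Bearer ' + token,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        queryInput: { text: { text: query, languageCode: 'en-US' } }
      })
    }
  };
}

const req = buildDetectIntentRequest('my-project', 'session-1', 'TOKEN', 'start stopwatch');
console.log(req.url); // .../projects/my-project/agent/sessions/session-1:detectIntent
// usage: fetch(req.url, req.options).then(r => r.json()).then(console.log);
```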

Background Sync codes not working automatically when online(wifi on) in PWA

I am new to PWA and have been testing my PWA project using the Firebase console database. When offline, I have code to save my post data in IndexedDB when I submit it, so it can be sent later when there is WiFi (online). It does save the data in IndexedDB when no WiFi is found, but when I turn my WiFi back on, it doesn't post my data in real time. When I submit new post data while WiFi is on (online), the background sync code does post the saved data from IndexedDB together with the newly posted data in real time. But I want my background sync code to post automatically when the WiFi is turned on (after being offline).
Here is my service worker code for background sync:
self.addEventListener('sync', function(event) {
    console.log('Background syncing...', event);
    if (event.tag === 'sync-new-posts') {
        console.log('Syncing new Posts...');
        event.waitUntil(
            readAllData('sync-posts') // function to read all data which was saved while offline
                .then(function(data) {
                    for (var dt of data) {
                        fetch('xxx some firebase post url xxx', { // send the saved data to the firebase database
                            method: 'POST',
                            headers: {
                                'Content-Type': 'application/json',
                                'Accept': 'application/json'
                            },
                            body: JSON.stringify({
                                title: dt.title,
                                content: dt.content
                            })
                        })
                        .then(function(res) {
                            console.log('Sent data', res);
                            if (res.ok) {
                                res.json()
                                    .then(function(resData) {
                                        deleteItemFromData('sync-posts', resData.id); // delete the saved post from IndexedDB, as it is not needed once the real post is saved
                                    });
                            }
                        })
                        .catch(function(err) {
                            console.log('Error while sending data', err);
                        });
                    }
                })
        );
    }
});
I don't know what's going wrong. If anyone needs more of my posting code or service worker code for clarity, please do ask. Please help me with this, as I am stuck.
What you can do is check whether your app is online again or not using the online and offline events. This is a well documented JS API and is widely supported as well.
window.addEventListener('load', function() {
    function updateOnlineStatus(event) {
        if (navigator.onLine) {
            // handle online status
            // re-try api calls
            console.log('device is now online');
        } else {
            // handle offline status
            console.log('device is now offline');
        }
    }
    window.addEventListener('online', updateOnlineStatus);
    window.addEventListener('offline', updateOnlineStatus);
});
NOTE: It can only tell if the device is connected. It CANNOT distinguish between a working internet connection and a mere connection (e.g. a WiFi hotspot without actual internet connectivity).
So, I'd suggest you do a fake API call in the navigator.onLine handler just to check whether actual internet is back or not (it can be a simple handshake as well), and once that call is successful you can go on doing your regular API calls.
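A minimal sketch of that probe: navigator.onLine only says the device has some connection, so make a tiny request to confirm the server is actually reachable. Here '/ping' and retrySavedPosts are hypothetical names for your own endpoint and replay logic.

```javascript
// Resolve true only if the server answered; a plain network error (e.g. a
// WiFi hotspot with no real internet) resolves false instead of throwing.
function checkRealConnectivity(url = '/ping') {
  return fetch(url, { method: 'HEAD', cache: 'no-store' })
    .then(res => res.ok)
    .catch(() => false);
}

// Wire it to the online event:
//   window.addEventListener('online', () => {
//     checkRealConnectivity().then(ok => { if (ok) retrySavedPosts(); });
//   });
```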
Check that 'Update on reload' is switched off and that you are fully offline. I had the same issue, then it randomly started working when 'Update on reload' was turned off. I think it's because it reinstalls the service worker each time you refresh the page, so you're never in the state where the service worker is listening for the sync. That's my theory anyway. Hope this helps...

Client side certificate javascript request

We're developing a React app with a Python Flask backend. Normally it all works fine, but when placed behind a server requiring a client-side certificate it almost works: it works fine in Chrome, but not in Firefox.
The certificate is sent when entering the URL in the browser; it's not sent when making requests from React.
The main request finishes fine and the page is displayed.
When loading, the page makes a request to the backend, /backend/version.
That request fails, with nginx saying:
<head><title>400 No required SSL certificate was sent</title></head>
<body bgcolor="white">
<center><h1>400 Bad Request</h1></center>
<center>No required SSL certificate was sent</center>
When I open devtools and paste the same url, it works fine. The client side certificate is sent by the browser.
How we make the request:
const fetchVersion = () => (dispatch, getState) => {
    return dispatch({
        endpoint: `${API_ROOT}/version`,
        method: 'GET',
        headers: {
            "Authorization": authHeader(),
        },
        payload: (action, state, res) => {
            const contentType = res.headers.get('Content-Type');
            if (contentType && ~contentType.indexOf('json')) {
                return res.json().then(json => json.response);
            }
        },
        meta: (action, state, res) => checkIfInvalidToken(action, state, res, dispatch),
    });
};
What's missing? Why doesn't Firefox attach the certificate to the request like Chrome does?
You might try to see if the problem is resolved by explicitly specifying the [CALL_API].credentials value as include.
According to the documentation, the default value is omit, but Firefox needs include to always send cookies (and the client certificate), even for cross-origin calls.
Regarding the example in your question, the code could become something like:
endpoint: `${API_ROOT}/version`,
credentials: 'include',
method: 'GET',
headers: {
"Authorization": authHeader(),
...and so on
In a lab with a purely experimental purpose, I think I have reproduced behavior similar to what you reported, in both Chrome and Firefox, and in this lab credentials: 'include' solves the problem (video available here).
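The same idea with plain fetch, outside redux-api-middleware, in case it helps isolate the issue. Per the Fetch specification, "credentials" covers cookies, HTTP authentication entries and TLS client certificates; the helper name here is my own.

```javascript
// Build a RequestInit that asks the browser to attach credentials
// (cookies and, where negotiated, the TLS client certificate) to the call.
function versionRequestInit(authHeaderValue) {
  return {
    method: 'GET',
    credentials: 'include', // older fetch implementations defaulted to 'omit'
    headers: { 'Authorization': authHeaderValue }
  };
}

// usage: fetch(`${API_ROOT}/version`, versionRequestInit(authHeader()));
console.log(versionRequestInit('Bearer x').credentials); // 'include'
```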