When to use success: function() and when .done(function()) in asynchronous requests?

16

In a simple way, I can write an asynchronous request like:

$.ajax({
    url: url,
    dataType: 'json',
    type: 'GET',
    success: function (_user){
        alert (_user)
    }
});

which alerts the _user return value. I can also write:

$.ajax({
    url: url,
    dataType: 'json',
    type: 'GET'
}).done(function(_user){
    alert(_user);
});

which alerts exactly the same result. I know that success: function() in the first case is a callback, executed when the request succeeds, just as error: function() and complete: function() are executed as their names suggest.

Speaking of names, one can infer that the chainable .done() method runs when the request is finished (done). I also understand that $.ajax returns a deferred object, which is exactly what .done() acts on; hence the difference in syntax between it and success.

Anyway, despite my minimalist example, in the vast majority of cases I do not know when to use each form, the callback or the chained method, since I get the expected result with both. I also know a third way to get this result, but it involves the async: false setting, which falls outside the scope of the question.

So, when should .done() be used? When should the success: function() callback be used? I would like practical examples and, if possible, examples whose results differ depending on which approach is used.

    
asked by anonymous 27.07.2016 / 19:39

3 answers

11

TL;DR: .done() is the modern way of using success, and it fulfills more or less (*) the Promise specification. That means it can be chained jQuery-style and it protects execution in case of errors.

Before the concepts of Promises and deferred callbacks emerged, the habitual approach was to pass an object with the necessary settings to the ajax method. The callback went into that object, and so did all future code that depended on the response, which had to start from inside the callback:

$.ajax({
    url: url,
    dataType: 'json',
    type: 'GET',
    success: function (_user){
        fazerUpdateTabela(_user);
        guardarUmCookie(_user);
        procurarNoutraAPIqqCoisa(_user.nif, function(dadosPessoais){
            verificarCartao(dadosPessoais.nrCartao, function(ver){
                if (!ver) fazerTudoDeNovo();
                // etc
            });
        });
        // etc
    }
});

If, for example, procurarNoutraAPIqqCoisa had yet another step after it completed, the chain of actions would become even more fragmented, and after a few lines it becomes difficult to know where the code came from and where it is going.

Later, with the concept of Promises, it became possible to write code that is a skeleton of what will happen, in a more visual and easier-to-read form. The example above could be adapted (with some internal adjustments in the functions) to this:

var ajax = $.ajax({
    url: url,
    dataType: 'json',
    type: 'GET'
});
ajax
    .done([guardarUmCookie, fazerUpdateTabela])
    .then(procurarNoutraAPIqqCoisa)
    .then(verificarCartao)
    .fail(fazerTudoDeNovo);

* - jQuery has had problems with the way jQuery Deferreds chain. There is a bug/PR with a long discussion about it on GitHub. It seems this will be fixed in version 3, but in my view too late, since browsers already support a native version of the same idea.

The modern version is therefore more versatile and allows, as I mentioned, chaining the functions that should run when the server's response arrives. It accepts functions as arguments, but also arrays of functions, in the style of Promise.all, which can be practical.

This version has an API parallel to Promises for composing these chains of .done() and .fail() with several other methods, giving more flexibility to manage the application flow.

Another important aspect of Promises is that errors generated inside them no longer wreak the havoc they used to. A throw inside a Promise causes it, and the steps that follow it in the chain, to be rejected, and calls the chain's .catch(). This is very useful for preventing code from silently breaking.
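A minimal, runnable sketch of that behavior with native Promises (the step names and messages are hypothetical, no jQuery involved):

```javascript
// A throw inside a .then() rejects that promise; the following
// .then() steps are skipped until a .catch() handles the error.
const log = [];

Promise.resolve('user-42')
    .then(function (user) {
        log.push('loaded ' + user);
        throw new Error('card check failed');
    })
    .then(function () {
        log.push('never runs'); // skipped: the previous step threw
    })
    .catch(function (err) {
        log.push('caught: ' + err.message);
    });
```

The same behavior applies to jQuery 3+ deferreds, which follow the Promises/A+ rules for .then() and .catch().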

After what I wrote above and returning to the question: Which one to use?

I would rather use native Promises and native Ajax. Nowadays this is already possible.

So the "normal" version with callbacks could be:

function _ajax(method, url, done) {
  var xhr = new XMLHttpRequest();
  xhr.open(method, url);
  xhr.onload = function () {
    // onload also fires for HTTP errors (404, 500...), so check the status
    if (xhr.status >= 200 && xhr.status < 300) done(null, xhr.response);
    else done(xhr.response);
  };
  xhr.onerror = function () {
    done(xhr.response);
  };
  xhr.send();
}

// and to use it
_ajax('GET', 'http://example.com', function (err, dados) {
  if (err) { console.log(err); }
  else console.log('The answer is:', dados);
});
});

The version with Promises could thus be an encapsulation of this old version, now with the powers of a Promise:

function ajax(method, url) {
    return new Promise(function(resolve, reject) {
        _ajax(method, url, function(err, res) {
            if (err) reject(err);
            else resolve(res);
        });
    });
}

or, rewritten from scratch:

function ajax(method, url) {
    return new Promise(function (resolve, reject) {
        var xhr = new XMLHttpRequest();
        xhr.open(method, url);
        xhr.onload = function () {
            resolve(xhr.response); // resolve with the response body
        };
        xhr.onerror = function () {
            reject(xhr); // reject with the xhr itself (exposes statusText)
        };
        xhr.send();
    });
}

and then use with:

ajax('GET', 'http://sopt.moon')
    .then(function(dados) {
        console.log(dados);
    }).catch(function(err) {
        console.error('Oh no!!', err.statusText);
    });

With a few more adaptations you can also support POST, PUT, etc. There is a complete polyfill on MDN for this.

28.07.2016 / 00:29
12

The practical results are the same. The difference is mostly in the style of the code.

Promises (implemented in jQuery as deferred objects) are a widely used model for dealing with asynchronous operations in JavaScript and other languages. They make many treatments of asynchronous operations easier, especially when you need to deal with the result of more than one operation, whether in sequence or in parallel.

Consider, for example, a chain of asynchronous operations where each operation depends on the result of the previous one, with the callback of the first operation starting the second, and so on. This quickly becomes "callback hell". The typical consequence is code that is less legible, shaped like an arrow pointing to the right:
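A minimal runnable sketch of that shape (the step function and the a/b/c/d names are hypothetical; setTimeout stands in for real asynchronous work):

```javascript
// Each asynchronous step delivers its result through a callback,
// so every dependent step nests one level deeper: the "arrow" shape.
var results = [];

function step(name, cb) {
    setTimeout(function () { cb(name); }, 0); // simulated async work
}

step('a', function (ra) {
    results.push(ra);
    step('b', function (rb) {
        results.push(rb);
        step('c', function (rc) {
            results.push(rc);
            step('d', function (rd) {
                results.push(rd);
                console.log(results.join(' -> ')); // prints "a -> b -> c -> d"
            });
        });
    });
});
```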

This is also very common with ifs, but since those are all synchronous, it is simpler to solve: often just by putting them in sequence instead of nesting them, or by joining several conditions in a single if.

Solving this problem with asynchronous operations, however, is not so simple. You have to track the state of these operations, which changes over time, and create a mechanism to record what should happen depending on that state.

And that is basically what promises do, letting you turn code in that nested shape into something like this:

a().then(b).then(c).then(d).then(e).then(f);

Another example, this time with jQuery, executing a callback only when 3 ajax requests have been completed:

$.when(reqA, reqB, reqC).then(callback);
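With native Promises, the $.when example above has a direct analogue in Promise.all (the three "requests" below are illustrative stand-ins):

```javascript
// Promise.all waits for all promises and delivers their results
// in the same order the promises were given.
const reqA = Promise.resolve('A');
const reqB = Promise.resolve('B');
const reqC = Promise.resolve('C');

let combined;
Promise.all([reqA, reqB, reqC]).then(function (results) {
    combined = results;
    console.log(results); // [ 'A', 'B', 'C' ]
});
```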

Of course you can create your own solutions for the readability-versus-asynchronicity problem, but promises already do this in a standardized way, that is, one that independent code modules are able to understand.

Keep in mind that asynchronous operations are not limited to HTTP requests. In jQuery, for example, you can also get promises of other kinds, such as completed animations or confirmed modal dialogs, and with those promises you can express the flow of operations your code performs more simply.

In Node.js (hence on the server), callback hell is even easier to run into, since essential operations such as database access or file-system access are asynchronous.

    
28.07.2016 / 00:28
8

This has always been a big puzzle for me as well. I have done some research and will share what I managed to absorb.

In the beginning, the success callback was the usual way with $.ajax. But with the implementation of $.Deferred, which provides richer return objects, .done() came to be used for the success callback.

Comparing the success callback:

Before $.Deferred

$.ajax({
  url: 'url.php',
  type: 'POST',
  success: function(data) {
      alert("SUCCESS");
  }
});

After $.Deferred

$.ajax({
  url: 'url.php',
  type: 'POST'
}).done(function() { 
     alert("SUCCESS");
});

From what I could see, the main advantage of using $.Deferred is that you can have a common function for different requests. This can be shown in a very simplistic way:

function ajax_get_somethings(id) {

  return $.ajax({
    url: 'get_somethings.php',
    type: 'GET',
    data: {id: id},
    dataType: 'json'
  }).always(function() {
    // Always show an alert
  })
  .fail(function() {
    // If it fails, request another id
  });

}

ajax_get_somethings(1).done(function(data) {
  // fetch something with this id and perform a certain task
});

ajax_get_somethings(2).done(function(data) {
  // fetch something with this other id and perform another task
});
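For comparison, the same reuse pattern works with native Promises; a sketch in which fetchSomething and the data shape are hypothetical stand-ins for the real request:

```javascript
// One function owns the shared behavior (like .always()/.fail() above);
// each caller then chains its own .then() for its specific task.
var seen = [];

function fetchSomething(id) {
    return Promise.resolve({ id: id, name: 'item-' + id })
        .catch(function (err) {
            // shared failure handling, like the .fail() above
            console.error('request failed, try another id');
            throw err; // keep the rejection visible to callers
        });
}

fetchSomething(1).then(function (data) {
    seen.push(data.name); // task for id 1
});

fetchSomething(2).then(function (data) {
    seen.push(data.name); // task for id 2
});
```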

And it does not end there; there are still many more methods to explore if necessary: link

I hope this helps.

Question reference: link

    
27.07.2016 / 22:44