I'm working on an application where each view manages one of a company's registers: products, clients, suppliers, and so on. Each view serves a specific area, but they all handle similar data flows, such as adding, deleting, and updating items. A view can hold many items, and when I add one the list has to be reloaded so the new record comes back with its database id. During that reload the whole list is briefly "duplicated". Generally speaking: I enter a new client, the list reloads, the fresh copy is appended after the list already in the view, and only then is the old list removed.
The code I'm using has no secret: simple $http POST and GET calls (my backend is PHP), as in the example:
ctrl.js

// Called automatically when the controller loads
function getProduto() {
    factProdutos.getProdutos().then(function (res) {
        vm.produtos = res;
    });
}

// Exposed to the view
vm.addProduto = addProduto;

// Products
function addProduto(id) {
    var data = { id_empresa: id };
    $http.post('php/mainFile.php?action=addProduto', data).then(
        function (res) { getProduto(); }, // reload the whole list to pick up the new id
        function (err) { alert(feedbackError); }
    );
}
factory.js

function _getProdutos() {
    return $http.get("php/getFile.php?action=getProduto").then(
        function (res) { return res.data; },
        function (err) { alert(feedbackError); }
    );
}
Deleting and updating are not a problem, because I perform the operation in the database without reloading anything; AngularJS takes care of updating the view. Removing a product from the list works the same way: I just use $filter and delete the element from the array, along the lines of the sketch below.
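For reference, this is roughly the removal pattern I described, as a minimal sketch. The deleteProduto action name and the id field are illustrative assumptions, not my real endpoint:

// Sketch of the removal pattern described above; the deleteProduto
// action and the id field are illustrative assumptions.
function removeProduto(id) {
    $http.post('php/mainFile.php?action=deleteProduto', { id: id }).then(
        function () {
            // Filter the deleted item out of the local array; AngularJS
            // updates the view with no reload needed.
            vm.produtos = $filter('filter')(vm.produtos, function (p) {
                return p.id !== id;
            });
        },
        function (err) { alert(feedbackError); }
    );
}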
The problem appears only on a new insertion, because I need the generated id for later operations. I've read about using the same $filter logic as for deletion, except adding the new record instead of removing one. But how do I identify the new record? Should I compare the freshly loaded list with the one currently in my view? Is that the best way to do this optimization, or is there a better method?
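To make the idea concrete, here is a rough sketch of the direction I'm imagining. It assumes, hypothetically, that the addProduto action could be changed to echo back the inserted row with its new id as JSON (it does not do that today):

// Sketch only: assumes the PHP side answers the insert with the new
// row, e.g. echo json_encode($produtoInserido); (hypothetical change).
function addProduto(id) {
    var data = { id_empresa: id };
    $http.post('php/mainFile.php?action=addProduto', data).then(
        function (res) {
            // Append just the returned record; with no full reload the
            // list never shows the brief "duplicate" state.
            vm.produtos.push(res.data);
        },
        function (err) { alert(feedbackError); }
    );
}

That would avoid the reload entirely, but it depends on changing the backend response, so I don't know if it's the right direction.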
To be clear, this is not an error; it's an optimization of the data flow.