Front End Access on server


We usually put the Back End on the server and the Front End on the local machine.

Is there any problem (performance, perhaps) with keeping a copy of the FE on the server for each user and placing only a shortcut to it on each local machine?

That is, I would have one instance of the FE on the server for each user:

FE_Joao.accdb
FE_Pedro.accdb
FE_Maria.accdb

And on each user's machine, a shortcut:

FE_Joao_atalho
FE_Pedro_atalho
FE_Maria_atalho
    
asked by anonymous 08.01.2014 / 16:19

1 answer


TL;DR

In practice it is quite feasible, especially if users do not open and close the program many times a day. On the other hand, it is important to consider the trade-offs and certain restrictions the application must respect (such as not writing files to the same folder as the program).

My experience with executables on the network

Without knowing in detail how the application works, it is difficult to give an absolute answer. First, I will relate some bad experiences I had with a similar solution:

At one company I worked for, everyone had to use an internal system that was accessed over the network through a link to the executable. Depending on how congested the network was, the delay to open the program (compared to local programs) was quite noticeable. On the other hand, once the executable was open, there was no difference in performance.

Since I used this program every day, at first I tried putting shortcuts in the Start Menu and on the Desktop. However, the screen often locked up when I opened the menu or minimized programs to view the Desktop. This was probably because Windows was reading the executable over the network to update the shortcut's icon: often all the other icons appeared normally except that one, and the screen stayed locked until it did. Beyond the icon, Windows reads various pieces of file information for a variety of reasons; I had my share of waiting while browsing directories on the network, when it took Windows several minutes just to list a few files.

After that, I removed the shortcuts and set up the Task Scheduler to open the program at a scheduled time every day. However, I sometimes ran into security-related issues. When we accessed the directory on the server through Windows Explorer, it occasionally asked us to re-enter our credentials. With the Task Scheduler, if it could not access the program on the network, it simply did not ask for credentials and failed. Another problem is that sometimes Windows does not "trust" the executable and, instead of asking whether you really want to open the program, it simply fails silently.

Anyway, I know these are isolated cases, but it is important to keep in mind that this solution has some limitations.

General considerations

Now, I'll try to consider some general points:

Performance

Running executables from the network causes a delay at startup and every time a file needs to be read. Since some programs make heavy use of configuration files, this delay can repeat throughout a session. On the other hand, once everything has been read, there should be no further impact on performance.

Availability

Even assuming the network itself is fine, if the database server is hosted elsewhere the system still becomes unavailable whenever the front-end files are inaccessible for some reason. It is an added concern: you now have one more node on the network to manage.

Distribution

Distributing new versions of the application becomes easier, although other forms of updating over a network would also be possible, such as an administrative script or even an auto-update tool.
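For illustration only (the server path, file names, and user list below are hypothetical), such an administrative script could be as simple as overwriting each user's FE copy on the server with a new master version. A minimal Python sketch:

```python
import shutil
from pathlib import Path

# Hypothetical paths and user list -- adjust to your environment.
MASTER_FE = Path(r"\\server\deploy\FE_master.accdb")
TARGET_DIR = Path(r"\\server\frontends")
USERS = ["Joao", "Pedro", "Maria"]

def deploy_frontend(master: Path, target_dir: Path, users: list) -> None:
    """Overwrite each user's front-end copy with the new master version."""
    for user in users:
        destination = target_dir / f"FE_{user}.accdb"
        # copy2 also preserves the file's timestamps
        shutil.copy2(master, destination)
        print(f"Deployed {master.name} -> {destination}")
```

You would run it once after publishing a new master; make sure no user has the file open at that moment, since Access keeps the FE file locked while in use.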

On the other hand, users will not be able to choose whether or not to upgrade. A new bug or an improperly published version will affect all of them, always.

Concurrency

If the system writes files to its own directory, it will have to be changed to write either to a fixed local directory on the user's workstation or to the registry. Otherwise, each user will overwrite the others' files.
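As a sketch of that idea (the application name and server path are my own assumptions, not part of the original setup), a launcher could resolve a per-user writable directory such as %LOCALAPPDATA% and keep the application's working files there:

```python
import os
import shutil
from pathlib import Path

# Hypothetical shared location of one user's front end -- an assumption.
SERVER_FE = Path(r"\\server\frontends\FE_Joao.accdb")

def local_work_dir(app_name: str = "MyAccessApp") -> Path:
    """Return a per-user writable directory, e.g. %LOCALAPPDATA%/MyAccessApp,
    falling back to the user's home directory when the variable is unset."""
    base = os.environ.get("LOCALAPPDATA") or str(Path.home())
    path = Path(base) / app_name
    path.mkdir(parents=True, exist_ok=True)
    return path

def writable_copy(server_fe: Path) -> Path:
    """Copy the front end into the user's local directory so that any files
    it writes land on the local disk instead of the shared folder."""
    local_fe = local_work_dir() / server_fe.name
    shutil.copy2(server_fe, local_fe)
    return local_fe
```

Taken to its conclusion, this pattern also sidesteps the shared-folder problem entirely: each user runs a private local copy and only the back end stays on the server, which is the conventional split described in the question.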

    
answered 09.01.2014 / 12:49