
Month: June 2022

Your data, your customer’s data in the wrong hands?




The security and confidentiality of personal data have never mattered more than they do today.

While most companies are busy studying the provisions of the General Data Protection Regulation (GDPR) and drafting policies and procedures that reflect the new rights and obligations, one issue often falls outside the mainstream of these preparations: the regular deletion of data that is no longer needed or has become inadequate to the purposes for which it was collected.

Companies and institutions of all kinds accumulate large amounts of data on their computers: analyses and reports on their own operations, confidential agreements with contractors, and the personal data of employees and clients. Much of this is extremely sensitive and must never fall into the wrong hands. So when the time comes to dispose of a hard drive, or even a CD, holding such information, it is worth having the media destroyed by professionals. Be aware that simply deleting files with free tools, or even destroying a drive by improvised means, does not deliver the expected result: the data is not erased permanently, and almost anyone who gets hold of the media can recover it and then misuse it. It is not worth the risk; call in the professionals.
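To see why ordinary deletion is not permanent: deleting a file normally removes only the directory entry, while the underlying blocks stay on the disk until something happens to overwrite them. Below is a minimal, illustrative Python sketch of the overwrite-before-delete idea (the function name is ours). Note that on SSDs, wear levelling and spare blocks can keep copies of the data out of reach of any overwrite, which is precisely why professional destruction of the media itself is the safer route.

```python
import os
import secrets

CHUNK = 1024 * 1024  # overwrite in 1 MiB chunks to bound memory use

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file in place with random bytes, then unlink it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            remaining = size
            while remaining > 0:
                n = min(CHUNK, remaining)
                f.write(secrets.token_bytes(n))
                remaining -= n
            f.flush()
            os.fsync(f.fileno())  # force each pass out to the device
    os.remove(path)
```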

Among the many companies offering data destruction, few meet high security standards and handle each order professionally. SDR-IT gave us a chance to see what the process should look like: specially prepared vehicles, a qualified team, professional software, and a headquarters equipped with the modern systems and machines that a properly conducted data disposal process requires.

SDR-IT has been operating on the European market for many years, serving companies, banks and public institutions.

Find out more at sdr-it.pl



Artificial intelligence in cloud computing




Artificial intelligence has become one of the hottest topics in the IT world in recent years. From startups to the largest global corporations, everyone is looking for new ways to use it: to build applications and systems that work better than their predecessors, or to make possible things that were previously out of reach or too time-consuming.

The concept of artificial intelligence itself, and most of the mechanisms we use to build intelligent systems, are nothing new; they have been around for decades. And yet it is only now that we are seeing how profoundly AI can change our world.

There are several reasons for this. First, in the age of the Internet we have access to an almost unlimited supply of training data with which to make our models better and more accurate. And even if we do not want to use data from the Internet, the data we collect continuously on our own servers may prove entirely sufficient. The second reason is access to sufficient computing power: training AI models is not a trivial task and requires a serious server infrastructure. In the era of cloud computing, where we can spin up one server or ten thousand in a few minutes, computing power is no longer a problem. The third reason for the breakthrough, partly a consequence of the previous two, is the development and refinement of AI algorithms and tools (e.g. TensorFlow, Apache MXNet, Gluon and many more).
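For a sense of scale, here is a minimal, self-contained example using TensorFlow, one of the tools mentioned above: it trains a small classifier on the public MNIST dataset. Even this toy model trains noticeably faster on strong hardware, which hints at why real workloads lean on cloud-scale compute.

```python
import tensorflow as tf

# Load a standard public training dataset (MNIST handwritten digits)
# and scale the pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small feed-forward network; production models are far larger,
# which is where serious compute becomes necessary.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=3)
print(model.evaluate(x_test, y_test))  # [loss, accuracy] on held-out data
```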

Building AI-based applications and systems has therefore become easier, but it is still far from trivial. Can it be simplified even further? Yes, again thanks to cloud computing: alongside the well-known offerings such as virtual servers, databases and archiving, cloud providers also offer dedicated AI services.

Every IT project that uses AI is, of course, different, but it may well turn out that services from the cloud portfolio can be combined to deliver it faster. Companies that lack in-house AI expertise, yet still want to benefit from the technology, can reach for ready-made AI services built to solve common problems, as in the sketch below.
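As one concrete illustration of the ready-made route (our choice of service for the example): Amazon Rekognition analyses images without you training or hosting any model. The sketch below uses the boto3 SDK; the file name and region are placeholders, and configured AWS credentials are assumed.

```python
import boto3  # AWS SDK for Python; assumes AWS credentials are configured

# A ready-made cloud AI service: image labelling with no model of our own.
client = boto3.client("rekognition", region_name="eu-west-1")

with open("photo.jpg", "rb") as image_file:  # placeholder local image
    response = client.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=10,
        MinConfidence=80.0,
    )

# Print what the service recognised and how confident it is.
for label in response["Labels"]:
    print(f'{label["Name"]}: {label["Confidence"]:.1f}%')
```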



Used mining GPU. Avoid like the plague?




Ordinarily you would want to avoid a graphics card that has been run around the clock mining cryptocurrency. But that calculus changes in the middle of a severe GPU shortage, when the best graphics cards are unavailable even at exorbitant prices.

The obvious worry when buying a graphics card that has been used for mining is that its performance will have degraded significantly and that it will fail sooner than expected. In our experience, however, this is generally not the case: mining GPUs do not show a significant drop in performance. Let's look at some possible reasons, and some caveats.

Seasoned miners typically lower the card's power limit and undervolt the GPU to make it more efficient, overclocking only the memory. A gamer, by contrast, will want to overclock the GPU core itself, which is the riskier endeavour. Miners do run their cards 24/7, but constant operation also minimises the heating and cooling cycles that stress silicon. There are real dangers, though: heat is the main enemy of a GPU, and problems can arise if a card was mining in a hot environment without adequate airflow.

More of these cards will reach the second-hand market as cryptocurrency values decline. If you are considering one, ask the seller for details: how long the card ran, the temperatures it operated at and the thermal behaviour it showed, and whether it was overclocked.
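If you can test the card in person, you can also read its vitals yourself rather than take the seller's word for it. Here is a small sketch using the NVML Python bindings (assumes an NVIDIA card and the nvidia-ml-py package; run it while the GPU is under load and watch for abnormal temperatures or throttled clocks).

```python
import pynvml  # NVIDIA Management Library bindings: pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older bindings return bytes, newer return str
    name = name.decode()

# Current temperature, power draw and clocks.
temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
power_mw = pynvml.nvmlDeviceGetPowerUsage(handle)  # reported in milliwatts
sm_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)
mem_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)

print(f"{name}: {temp} C, {power_mw / 1000:.0f} W, "
      f"core {sm_clock} MHz, memory {mem_clock} MHz")
pynvml.nvmlShutdown()
```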



Programmable computer networks. Evolution of data transfer.




Computer equipment increasingly ships from the factory with high-speed 2.5G, 5G and even 10G network cards, and with Wi-Fi 5, 6 and 6E wireless modules. To exploit these communication capabilities, a company needs the right infrastructure, and managing an extensive network efficiently can be troublesome. SDN comes to the rescue.

SDN (Software Defined Networking), also known as software-controlled or programmable computer networking, is a concept of LAN/WLAN management that separates the physical infrastructure carrying the data from the software layer that controls its operation. This makes it possible to manage the network centrally, without regard to its physical structure: the administrator can control many elements of the infrastructure, such as routers, access points, switches and firewalls, as if they were a single device. Data transmission is steered at the level of the whole corporate network rather than tied to individual devices. The OpenFlow protocol is most often, though not always, used to control such a network.

The SDN concept arose to solve the problem of configuring large numbers of network devices. The static architecture of traditional networks is decentralised and complex, whereas today, in the era of virtualisation and cloud systems, a corporate network is expected to be far more flexible and to absorb problems as they emerge. SDN centralises the „network intelligence” in one place by separating the forwarding of packets (the data plane) from the routing decisions (the control plane). The control plane consists of one or more controllers; this is the brain of the SDN network, where all the intelligence is concentrated.
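To make the control-plane idea concrete, here is a minimal OpenFlow controller sketch written with the Ryu framework (a Python SDN framework; this follows its standard "simple switch" example pattern, reduced to a hub that floods every packet). A real controller would learn MAC addresses and install flow entries in the switches instead of handling every packet itself.

```python
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import MAIN_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3


class HubController(app_manager.RyuApp):
    """Control plane in software: switches only forward, this app decides."""
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPPacketIn, MAIN_DISPATCHER)
    def packet_in_handler(self, ev):
        # Packets the switch has no rule for are sent up to the controller;
        # this trivial policy floods them out of all other ports.
        msg = ev.msg
        datapath = msg.datapath
        ofproto = datapath.ofproto
        parser = datapath.ofproto_parser

        actions = [parser.OFPActionOutput(ofproto.OFPP_FLOOD)]
        data = None
        if msg.buffer_id == ofproto.OFP_NO_BUFFER:
            data = msg.data  # packet not buffered on the switch, resend it

        out = parser.OFPPacketOut(datapath=datapath,
                                  buffer_id=msg.buffer_id,
                                  in_port=msg.match["in_port"],
                                  actions=actions,
                                  data=data)
        datapath.send_msg(out)
```

Run it with ryu-manager against an OpenFlow 1.3 switch (for example Open vSwitch, or a Mininet topology).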

With SDN, administrators have full insight into the network topology at any moment, allowing traffic to be allocated automatically and more effectively, especially during periods of heavy transmission. SDN also helps reduce both operational costs and capital expenditure.



AMD EPYC – Does Intel still have a chance?




The AMD EPYC 7643 from the Milan family has been spotted in the Geekbench benchmark database, where it shows off the power of its Zen 3 cores: a single processor outscored a dual-socket Intel Xeon platform.

The EPYC 7643 is one of the new AMD Milan-series processors: a 48-core, 96-thread chip based on the Zen 3 microarchitecture. The Geekbench entry gives us a first look at its performance.

In Geekbench 4, the AMD EPYC 7643 scored 5,850 points in the single-core test and over 121,000 points in the multi-core test. Its base clock is 2.3 GHz, with a maximum boost of 3.6 GHz.

That is genuinely high performance. For comparison, a dual Intel Xeon Platinum 8276 setup, with 56 cores and 112 threads in total, scores 4,913 points in the single-core test and 112,457 points in the multi-core test. The new processor beats the competing platform on both counts.

