In a data-centric economy, business success depends largely on how the information lifecycle is managed. Companies of all sizes treat data as a vital asset, but gathering data is of little benefit until you learn to analyse it and use it for informed business decisions.
A computer that serves the requests of other computers (clients) over a network (LAN/WAN) is called a server. In the client/server model, a multi-user, LAN-based application program running on the server awaits and fulfils requests from client programs, which may be running on the same or other computers. Servers come in different form factors based on cost, performance and scalability: tower, rack and blade. Servers are also categorised by purpose, for example:
AD Server & Domain Controller
To provide IT services and administer an entire IT infrastructure, you need one or more servers or appliances to act as proxy, DNS, DHCP, AD, file, print, antivirus, backup, RADIUS, SSO, FTP server and so on.
Microsoft Active Directory Domain Services (AD DS) is the foundation for distributed networks built on Windows Server 2008 operating systems that use domain controllers. AD DS provides secure, structured, hierarchical data storage for objects in a network such as users, computers, printers and services, along with support for locating and working with those objects. Using AD, you can customise how your data is organised to meet your company's needs.
Its main functions are:
- Stores directory data and manages communication between users and the domain controllers (DCs).
- Allows your DC to issue digital certificates and support signatures and public key cryptography.
- Supports LDAP for cross-platform directory services, such as Linux computers on your network.
- Provides single sign-on (SSO) authentication across multiple web applications in the same session, so users do not have to keep re-entering the same credentials.
- Controls information rights and data-access policies; Rights Management determines whether you can access a folder or send an email.
- Supports RADIUS, which provides centralised authentication, authorisation and accounting management for users who connect to and use a network service.
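Applications typically look objects up in AD over LDAP. As a minimal sketch of what that looks like in practice, the snippet below builds a safely escaped LDAP search filter for a user account; `sAMAccountName` and `objectClass` are the standard AD attribute names, and the escaping follows RFC 4515 so untrusted input cannot alter the filter structure.

```python
def escape_ldap(value: str) -> str:
    """Escape LDAP-filter special characters per RFC 4515."""
    specials = {'\\': r'\5c', '*': r'\2a', '(': r'\28', ')': r'\29', '\0': r'\00'}
    return ''.join(specials.get(ch, ch) for ch in value)

def user_filter(username: str) -> str:
    """Return an LDAP filter matching a single AD user account."""
    return f"(&(objectClass=user)(sAMAccountName={escape_ldap(username)}))"

print(user_filter("jsmith"))
# (&(objectClass=user)(sAMAccountName=jsmith))
```

The same filter string works with any LDAP client library or with command-line tools such as `ldapsearch`.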
A file server is a central server in a computer network that provides a central storage place for files, accessible to all authorised clients. With the rapid growth of data, a conventional server with internal disks soon outgrows its storage capacity and demands that new servers be added. A general-purpose server, which uses a wide range of hardware and software to perform many different tasks, is not an efficient storage solution. Data loss caused by hardware failure, software malfunction, human error or virus attack exposes your organisation to considerable legal and financial risk.
NAS is a purpose-built storage server attached directly to the LAN. NAS devices are optimised for specific storage needs with a hardened OS, integrated hardware and storage management software.
Print Servers allow printers to be connected to a network and be shared amongst several users. Organisations need a server-based print management solution that can monitor print output from any client to devices from many manufacturers.
- Track print jobs from any platform (PC, Mac, Unix)
- Support network as well as locally attached printers
- Save costs through accurate print accounting
- Enhance security, since printed documents may contain sensitive or confidential information
- Increase flexibility and productivity through improved performance
- Eradicate waste
- Reduce your carbon footprint
- Allow users to email in an attachment, or the email itself, and have it converted to a print job
Application server: a container on which you can build and expose business logic and processes to client applications through protocols such as HTTP. In some cases it includes an internal web server. An application server is heavy in terms of resource usage.
Database Server: refers to the back-end system of a database application using client/server architecture. The database server performs tasks such as data analysis, storage, data manipulation, archiving, and other non-user specific tasks.
Co-locating the application and database servers on the same machine is cost-effective for small to medium businesses. Provided CPU and memory requirements are sized correctly, good performance can be achieved from a single server, since the time taken to send database queries across a network connection is avoided.
Web server: serves web content (HTML and static assets) over the HTTP protocol. A web server's fundamental job is to accept and fulfil client requests for static content from a website (HTML pages, files, images, video, and so on). An application server's fundamental job is to provide its clients with access to what is commonly called business logic, which generates dynamic content; that is, code that transforms data to provide the specialised functionality offered by a business, service or application.
In a typical deployment, a website that provides both static and dynamically generated content runs web servers for the static content and application servers to generate content dynamically. A reverse proxy and load balancer sit in front of web servers and web application servers to route traffic to the appropriate server, first based on the type of content requested and then based on the configured load-balancing algorithm.
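The routing rule described above can be sketched in a few lines. This is an illustrative model only (the backend pool names and the set of static extensions are assumptions, not a real proxy configuration): static requests go to the web pool, everything else to the application pool, and a round-robin pick provides simple load balancing.

```python
import itertools
from urllib.parse import urlparse

# Extensions treated as static content (illustrative list).
STATIC_EXTENSIONS = {".html", ".css", ".js", ".png", ".jpg", ".mp4"}

# Hypothetical backend pools; round-robin via itertools.cycle.
web_pool = itertools.cycle(["web1:80", "web2:80"])
app_pool = itertools.cycle(["app1:8080", "app2:8080"])

def route(url: str) -> str:
    """Pick a backend: static content -> web pool, dynamic -> app pool."""
    path = urlparse(url).path
    is_static = any(path.endswith(ext) for ext in STATIC_EXTENSIONS)
    return next(web_pool if is_static else app_pool)

print(route("/assets/logo.png"))   # routed to a web backend
print(route("/api/orders?id=42"))  # routed to an app backend
```

A production reverse proxy (nginx, HAProxy, etc.) applies the same two-step decision, just declaratively in its configuration.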
We help enterprises choose the right-sized server for on-premise use, based on parameters such as business need, application demand, traffic load, data flow, scalability, criticality or downtime threshold, reliability, component redundancy and performance.
OEMs: Dell, HP, Fujitsu, SuperMicro, Tyron
If you use an online service to send email, edit documents, watch movies or TV, listen to music, play games or store pictures and other files, it is likely that cloud computing is making it all possible behind the scenes.
Cloud computing is the delivery of on-demand computing services, typically over the internet and on a pay-as-you-go basis. Rather than owning their own computing infrastructure or data centres, companies can access anything from applications to storage from a cloud service provider. Cloud computing is becoming the default option for many apps, and software vendors increasingly offer their applications as services (SaaS) over the internet on a subscription model.
For a company with an application that has variable and unpredictable traffic, it may make financial sense to host it in the cloud. Moving to a services model also shifts spending from capex to opex, which may be useful for some companies.
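The capex-versus-opex trade-off can be framed as a simple break-even calculation. All figures below are illustrative assumptions, not vendor pricing: we compare the one-off cost of buying a server (plus its running costs) against the recurring cost of renting an equivalent cloud instance.

```python
def breakeven_months(server_capex: float, monthly_onprem_opex: float,
                     monthly_cloud_cost: float) -> float:
    """Months after which owning becomes cheaper than renting."""
    saving_per_month = monthly_cloud_cost - monthly_onprem_opex
    if saving_per_month <= 0:
        return float("inf")  # cloud is always cheaper in this scenario
    return server_capex / saving_per_month

# A $6,000 server with $100/month power & support vs a $350/month instance:
print(round(breakeven_months(6000, 100, 350), 1))  # 24.0 months
```

If the workload might be retired or resized before the break-even point, or traffic is too unpredictable to size the purchase, the cloud option wins on flexibility even when it loses on raw cost.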
Cloud computing is the on-demand availability of computer system resources. Cloud computing can be broken down into three models.
Infrastructure-as-a-Service (IaaS): refers to the fundamental building blocks of computing that can be rented: physical or virtual servers, storage and networking. It is typically used for test and development of new applications, website hosting, storage, and backup and recovery.
Platform-as-a-Service (PaaS) includes IaaS plus middleware, database management, operating systems and development tools. PaaS is typically used for development frameworks, analytics or business intelligence.
Software as a service (SaaS) is a method for delivering software applications over the Internet, on demand and typically on a subscription basis. With SaaS, cloud providers host and manage the software application and underlying infrastructure and handle any maintenance, like software upgrades and security patching. Users connect to the application over the Internet, usually with a web browser on their phone, tablet or PC.
According to research firm IDC, SaaS is, and will remain, the dominant cloud computing model. The most widely used SaaS services include web-based email (Gmail, Hotmail, Yahoo), office productivity tools such as Microsoft Office 365 and Zoom, and business applications such as CRM, ERP and document management.
SaaS makes even sophisticated enterprise applications such as CRM and ERP affordable for organizations that lack the resources to buy, deploy and manage the required infrastructure and software themselves.
We partner with industry-leading data centre/cloud service providers for hosting and co-location services to meet your specific requirements.
Zero Client & Server Based Computing
With ever-increasing support costs, companies worldwide spend nearly 60-70% of their IT budgets on desktop maintenance. Security, data privacy, manageability, downtime, power and cooling challenges are driving many organizations to look for alternatives to the traditional PC.
In the fast-growing era of server-based and cloud computing, traditional desktop/laptop systems are being replaced by Virtual Desktop Infrastructure (VDI) for most standard applications. In this environment, information is processed on the server and accessed by users through a solid-state, diskless access device called a virtual desktop or zero client.
Business benefits of virtual computing/desktop virtualisation:
- Easy installation and integration into an existing Windows/Linux network
- Consumes only 5 watts, saving on power, UPS and air conditioning
- Solid-state device with no moving parts: rugged and durable
- Compatible with most general applications on Windows and Linux
- MTBF of 10-15 years, avoiding the need for frequent upgrades/replacement
- No local OS or data, reducing deployment, maintenance and security costs
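The 5-watt figure above translates directly into an energy saving per seat. As a rough sketch (the 150 W desktop draw, 8-hour day and 250 working days are assumptions for illustration):

```python
def annual_kwh(watts: float, hours_per_day: float = 8, days: float = 250) -> float:
    """Annual energy use in kilowatt-hours for a device drawing `watts`."""
    return watts * hours_per_day * days / 1000

desktop = annual_kwh(150)     # assumed typical desktop draw
zero_client = annual_kwh(5)   # zero-client draw from the list above
print(f"saving per seat: {desktop - zero_client:.0f} kWh/year")
# saving per seat: 290 kWh/year
```

Multiply by seat count and the local electricity tariff (and again for the avoided UPS and air-conditioning load) to estimate the total saving.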
OEMs: Ncomputing, Citrix, Fujitsu
Workstation & Edge Computing
In the digital workspace, system performance, user interface and user experience play a very important role in employee productivity. Based on working environment, computing power required and other factors, computer users fall broadly into three groups: desk workers, mobile professionals and power users.
For desk workers who use a standard office productivity suite (MS Office, Adobe, email, browsing) and server-based applications such as CRM and ERP, a thin-client/zero-client VDI solution is very effective and efficient. Power users, who run applications that are CPU-intensive, graphics-accelerated or demand high IOPS, need workstations. We focus on virtual desktops and workstations to create a simple yet powerful computing environment in your organisation.
Modern computing in specialised operations demands audio/video, complex computation and multitasking capabilities that a commercial desktop cannot deliver.
Workstations are purpose-built for high performance and heavy workloads. They are primarily used by engineers, product designers, media and entertainment professionals, financial analysts, researchers, software developers and anyone who requires data manipulation, graphics-intensive work, 24×7 operation or high-resolution video.
A typical workstation must have these features:
- Motherboard: a specially designed, rugged motherboard that can run more than one processor, with 16 or more memory slots, space for multiple graphics cards, dual Gigabit LAN ports, Thunderbolt and even legacy ports to support diverse applications.
- CPU: Intel's Xeon is the preferred processor for workstations. Xeons support features such as multiple sockets, ECC memory and large caches, essential for heavy workloads.
- ECC RAM: error-correcting code memory makes your system more reliable. It fixes memory errors before they affect your system, preventing crashes and saving you downtime.
- RAID: uses multiple internal hard drives to store and protect your data. Depending on the system, you can configure multiple drives as RAID 0, 1, 5 or 6; the redundant levels preserve data integrity in case of a disk failure.
- Optimised GPU: a higher-end graphics processing unit can take over some of the load from the CPU, making everything faster and helping you achieve very high-quality visual effects.
- Redundant PSU: a server-grade feature; a redundant power supply allows a second unit to take over immediately if the main PSU fails.
Manufacturing engineers and product designers use leading-edge software such as Ansys, Catia, Creo, Inventor, KeyShot, Solid Edge and SolidWorks to push the boundaries of 3D CAD. One size does not fit all, so whether you are building mechanical 3D models, simulating product performance or running PLM software, maximising your toolset requires professional-grade workstations custom-configured for your workflow.
In the field of scientific research, innovative technologies can enable miracles—but only when running on reliable systems. Increasing complexity in workflows is driving demand for data scientists to transform massive amounts of data into insights and create amazing customer experiences. Workstations support the Medical & Science industry by empowering professionals in
- Radiology centers
- Cardiology centers
- Pharmaceutical research and development
- Clinical research organizations
Creating visual effects, animation, simulation, video editing, video streaming and broadcast for the media and entertainment industry demands hardware solutions purpose-built for professional software applications such as 3ds Max, Maya, Cinema 4D, Adobe Creative Cloud, Arnold, V-Ray and DaVinci Resolve.
A workstation solution design is not complete without a high-definition monitor integrated to deliver the best user interface.
OEMs: HP, Dell, Fujitsu
Data Lifecycle Management (DLM)
While growth of a business is a positive sign of success, it also means there is more to protect and more to lose. As your company grows, your data increases in volume, and a flexible, forward-thinking approach is critical. Data loss and unavailability can cause serious problems for your business, potentially damage your company's reputation, and drive customers to your competitors.
The success of business in new economy largely depends on the management of data lifecycle. Data life cycle management (DLM) is a policy-based approach to managing the flow of data throughout its life cycle: from creation to deletion. Each industry sector has its own stipulations for data retention and implementing a sound DLM strategy helps businesses remain compliant.
Data capture & Storage Consolidation
For ages, raw data has been captured and stored on hard disks. Today the choices are many, so an understanding of the technology and functions of the different types of hard disk is essential to choose and design the appropriate solution for a specific requirement.
SATA (Serial ATA) drives support data transfer rates of 300 and 600 MB/s and spin at 7.2K rpm. This type of hard disk is cost-effective and suitable for desktops and non-critical applications.
SAS (Serial Attached SCSI) drives support higher transfer rates (6 and 12 Gb/s) and spin at 10K and 15K rpm. They have a very high MTBF and lower latency, and are suitable for servers, SANs and critical applications with 24/7 operation.
The IOPS value indicates how many distinct input (write) or output (read) operations a device or array of devices can perform in one second. It may be used along with other metrics, such as latency and throughput, to measure overall performance.
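For a single spinning disk, IOPS can be estimated with the classic rule of thumb: it is the inverse of the average service time, i.e. average seek time plus half a rotation. A small sketch, with typical seek times taken as assumptions:

```python
def hdd_iops(rpm: float, avg_seek_ms: float) -> float:
    """Estimate random IOPS for one HDD from rotational speed and seek time."""
    rotational_latency_ms = (60_000 / rpm) / 2  # half a revolution, in ms
    return 1000 / (avg_seek_ms + rotational_latency_ms)

# Assumed typical figures: 7.2K SATA ~8.5 ms seek, 15K SAS ~3.5 ms seek.
print(round(hdd_iops(7200, 8.5)))   # ~79 IOPS
print(round(hdd_iops(15000, 3.5)))  # ~182 IOPS
```

This is why 15K SAS drives serve random-I/O server workloads far better than 7.2K SATA, even though their sequential throughput differs much less; SSDs, with no moving parts, reach tens of thousands of IOPS.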
The mechanical nature of traditional HDDs makes them susceptible to fragmentation and physical damage. SSD or Solid State Drive technically refers to any storage device without moving parts. Based on the performance requirement and cost benefits, we can choose SATA SSD or SAS SSD for your specific requirement.
Flash storage is any type of drive or system that uses flash memory to keep data for an extended period of time. The size and complexity of flash-based storage varies in devices ranging from portable USB drives, smartphones, embedded systems to enterprise-class all-flash arrays (AFAs). Flash is SSD, but not all SSD is Flash.
A server or storage system (JBOD, NAS or SAN) that includes multiple disks must be managed for data protection and performance. RAID (redundant array of independent disks) is a storage technology that balances data protection, system performance and storage space by determining how the storage system distributes data.
Each RAID level offers a trade-off of data protection, system performance, and storage space. For example, one RAID level might improve data protection but reduce storage space. Another RAID level might increase storage space but also reduce system performance.
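The storage-space side of that trade-off is easy to quantify. A minimal sketch for the common levels, assuming identical disks (minimum disk counts enforced; RAID 0 offers no protection at all):

```python
def usable_capacity(level: int, n_disks: int, disk_tb: float) -> float:
    """Usable capacity in TB for common RAID levels with identical disks."""
    if level == 0:                       # striping: full capacity, no redundancy
        return n_disks * disk_tb
    if level == 1 and n_disks >= 2:      # mirroring: one disk's worth
        return disk_tb
    if level == 5 and n_disks >= 3:      # single parity: lose one disk
        return (n_disks - 1) * disk_tb
    if level == 6 and n_disks >= 4:      # double parity: lose two disks
        return (n_disks - 2) * disk_tb
    raise ValueError("unsupported level / disk count")

print(usable_capacity(5, 6, 4.0))  # 20.0 TB from six 4 TB disks
print(usable_capacity(6, 6, 4.0))  # 16.0 TB, but survives two disk failures
```

The last two lines show the trade-off concretely: moving from RAID 5 to RAID 6 on the same six disks costs 4 TB of space but buys tolerance of a second simultaneous disk failure.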
As data grows rapidly, a conventional server with internal disks outgrows its storage capacity, and a general-purpose server is not designed for optimum utilisation and efficient storage performance. Hence, storage has become a sub-system of enterprise computing.
For the ease of management and enhanced efficiency, data across conventional and virtual server environments is consolidated using DAS, NAS, IP-SAN and FC-SAN solutions, with a wide choice of disk and I/O interfaces (SAS, iSCSI, FC).
Direct Attached Storage (DAS): consists of one or more disk drives attached directly to a server; data is typically transferred using SCSI commands. For short-term expansion of storage capacity, DAS or JBOD (Just a Bunch Of Disks) is the popular choice.
Network Attached Storage (NAS) is a file-based storage architecture attached directly to the LAN. NAS devices are optimised for specific storage needs with their own operating system and integrated hardware and software. NAS devices are well suited for web caching, audio-video streaming, backup, and data storage with file serving.
Storage Area Network (SAN): a high-speed sub-network connecting storage devices and servers to provide consolidated storage and storage management. The equipment in a SAN communicates via the Fibre Channel or iSCSI protocol.
Advanced SAN features increase server performance, optimise storage use, and enhance high-availability capabilities such as fail-over, load balancing and distributed applications. SANs are beneficial for data warehousing, database, and OLTP (Online Transaction Processing) applications.
IP-SANs are conceptually similar to traditional FC SANs, but are built on the iSCSI protocol, which enables the transmission of data across existing TCP/IP networks using the familiar SCSI protocol.
Data Backup & Restore
Backup is not a luxury, it's a necessity.
Data entered into a system is stored on primary storage such as an internal hard disk, NAS or SAN device. To move data to the next stage of its lifecycle, we use a secondary storage system to ensure data integrity and system redundancy.
In general, organizations use tape drives or external hard disks to back up data from servers running critical business applications. Ever-increasing data volumes and changing tape-drive technologies have made this task difficult. With the falling cost of hard disks and advances in storage technologies, D2D (disk-to-disk) backup has become popular and affordable.
Analysts estimate that over 60% of a company's data resides on desktop and laptop computers, yet fewer than 30% of users regularly back up their data. Data loss caused by hardware failure, software malfunction, human error or virus attack exposes your organization to considerable legal and financial risk. Many organizations rely on users manually copying their valuable information to a USB drive, CD or file server.
By implementing automated desktop backup, we address this daunting challenge: data from on-site, branch-office and remote users is backed up to a central device according to a pre-configured backup policy and user-based security access.
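A pre-configured backup policy usually boils down to retention rules the central device applies when pruning old copies. As one common pattern (a sketch, with the grandfather-father-son tiers below chosen as illustrative assumptions): keep dailies for a week, Sunday weeklies for a month, first-of-month monthlies for a year.

```python
from datetime import date

def keep(backup_day: date, today: date) -> bool:
    """Decide whether a backup taken on `backup_day` is still retained."""
    age = (today - backup_day).days
    if age <= 7:
        return True                                  # daily tier
    if age <= 31 and backup_day.weekday() == 6:
        return True                                  # weekly tier (Sundays)
    if age <= 365 and backup_day.day == 1:
        return True                                  # monthly tier
    return False

today = date(2024, 6, 15)
print(keep(date(2024, 6, 12), today))  # True  (recent daily copy)
print(keep(date(2024, 5, 1), today))   # True  (kept as a monthly copy)
print(keep(date(2024, 5, 14), today))  # False (expired daily copy)
```

Real backup products express the same idea declaratively, but the tiered age checks are the essence of the policy.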
Organizations looking to refresh their storage architecture need a flexible, comprehensive solution offering:
- a storage system capable of handling file- and block-level data
- a cross-platform centralised storage repository
- backup and restore of desktop and server data
- simple, user-friendly GUI-based storage management
- reliability, scalability and vendor independence
A unified storage platform uses NAS, IP-SAN and storage management technologies to provide all the benefits of an enterprise-class storage solution without the cost and complexity. Unified storage consists of an application-specific operating system and specialised hardware and software components designed for efficient system performance.
With the help of a unified storage platform and suitable backup software, we can design a Continuous Data Protection (CDP) and Business Continuity Planning (BCP) solution in a cost-effective manner.
Having a disaster recovery strategy in place enables an organization to maintain or quickly resume mission-critical functions following a disruption. We can help you maintain a DR site at a leading data centre on an opex model.
Network storage typically uses the Btrfs or ext4 file system to create a volume. ext4 is fast and rock solid, and easily recovered on a desktop machine if things go badly wrong. Btrfs is a little slower on writes because of its copy-on-write nature, but just as fast on reads.
Data retention & Archival
The third stage of DLM is to categorise and store data that is accessed infrequently and whose content does not change.
A data archive is a collection of records selected for long-term preservation. Archiving helps customers reduce storage costs and increase performance by moving data from higher-cost server or primary storage to a lower-cost storage tier.
Archiving is especially critical for rich media, government, BFSI, R&D and engineering companies experiencing explosive growth of content. For them, archiving is not an "offline" process where data is moved to a tape and stored on a shelf; instead, it is an "online/active" process where data is stored in a repository that can be accessed for repurposing and revenue generation.
File and email archiving enables you to meet regulatory compliance requirements and helps you identify and resolve potential business issues such as security breaches and legal risk.