Friday, 24 May 2019

Proper way to install nvidia-390 (and fix errors)

If you see any errors during the process, look at the fix below.

Command:

 sudo apt purge --autoremove '*nvidia*'

Package 'nvidia-settings' is not installed, so not removed
Package 'nvidia-utils-396' is not installed, so not removed
Package 'nvidia-utils-410' is not installed, so not removed
Package 'nvidia-utils-415' is not installed, so not removed
Package 'nvidia-utils-418' is not installed, so not removed
Package 'nvidia-utils-430' is not installed, so not removed
Package 'xserver-xorg-video-nvidia-396' is not installed, so not removed
Package 'xserver-xorg-video-nvidia-410' is not installed, so not removed
Package 'xserver-xorg-video-nvidia-415' is not installed, so not removed
Package 'xserver-xorg-video-nvidia-418' is not installed, so not removed
Package 'xserver-xorg-video-nvidia-430' is not installed, so not removed
You might want to run 'apt --fix-broken install' to correct these.
The following packages have unmet dependencies:
 libnvidia-decode-390:i386 : Depends: libnvidia-compute-390:i386 (= 390.116-0ubuntu0.18.04.1) but it is not going to be installed
 libnvidia-gl-390:i386 : Depends: libnvidia-common-390:i386
E: Unmet dependencies. Try 'apt --fix-broken install' with no packages (or specify a solution).



Solution (run as root):

for FILE in $(dpkg-divert --list | grep nvidia-340 | awk '{print $3}'); do dpkg-divert --remove $FILE; done

This removes the stale dpkg diversions left behind by the nvidia-340 packages, which cause the unmet-dependency errors above.


Then add the graphics-drivers PPA and install the driver:
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt update
sudo apt install nvidia-390
This worked for Ubuntu 18.04 LTS.



Sunday, 19 May 2019

Top Data Analytics Tools 2019

List of All Analytics Tools:-

1. Knime

KNIME Analytics Platform is the leading open solution for data-driven innovation, helping you discover the potential hidden in your data, mine for fresh insights, or predict new futures.
With more than 1000 modules, hundreds of ready-to-run examples, a comprehensive range of integrated tools, and the widest choice of advanced algorithms available, KNIME Analytics Platform is the perfect toolbox for any data scientist.

2. RapidMiner

Much like KNIME, RapidMiner operates through visual programming and is capable of manipulating, analyzing and modeling data. RapidMiner makes data science teams more productive through an open source platform for data prep, machine learning, and model deployment. Its unified data science platform accelerates the building of complete analytical workflows – from data prep to machine learning to model validation to deployment – in a single environment, dramatically improving efficiency and shortening the time to value for data science projects.

This tool also includes fraud detection by analyzing financial data and other kinds of data in any format.

3. R-Programming

Project R, a GNU project, is primarily written in C and Fortran, while many of its modules are written in R itself. It is a free software programming language and environment for statistical computing and graphics. The R language is widely used among data miners for developing statistical software and data analysis. Ease of use and extensibility have raised R's popularity substantially in recent years.
Besides data mining it provides statistical and graphical techniques, including linear and nonlinear modeling, classical statistical tests, time-series analysis, classification, clustering, and others.

4. Orange

Orange is an open source data visualization and data analysis tool for novices and experts alike, providing a large toolbox for building interactive workflows to analyse and visualize data. Orange is packed with different visualizations, from scatter plots, bar charts and trees to dendrograms, networks and heat maps.

5. OpenRefine

OpenRefine (formerly Google Refine) is a powerful tool for working with messy data: cleaning it, transforming it from one format into another, and extending it with web services and external data. OpenRefine can help you explore large data sets with ease.

 

6. Pentaho

Pentaho addresses the barriers that block your organization’s ability to get value from all your data. The platform simplifies preparing and blending any data and includes a spectrum of tools to easily analyze, visualize, explore, report and predict. Open, embeddable and extensible, Pentaho is architected to ensure that each member of your team — from developers to business users — can easily translate data into value.

7. Talend

Talend is the leading open source integration software provider to data-driven enterprises. Our customers connect anywhere, at any speed. From ground to cloud and batch to streaming, data or application integration, Talend connects at big data scale, 5x faster and at 1/5th the cost.

8. Weka

Weka, an open source software, is a collection of machine learning algorithms for data mining tasks. The algorithms can either be applied directly to a data set or called from your own JAVA code. It is also well suited for developing new machine learning schemes, since it was fully implemented in the JAVA programming language, plus supporting several standard data mining tasks.

For someone who hasn't coded for a while, Weka with its GUI provides the easiest transition into the world of Data Science. Being written in Java, those with Java experience can also call the library from their own code.

9. NodeXL

NodeXL is a data visualization and analysis software package for relationships and networks, and it provides exact calculations. The basic version (not the Pro one) is free and open-source. It is one of the best statistical tools for data analysis, including advanced network metrics, importers for social media network data, and automation.

10. Gephi

Gephi is also an open-source network analysis and visualization software package, written in Java on the NetBeans platform. Think of the giant friendship maps that represent LinkedIn or Facebook connections. Gephi takes that a step further by providing exact calculations.

 

11. Apache Spark

The University of California, Berkeley's AMP Lab developed Apache Spark in 2009. Apache Spark is a fast large-scale data processing engine that executes applications in Hadoop clusters up to 100 times faster in memory and 10 times faster on disk. Spark is built with data science in mind, and its design makes data science effortless. Spark is also popular for building data pipelines and developing machine learning models.
Spark also includes a library, MLlib, that provides a progressive set of machine learning algorithms for repetitive data science techniques such as classification, regression, collaborative filtering and clustering.

 


12. Datawrapper

Datawrapper is an online data-visualization tool for making interactive charts. Once you upload the data from a CSV/PDF/Excel file or paste it directly into the field, Datawrapper will generate a bar chart, line chart, map or any other related visualization. Datawrapper graphs can be embedded into any website or CMS with ready-to-use embed codes. Many reporters and news organizations use Datawrapper to embed live charts into their articles. It is very easy to use and produces effective graphics.

13. Solver

Solver specializes in providing world-class financial reporting, budgeting and analysis with push-button access to all data sources that drive company-wide profitability. Solver provides BI360, which is available for cloud and on-premise deployment, focusing on four key analytics areas.

14. Qlik

Qlik lets you create visualizations, dashboards, and apps that answer your company’s most important questions. Now you can see the whole story that lives within your data.

15. Tableau Public

Tableau democratizes visualization in an elegantly simple and intuitive tool. It is exceptionally powerful in business because it communicates insights through data visualization. In the analytics process, Tableau’s visuals allow you to quickly investigate a hypothesis, sanity check your gut, and just go explore the data before embarking on a treacherous statistical journey.

16. Google Fusion Tables

Meet Google Spreadsheets' cooler, larger and much nerdier cousin. Google Fusion Tables is an incredible tool for data analysis, large data-set visualization and mapping. Not surprisingly, Google's incredible mapping software plays a big role in pushing this tool onto the list. Take, for instance, a map made to look at oil production platforms in the Gulf of Mexico.

17. Infogram

Infogram offers over 35 interactive charts and more than 500 maps to help you visualize your data beautifully. Create a variety of charts including column, bar, pie, or word cloud. You can even add a map to your infographic or report to really impress your audience.

Sentiment Tools

18. OpenText

The OpenText Sentiment Analysis module is a specialized classification engine used to identify and evaluate subjective patterns and expressions of sentiment within textual content. The analysis is performed at the topic, sentence, and document level and is configured to recognize whether portions of text are factual or subjective and, in the latter case, whether the opinion expressed within these pieces of content is positive, negative, mixed, or neutral.

19. Semantria

Semantria is a tool that offers a unique service approach by gathering texts, tweets, and other comments from clients and analyzing them meticulously to derive actionable and highly valuable insights. Semantria offers text analysis via an API and an Excel plugin. It differs from Lexalytics in that it incorporates a bigger knowledge base and uses deep learning.

20. Trackur

Trackur's automated sentiment analysis looks at the specific keyword you are monitoring and then determines whether the sentiment towards that keyword within the document is positive, negative or neutral; that is weighted the most in Trackur's algorithm. It can be used to monitor all social media and mainstream news, to gain executive insights through trends, keyword discovery, automated sentiment analysis and influence scoring.

21. SAS Sentiment Analysis

SAS sentiment analysis automatically extracts sentiments in real time or over a period of time with a unique combination of statistical modeling and rule-based natural language processing techniques. Built-in reports show patterns and detailed reactions, so you can home in on the sentiments that are expressed.
With ongoing evaluations, you can refine models and adjust classifications to reflect emerging topics and new terms relevant to your customers, organization or industry.

 

Friday, 28 December 2018

AI: Different Scenarios Where We Can Apply Algorithms




1.) Naive Bayes Classifier Algorithm
If we’re planning to automatically classify web pages, forum posts, blog snippets and tweets without manually going through them, then the Naive Bayes Classifier Algorithm will make our life easier.
This algorithm classifies text based on the popular Bayes theorem of probability and is used in applications such as disease prediction, document classification, spam filtering and sentiment analysis projects.
We can use the Naive Bayes Classifier Algorithm for ranking pages, indexing relevancy scores and classifying data categorically.
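As an illustration of the idea, here is a minimal pure-Python sketch of a multinomial Naive Bayes text classifier with Laplace smoothing. The tiny spam/ham corpus is hypothetical; a real project would typically use a library such as scikit-learn.

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """docs: list of (list_of_words, label). Returns log-priors and smoothed log-likelihoods."""
    class_counts = Counter(label for _, label in docs)
    word_counts = defaultdict(Counter)
    vocab = set()
    for words, label in docs:
        word_counts[label].update(words)
        vocab.update(words)
    priors = {c: math.log(n / len(docs)) for c, n in class_counts.items()}
    likelihoods = {}
    for c in class_counts:
        total = sum(word_counts[c].values())
        # Laplace (add-one) smoothing so unseen words never get probability zero
        likelihoods[c] = {w: math.log((word_counts[c][w] + 1) / (total + len(vocab)))
                          for w in vocab}
    return priors, likelihoods

def classify_nb(priors, likelihoods, words):
    # Pick the class maximizing log P(class) + sum of log P(word | class);
    # words outside the training vocabulary are simply ignored.
    scores = {c: priors[c] + sum(likelihoods[c].get(w, 0.0) for w in words)
              for c in priors}
    return max(scores, key=scores.get)

docs = [(["cheap", "pills", "offer"], "spam"),
        (["meeting", "agenda", "notes"], "ham"),
        (["cheap", "offer", "now"], "spam"),
        (["project", "meeting", "tomorrow"], "ham")]
priors, likes = train_nb(docs)
print(classify_nb(priors, likes, ["cheap", "offer"]))
print(classify_nb(priors, likes, ["meeting", "notes"]))
```

The same structure scales to real spam filters; only the corpus and the tokenization get more sophisticated.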

2.) K-Means Clustering Algorithm

K-Means Clustering Algorithm is frequently used in applications such as grouping images into different categories, detecting different activity types in motion sensors and for monitoring whether tracked data points change between different groups over time.  There are business use cases of this algorithm as well such as segmenting data by purchase history, classifying persons based on different interests, grouping inventories by manufacturing and sales metrics, etc.

The K-Means Clustering Algorithm is an unsupervised Machine Learning Algorithm that is used in cluster analysis. It works by categorizing unstructured data into a number of different groups, with 'k' being the number of groups. Each dataset contains a collection of features, and the algorithm categorizes the data based on those features.
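The assign-then-recompute loop at the heart of k-means (Lloyd's algorithm) can be sketched in a few lines of plain Python; the 2-D points and starting centroids below are made up for illustration:

```python
def kmeans(points, centroids, iterations=10):
    """Lloyd's algorithm: assign each point to its nearest centroid, then recompute centroids."""
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for p in points:
            # Squared Euclidean distance to every centroid; join the closest cluster
            distances = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centroids]
            clusters[distances.index(min(distances))].append(p)
        # New centroid = mean of the cluster (keep the old one if a cluster is empty)
        centroids = [(sum(x for x, _ in cl) / len(cl), sum(y for _, y in cl) / len(cl))
                     if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

points = [(1, 1), (1.5, 2), (1, 0.6), (8, 8), (9, 11), (8, 9)]
centroids, clusters = kmeans(points, centroids=[(0, 0), (10, 10)])
print(centroids)
```

With two well-separated blobs like these, the centroids settle on the blob means after the first iteration.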

3.) Support Vector Machine (SVM) Learning Algorithm
The Support Vector Machine Learning Algorithm is used in business applications such as comparing the relative performance of stocks over a period of time. These comparisons are later used to make wiser investment choices.
The SVM Algorithm is a supervised learning algorithm that works by separating data sets into different classes with a hyperplane. It maximizes the margin between the classes to provide the clearest possible distinction. We can use this algorithm for classification tasks that require high accuracy.
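A minimal sketch of the idea, assuming a linear SVM trained by batch sub-gradient descent on the L2-regularized hinge loss (the 2-D points are hypothetical; production SVMs use specialized solvers such as those in scikit-learn or libsvm):

```python
def train_linear_svm(data, lam=0.01, lr=0.1, steps=200):
    """data: list of ((x1, x2), y) with labels y in {-1, +1}.
    Batch sub-gradient descent on the regularized hinge loss."""
    w, b = [0.0, 0.0], 0.0
    n = len(data)
    for _ in range(steps):
        gw, gb = [lam * wi for wi in w], 0.0      # gradient of the L2 regularizer
        for x, y in data:
            if y * (w[0] * x[0] + w[1] * x[1] + b) < 1:   # point inside the margin
                gw = [gwi - y * xi / n for gwi, xi in zip(gw, x)]
                gb -= y / n
        w = [wi - lr * gwi for wi, gwi in zip(w, gw)]
        b -= lr * gb
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

data = [((2, 2), 1), ((3, 3), 1), ((2, 3), 1),
        ((-2, -2), -1), ((-3, -2), -1), ((-2, -3), -1)]
w, b = train_linear_svm(data)
print(all(predict(w, b, x) == y for x, y in data))
```

The hinge loss only penalizes points that fall inside the margin, which is what pushes the hyperplane toward the maximum-margin separator.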

4.) Recommender System Algorithm
The Recommender Algorithm works by filtering and predicting user ratings and preferences for items by using collaborative and content-based techniques. The algorithm filters information and identifies groups with similar tastes to a target user and combines the ratings of that group for making recommendations to that user. It makes global product-based associations and gives personalized recommendations based on a user’s own rating.
For example, if a user likes the TV series ‘The Flash’ and likes the Netflix channel, then the algorithm would recommend shows of a similar genre to the user.
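The collaborative filtering step described above can be sketched as follows: compute user-to-user similarity (cosine similarity here) and score unrated items by similarity-weighted ratings. The ratings dictionary and show names are invented for illustration:

```python
import math

def cosine(u, v):
    """Cosine similarity over the items both users have rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    nu = math.sqrt(sum(u[i] ** 2 for i in shared))
    nv = math.sqrt(sum(v[i] ** 2 for i in shared))
    return dot / (nu * nv)

def recommend(ratings, target, top_n=1):
    """Score items the target hasn't rated by similarity-weighted ratings of other users."""
    sims = {u: cosine(ratings[target], r) for u, r in ratings.items() if u != target}
    scores = {}
    for user, r in ratings.items():
        if user == target:
            continue
        for item, rating in r.items():
            if item not in ratings[target]:
                scores[item] = scores.get(item, 0.0) + sims[user] * rating
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

ratings = {
    "alice": {"The Flash": 5, "Arrow": 4},
    "bob":   {"The Flash": 5, "Arrow": 4, "Gotham": 5},
    "carol": {"The Crown": 5, "Gotham": 1},
}
print(recommend(ratings, "alice"))
```

Alice's tastes match Bob's almost exactly, so Bob's highly rated unseen show is recommended to her.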

5.1) Linear Regression
Linear Regression is widely used for applications such as sales forecasting and risk assessment analysis in health insurance companies, and it requires minimal tuning.
It is basically used to model the relationship between dependent and independent variables, and to show what happens to the dependent variable when changes are made to the independent variables.
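For simple (one-variable) linear regression, the least-squares fit has a closed form: the slope is the covariance of x and y divided by the variance of x. A small sketch with made-up sales-forecasting numbers:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b: slope = cov(x, y) / var(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    b = my - a * mx          # line passes through the mean point
    return a, b

# Hypothetical monthly ad spend vs sales, lying exactly on y = 2x + 1
xs = [1, 2, 3, 4, 5]
ys = [3, 5, 7, 9, 11]
a, b = fit_line(xs, ys)
print(a, b)   # → 2.0 1.0
```

With noisy real data the fitted line no longer passes through every point, but the same two formulas still give the least-squares solution.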

5.2) Logistic Regression
Logistic regression is used in applications such as-
1. Identifying risk factors for diseases and planning preventive measures
2. Classifying words as nouns, pronouns, and verbs
3. Weather forecasting applications for predicting rainfall and weather conditions
4. In voting applications to find out whether voters will vote for a particular candidate or not
A good example of logistic regression is when credit card companies develop models which decide whether a customer will default on their loan EMIs or not.
The best part of logistic regression is that we can include more explanatory (independent) variables, such as dichotomous, ordinal and continuous variables, to model binomial outcomes.
Logistic Regression is a statistical analysis technique which is used for predictive analysis. It uses binary classification to reach specific outcomes and models the probabilities of default classes.
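A minimal sketch of logistic regression trained by gradient descent on the log-loss; the missed-payments/default data is invented to mirror the credit card example above:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Gradient descent on the log-loss for P(y=1 | x) = sigmoid(w*x + b)."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(w * x + b) - y   # derivative of the log-loss wrt the linear score
            gw += err * x / n
            gb += err / n
        w -= lr * gw
        b -= lr * gb
    return w, b

# Hypothetical data: number of missed payments vs whether the customer defaulted
xs = [0, 1, 2, 3, 4, 5]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
print(sigmoid(w * 0 + b), sigmoid(w * 5 + b))
```

After training, the model assigns a low default probability to a customer with no missed payments and a high one to a customer with five.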

6.) Decision Tree Machine Learning Algorithm
Applications of this Decision Tree Machine Learning Algorithm range from data exploration, pattern recognition, option pricing in finances and identifying disease and risk trends.
Suppose we want to buy a video game DVD for our best friend's birthday but aren't sure whether he will like it. We ask the Decision Tree Machine Learning Algorithm, and it will ask us a set of questions about his preferences, such as which console he uses and what his budget is. It will also ask whether he likes RPGs or first-person shooters, whether he prefers single-player or multiplayer games, how much time he spends gaming daily, and his track record for completing games.
Its model is operational in nature, and depending on our answers, the algorithm will use forward and backward calculation steps to arrive at different conclusions.
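Each question the tree asks corresponds to a split chosen to reduce impurity. Here is a sketch of choosing the best single threshold by weighted Gini impurity, on invented data:

```python
def gini(labels):
    """Gini impurity of a set of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(xs, ys):
    """Try every threshold and keep the one with the lowest weighted Gini impurity."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

# Hypothetical data: hours of gaming per week vs whether the friend liked the game (1/0)
xs = [1, 2, 3, 10, 12, 15]
ys = [0, 0, 0, 1, 1, 1]
threshold, impurity = best_split(xs, ys)
print(threshold, impurity)   # → 3 0.0
```

A full decision tree applies this split search recursively to each side until the leaves are pure enough.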

7.) Random Forest ML Algorithm
The random forest algorithm is used in industrial applications such as finding out whether a loan applicant is low-risk or high-risk, predicting the failure of mechanical parts in automobile engines and predicting social media share scores and performance scores.
The Random Forest ML Algorithm is a versatile supervised learning algorithm used for both classification and regression tasks. It creates a forest of randomized trees. Although similar to the decision tree algorithm, the key difference is that it selects root nodes and splits feature nodes randomly.
It essentially constructs randomly created decision trees to predict outcomes, lets each tree vote, and takes the outcome with the most votes as the final prediction.
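The bootstrap-plus-voting idea can be sketched with a forest of one-level trees (stumps), each trained on a random resample of the data; the dataset and the high-risk/low-risk framing are invented for illustration:

```python
import random

def stump(xs, ys):
    """Best single-threshold split by weighted Gini impurity.
    Returns (threshold, label_if_left, label_if_right)."""
    def gini(ls):
        if not ls:
            return 0.0
        p = sum(ls) / len(ls)
        return 2 * p * (1 - p)
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = len(left) * gini(left) + len(right) * gini(right)
        if best is None or score < best[0]:
            l_lab = round(sum(left) / len(left)) if left else 0
            r_lab = round(sum(right) / len(right)) if right else l_lab
            best = (score, t, l_lab, r_lab)
    return best[1:]

def random_forest(xs, ys, n_trees=15):
    """Train each stump on a bootstrap resample (sampling with replacement)."""
    random.seed(0)   # fixed seed so the sketch is reproducible
    forest = []
    for _ in range(n_trees):
        idx = [random.randrange(len(xs)) for _ in xs]
        forest.append(stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return forest

def predict(forest, x):
    votes = [(l if x <= t else r) for t, l, r in forest]
    return round(sum(votes) / len(votes))   # majority vote

xs = [1, 2, 3, 10, 12, 15]   # hypothetical feature, e.g. months of payment history
ys = [1, 1, 1, 0, 0, 0]      # 1 = high-risk applicant, 0 = low-risk
forest = random_forest(xs, ys)
print(predict(forest, 1), predict(forest, 15))
```

Real random forests use deep trees and also subsample the features at each split, but the bootstrap-and-vote structure is the same.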

8.) Principal Component Analysis (PCA) Algorithm
PCA algorithm is used in applications such as gene expression analysis, stock market predictions and in pattern classification tasks that ignore class labels.
Principal Component Analysis (PCA) is a dimensionality reduction algorithm, used for speeding up learning algorithms and for making compelling visualizations of complex datasets. It identifies patterns in the data and the correlations between variables, and projects the data onto a similar (but smaller) dimensional subspace that preserves those patterns.
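A sketch of finding the first principal component: center the data, build the covariance matrix, and run power iteration to obtain its dominant eigenvector. The 2-D points are invented and lie roughly on the line y = x, so the component should point at about 45 degrees:

```python
def first_principal_component(points, iterations=100):
    """Center 2-D data, then power-iterate on the 2x2 covariance matrix."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    centered = [(x - mx, y - my) for x, y in points]
    # Covariance matrix entries
    cxx = sum(x * x for x, _ in centered) / n
    cyy = sum(y * y for _, y in centered) / n
    cxy = sum(x * y for x, y in centered) / n
    v = (1.0, 0.0)
    for _ in range(iterations):
        v = (cxx * v[0] + cxy * v[1], cxy * v[0] + cyy * v[1])  # multiply by covariance
        norm = (v[0] ** 2 + v[1] ** 2) ** 0.5
        v = (v[0] / norm, v[1] / norm)                          # renormalize
    return v

pts = [(1, 1.1), (2, 1.9), (3, 3.2), (4, 3.9), (5, 5.0)]
v = first_principal_component(pts)
print(v)   # ≈ (0.71, 0.71)
```

Projecting each centered point onto this vector gives the one-dimensional representation that retains the most variance.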

9.) Artificial Neural Networks
Essentially, deep learning networks are collectively used in a wide variety of applications such as handwriting analysis, colorization of black and white images, computer vision processes and describing or captioning photos based on visual features.
Artificial Neural Network algorithms consist of different layers which analyze data. There are hidden layers which detect patterns in data and the greater the number of layers, the more accurate the outcomes are. Neural networks learn on their own and assign weights to neurons every time their networks process data.
Convolutional Neural Networks and Recurrent Neural Networks are two popular Artificial Neural Network Algorithms.
Convolutional Neural Networks are feed-forward Neural networks which take in fixed inputs and give fixed outputs. For example – image feature classification and video processing tasks.
Recurrent Neural Networks use internal memory and are versatile, since they take in arbitrary-length sequences and use time-series information for giving outputs. Examples include language processing tasks and text and speech analysis.
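The simplest possible artificial neural network is a single neuron (perceptron) with a step activation. A sketch of the perceptron learning rule, here learning the logical AND function:

```python
def train_perceptron(data, lr=0.1, epochs=20):
    """data: list of ((x1, x2), target) with targets 0/1.
    A single neuron: weighted sum followed by a step activation."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, t in data:
            out = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = t - out                                    # perceptron learning rule:
            w = [wi + lr * err * xi for wi, xi in zip(w, x)] # nudge weights toward the target
            b += lr * err
    return w, b

# Learn the logical AND function (linearly separable, so the perceptron converges)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
outputs = [1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0 for x, _ in data]
print(outputs)   # → [0, 0, 0, 1]
```

Deep networks stack many such units into layers and replace the step with smooth activations so the whole stack can be trained by backpropagation.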

10.) K-Nearest Neighbors Algorithm
KNN algorithm is used in industrial applications in tasks such as when a user wants to look for similar items in comparison to others. It’s even used in handwriting detection applications and image/video recognition tasks.
The best way to advance our understanding of these algorithms is to try our hand in image classification, stock analysis, and similar beginner data science projects.
The K-Nearest Neighbors Algorithm is a lazy algorithm that takes a non-parametric approach to predictive analysis. If we have unstructured data or lack knowledge regarding the distribution of the data, the K-Nearest Neighbors Algorithm will come to our rescue. The training phase is pretty fast, since there is no generalization step in training. The algorithm works by finding examples similar to our unknown example and using the properties of those neighboring examples to estimate the properties of the unknown example.
The only downside is that its accuracy can be affected, since it is sensitive to outliers in the data points.
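A minimal sketch of the neighbour-voting step, with invented 2-D points labelled "cat" and "dog":

```python
def knn_predict(train, query, k=3):
    """train: list of ((x1, x2), label).
    Vote among the k nearest neighbours by squared Euclidean distance."""
    nearest = sorted(train,
                     key=lambda p: (p[0][0] - query[0]) ** 2 + (p[0][1] - query[1]) ** 2)
    votes = [label for _, label in nearest[:k]]
    return max(set(votes), key=votes.count)   # majority label

train = [((1, 1), "cat"), ((1, 2), "cat"), ((2, 1), "cat"),
         ((8, 8), "dog"), ((8, 9), "dog"), ((9, 8), "dog")]
print(knn_predict(train, (2, 2)))   # → cat
print(knn_predict(train, (7, 8)))   # → dog
```

There is no training step at all: the whole dataset is the model, which is exactly why KNN is called a lazy learner.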

Friday, 14 December 2018

Amazon Web Services (AWS) Explained

Q1) What is AWS?

AWS stands for Amazon Web Services. AWS is a platform that provides on-demand resources for hosting web services, storage, networking, databases and other resources over the internet with a pay-as-you-go pricing.

Q2)  What are the components of AWS?

EC2 – Elastic Compute Cloud, S3 – Simple Storage Service, Route53, EBS – Elastic Block Store, Cloudwatch and Key-pairs are a few of the components of AWS.

Q3)  What are key-pairs?

Key-pairs are the secure login information for your instances/virtual machines. To connect to an instance, we use a key-pair that contains a public key and a private key.

Q4)  What is S3?

S3 stands for Simple Storage Service. It is a storage service that provides an interface that you can use to store any amount of data, at any time, from anywhere in the world. With S3 you pay only for what you use and the payment model is pay-as-you-go.

Q5) What are the pricing models for EC2 instances?

The different pricing model for EC2 instances are as below,

On-demand
Reserved
Spot
Scheduled
Dedicated

Q6) What are the types of volumes for EC2 instances?

There are two types of volumes,

Instance store volumes
EBS – Elastic Block Stores

Q7) What are EBS volumes?

EBS stands for Elastic Block Stores. They are persistent volumes that you can attach to the instances. With EBS volumes, your data will be preserved even when you stop your instances, unlike your instance store volumes where the data is deleted when you stop the instances.

Q8) What are the types of volumes in EBS?

Following are the types of volumes in EBS,

General purpose
Provisioned IOPS
Magnetic
Cold HDD
Throughput optimized

Q9) What are the different types of instances?

Following are the types of instances,

General purpose
Compute Optimized
Storage Optimized
Memory Optimized
Accelerated Computing

Q10) What is auto scaling and what are its components?

Auto scaling allows you to automatically scale up and scale down the number of instances depending on CPU utilization or memory utilization. There are 2 components in Auto scaling: Auto-scaling groups and Launch Configurations.
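The scale-up/scale-down decision can be illustrated with a toy threshold policy. The thresholds and instance limits below are hypothetical; real Auto Scaling policies are configured in AWS, not written as application code:

```python
def scaling_decision(cpu_utilization, current_instances,
                     min_instances=1, max_instances=10,
                     scale_up_at=70, scale_down_at=30):
    """Toy threshold policy: add an instance above the high-CPU threshold,
    remove one below the low-CPU threshold, staying within [min, max]."""
    if cpu_utilization > scale_up_at and current_instances < max_instances:
        return current_instances + 1
    if cpu_utilization < scale_down_at and current_instances > min_instances:
        return current_instances - 1
    return current_instances

print(scaling_decision(85, 3))   # → 4 (scale up)
print(scaling_decision(20, 3))   # → 2 (scale down)
print(scaling_decision(50, 3))   # → 3 (no change)
```

An Auto Scaling group enforces exactly this kind of min/max bound, while the Launch Configuration describes what each new instance looks like.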

Q11) What are reserved instances?

Reserved instances let you reserve a fixed capacity of EC2 instances. With reserved instances, you enter into a contract of 1 year or 3 years.

Q12) What is an AMI?

AMI stands for Amazon Machine Image. AMI is a template that contains the software configurations, launch permission and a block device mapping that specifies the volume to attach to the instance when it is launched.

Q13) What is an EIP?

EIP stands for Elastic IP address. It is designed for dynamic cloud computing. When you want your instance to keep a static IP address across stops and restarts, you use an EIP address.

Q14) What is Cloudwatch?

Cloudwatch is a monitoring tool that you can use to monitor your various AWS resources: health checks, network, applications, etc.

Q15) What are the types in cloudwatch?

There are 2 types in cloudwatch. Basic monitoring and detailed monitoring. Basic monitoring is free and detailed monitoring is chargeable.

Q16) What are the cloudwatch metrics that are available for EC2 instances?

Diskreads, Diskwrites, CPU utilization, networkpacketsIn, networkpacketsOut, networkIn, networkOut, CPUCreditUsage, CPUCreditBalance.

Q17) What is the minimum and maximum size of individual objects that you can store in S3?

The minimum size of an individual object that you can store in S3 is 0 bytes, and the maximum size is 5 TB.

Q18) What are the different storage classes in S3?

Following are the types of storage classes in S3,

Standard frequently accessed
Standard infrequently accessed
One-zone infrequently accessed
Glacier
RRS – reduced redundancy storage

Q19) What is the default storage class in S3?

The default storage class in S3 is Standard frequently accessed.

 Q20) What is glacier?

Glacier is the back up or archival tool that you use to back up your data in S3.

 Q21) How can you secure the access to your S3 bucket?

There are two ways that you can control the access to your S3 buckets,

ACL – Access Control List
Bucket policies

Q22) How can you encrypt data in S3?

You can encrypt the data by using the below methods,

Server Side Encryption – S3 (AES-256 encryption)
Server Side Encryption – KMS (Key Management Service)
Server Side Encryption – C (Customer-provided keys)

Q23) What are the parameters for S3 pricing?

The pricing model for S3 is as below,

Storage used
Number of requests you make
Storage management
Data transfer
Transfer acceleration

Q24) What is the pre-requisite to work with cross region replication in S3?

You need to enable versioning on both the source bucket and the destination bucket to work with cross region replication. Also, the source and destination buckets should be in different regions.

 Q25) What are roles?

Roles are used to provide permissions to entities that you trust within your AWS account, such as users in another account. Roles are similar to users, but with roles you do not need to create a username and password to work with the resources.

 Q26) What are policies and what are the types of policies?

Policies are permissions that you can attach to the users that you create. These policies contain the access that you have granted to those users. There are 2 types of policies,

Managed policies
Inline policies

Q27) What is Cloudfront?

Cloudfront is an AWS web service that provides businesses and application developers an easy and efficient way to distribute their content with low latency and high data transfer speeds. Cloudfront is the content delivery network (CDN) of AWS.

 Q28) What are edge locations?

An edge location is the place where content is cached. When a user tries to access some content, it is first searched for in the edge location. If it is not available there, the content is served from the origin location and a copy is stored in the edge location.
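The cache-hit/cache-miss behaviour described above can be sketched as a toy in-memory cache (purely illustrative; real CloudFront edge caching is managed by AWS, including expiry and invalidation, which this sketch omits):

```python
def make_edge_location(origin):
    """Toy CDN edge: serve from the local cache; on a miss, fetch from origin and keep a copy."""
    cache = {}
    def get(path):
        if path in cache:
            return cache[path], "edge (cache hit)"
        content = origin[path]     # fetch from the origin location
        cache[path] = content      # store a copy at the edge
        return content, "origin (cache miss)"
    return get

origin = {"/logo.png": b"...image bytes..."}
edge = make_edge_location(origin)
print(edge("/logo.png")[1])   # → origin (cache miss)
print(edge("/logo.png")[1])   # → edge (cache hit)
```

The first request pays the round trip to the origin; every later request for the same path is served from the edge, which is where the latency benefit comes from.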

Q29) What is the maximum individual archive that you can store in glacier?

You can store a maximum individual archive of up to 40 TB.

 Q30) What is VPC?

VPC stands for Virtual Private Cloud. VPC allows you to easily customize your networking configuration. A VPC is a network that is logically isolated from other networks in the cloud. It allows you to have your own IP address range, subnets, internet gateways, NAT gateways and security groups.

 Q31) What is VPC peering connection?

A VPC peering connection allows you to connect one VPC with another. Instances in these VPCs behave as if they are in the same network.

 Q32) What are NAT gateways?

NAT stands for Network Address Translation. NAT gateways enable instances in a private subnet to connect to the internet while preventing the internet from initiating a connection with those instances.

Q33) How can you control the security to your VPC?

You can use security groups and NACLs (Network Access Control Lists) to control the security of your VPC.

 Q34) What are the different types of storage gateway?

Following are the types of storage gateway.

File gateway
Volume gateway
Tape gateway

Q35) What is a snowball?

Snowball is a data transport solution that uses secure appliances to transfer large amounts of data into and out of AWS. Using Snowball, you can move huge amounts of data from one place to another, which reduces your network costs and long transfer times, and also provides better security.

 Q36) What are the database types in RDS?

Following are the types of databases in RDS,

Aurora
Oracle
MySQL
PostgreSQL
MariaDB
SQL Server

Q37) What is Redshift?

Amazon Redshift is a data warehouse product. It is a fast, powerful, fully managed, petabyte-scale data warehouse service in the cloud.

 Q38) What is SNS?

SNS stands for Simple Notification Service. SNS is a web service that makes it easy to send notifications from the cloud. You can set up SNS to receive email or message notifications.

 Q39) What are the types of routing polices in route53?

Following are the types of routing policies in route53,

Simple routing
Latency routing
Failover routing
Geolocation routing
Weighted routing
Multivalue answer

Q40) What is the maximum size of messages in SQS?

The maximum size of messages in SQS is 256 KB.

Q41) What are the types of queues in SQS?

There are 2 types of queues in SQS.

Standard queue
FIFO (First In First Out)

Q42) What is multi-AZ RDS?

Multi-AZ (Availability Zone) RDS allows you to have a replica of your production database in another availability zone. Multi-AZ (Availability Zone) database is used for disaster recovery. You will have an exact copy of your database. So when your primary database goes down, your application will automatically failover to the standby database.

Q43) What are the types of backups in RDS database?

There are 2 types of backups in RDS database.

Automated backups
Manual backups, which are known as snapshots.

Q44) What is the difference between security groups and network access control lists?

Security groups:
Control access at the instance level
Rules can "allow" only
All rules are evaluated before allowing traffic
Stateful filtering
An unlimited number of security groups can be assigned

Network access control lists:
Control access at the subnet level
Rules can both "allow" and "deny"
Rules are processed in order number when allowing traffic
Up to 5 can be assigned
Stateless filtering

Q45) What are the types of load balancers in EC2?

There are 3 types of load balancers,

Application load balancer
Network load balancer
Classic load balancer

Q46) What is an ELB?

ELB stands for Elastic Load Balancing. ELB automatically distributes the incoming application traffic or network traffic across multiple targets like EC2 instances, containers and IP addresses.

 Q47) What are the two types of access that you can provide when you are creating users?

Following are the two types of access that you can create.

Programmatic access
Console access

Q48) What are the benefits of auto scaling?

Following are the benefits of auto scaling

Better fault tolerance
Better availability
Better cost management

Q49) What are security groups?

Security groups act as a firewall that controls the traffic for one or more instances. You can associate one or more security groups with your instances when you launch them. You can add rules to each security group that allow traffic to and from its associated instances. You can modify the rules of a security group at any time; the new rules are automatically and immediately applied to all the instances associated with the security group.

Q50) What are shared AMIs?

Shared AMIs are AMIs created by one developer and made available for other developers to use.

Q51) What is the difference between the Classic Load Balancer and the Application Load Balancer?

Answer: The Application Load Balancer supports dynamic port mapping and multiple listeners on multiple ports, while the Classic Load Balancer supports one listener per port.

Q52) By default, how many IP addresses does AWS reserve in a subnet?

Answer: 5

 Q53) What is meant by subnet?

Answer: A large block of IP addresses divided into smaller chunks is known as a subnet.
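Python's standard ipaddress module can illustrate this: a large block is divided into smaller subnets, and (per Q52) AWS reserves 5 addresses in each subnet, the first four and the last one:

```python
import ipaddress

# Divide a VPC-sized block into /24 subnets
vpc = ipaddress.ip_network("10.0.0.0/16")
subnets = list(vpc.subnets(new_prefix=24))
print(len(subnets))    # → 256
print(subnets[0])      # → 10.0.0.0/24

# A /24 has 256 addresses; AWS reserves 5 of them
# (network, VPC router, DNS, future use, broadcast), leaving 251 usable
subnet = subnets[0]
print(subnet.num_addresses - 5)   # → 251
```

The same module also answers membership questions, e.g. whether an instance's IP falls inside a given subnet.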

 Q54) How can you convert a public subnet to private subnet?

Answer: Remove the IGW, add a NAT Gateway, and associate the subnet with a private route table.

Q55) Is it possible to reduce an EBS volume?

Answer: No, it's not possible; we can increase a volume but not reduce it.

Q56) What is the use of an Elastic IP, and is it charged by AWS?

Answer: These are IPv4 addresses used to connect to an instance from the internet; they are charged if they are not attached to an instance.

Q57) Some data in one of my S3 buckets was deleted, but I need to restore it. Is there any possible way?

Answer: If versioning is enabled, we can easily restore it.

Q58) When I try to launch an EC2 instance I get "Service limit exceeded". How do I fix the issue?

Answer: By default AWS offers a service limit of 20 running instances per region; to fix the issue, we need to contact AWS support to increase the limit based on the requirement.

Q59) I need to modify the EBS volumes in Linux and Windows. Is it possible?

Answer: Yes, it's possible from the console: use the modify-volume option and give the size you need. Then, for Windows, go to Disk Management; for Linux, resize the filesystem and mount it to complete the modification.

Q60) Is it possible to stop an RDS instance? How can I do that?

Answer: Yes, it's possible to stop RDS instances that are non-production and not Multi-AZ.

Q61) What is meant by parameter groups in RDS, and what is their use?

Answer: Since RDS is a managed service, AWS offers a wide set of parameters in RDS as a parameter group, which can be modified as per requirement.

 Q62) What is the use of tags and how they are useful?

Answer: Tags are used for identification and grouping AWS Resources
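For example, tagging an instance and then filtering by tag with the AWS CLI (the instance ID and tag values below are made up for illustration):

```shell
# Attach identification/grouping tags to a resource
aws ec2 create-tags --resources i-0123456789abcdef0 \
    --tags Key=Environment,Value=production Key=Team,Value=data

# Find all instances carrying a given tag
aws ec2 describe-instances \
    --filters "Name=tag:Environment,Values=production"
```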

Q63) I can view the AWS Console but am unable to launch an instance; I receive an IAM error. How can I rectify it?

Answer: The IAM user does not have permission to launch instances; the required permissions must be granted before going further

Q64) I don't want my AWS account ID to be exposed to users. How can I avoid it?

Answer: In the IAM console you can set an account alias, which replaces the account ID in the sign-in URL

Q65) By default, how many Elastic IP addresses does AWS offer?

Answer: 5 Elastic IPs per region

Q66) You have enabled sticky sessions on an ELB. What does it do with your instances?

Answer: It binds a user's session to a specific instance

Q67) Which type of load balancer makes routing decisions at either the transport layer or the application layer, and supports both EC2-Classic and VPC?

Answer: Classic Load Balancer

Q68) Which virtual network interface can you attach to an instance in a VPC?

Answer: Elastic Network Interface

Q69) You have launched a Linux instance in AWS EC2. While configuring the security group you selected the SSH, HTTP and HTTPS protocols. Why do we need to select SSH?

Answer: To ensure there is a rule that allows SSH traffic from your computer to the EC2 instance

Q70) You have a Windows instance on EC2-Classic and you want to make some changes to the security group. How will these changes take effect?

Answer: Changes are applied automatically and immediately to the instances

Q71) Load balancing and DNS services come under which type of cloud service?

Answer: IaaS (Infrastructure as a Service)

Q72) You have an EC2 instance with an unencrypted volume, and you want to create an encrypted volume from it. Which of the following steps can achieve this?

Answer: Create a snapshot of the unencrypted volume, copy the snapshot applying encryption parameters, and create a volume from the copied snapshot
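Those steps can be sketched with the AWS CLI (the IDs, region, and Availability Zone below are placeholders):

```shell
# 1. Snapshot the unencrypted volume
aws ec2 create-snapshot --volume-id vol-0123456789abcdef0

# 2. Copy the snapshot with encryption enabled
aws ec2 copy-snapshot --source-region us-east-1 \
    --source-snapshot-id snap-0123456789abcdef0 --encrypted

# 3. Create a new, encrypted volume from the copied snapshot
aws ec2 create-volume --snapshot-id snap-0fedcba9876543210 \
    --availability-zone us-east-1a
```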

Q73) Where does the user specify the maximum number of instances with the Auto Scaling commands?

Answer: In the Auto Scaling group configuration (its maximum capacity setting)

Q74) What are the types of AMI provided by AWS?

Answer: Instance store-backed and EBS-backed

Q75) After configuring an ELB, you need to ensure that user requests are always attached to a single instance. What setting can you use?

Answer: Sticky sessions

Q76) When would I prefer Provisioned IOPS over standard RDS storage?

Ans: When you have batch-oriented workloads that need fast, consistent I/O throughput

Q77) If I am running a Multi-AZ deployment of my DB instance, can I use the standby DB instance for read or write operations along with the primary DB instance?

Ans: No. The standby cannot serve reads or writes; it only takes over if the primary DB instance fails

Q78) Which AWS service would you use to collect and process e-commerce data for near-real-time analysis?

Ans: Amazon DynamoDB

Q79) A company is deploying a new two-tier web application in AWS. The company has limited staff and requires high availability, and the application requires complex queries and table joins. Which configuration provides a solution for the company's requirements?

Ans: A relational database on RDS (for example MySQL) with a Multi-AZ deployment, since complex queries and table joins rule out DynamoDB

Q80) Which use cases are suitable for Amazon DynamoDB?

Ans: Storing metadata for Amazon S3 objects; it is not suited to relational joins or complex multi-row updates

Q81) Your application retrieves data from your users' mobile devices every 5 minutes and stores it in DynamoDB. Every day at a particular time the data is extracted into S3 on a per-user basis, and your application later uses it to visualize the data for each user. You are asked to optimize the architecture of the backend system to lower cost. What would you recommend?

Ans: Introduce Amazon ElastiCache to cache reads from the DynamoDB table and reduce the provisioned read throughput

Q82) You are running a website on EC2 instances deployed across multiple Availability Zones, with a Multi-AZ RDS MySQL Extra Large DB instance. The site performs a high number of small reads and writes per second and relies on an eventual-consistency model. After comprehensive tests you discover that there is read contention on RDS MySQL. Which approaches best meet these requirements?

Ans: Deploy an ElastiCache in-memory cache running in each Availability Zone, then increase the RDS MySQL instance size and implement Provisioned IOPS

Q83) A startup is running a pilot deployment of around 100 sensors measuring street noise and air quality in urban areas for 3 months. Every month, around 4 GB of sensor data is generated. The company uses a load-balanced, auto-scaled layer of EC2 instances and an RDS database with 500 GB of standard storage. The pilot was a success and they now want to deploy at least 100K sensors, which the backend needs to support; the data must be stored for at least 2 years for analysis. Which of the following setups would you prefer?

Ans: Replace the RDS instance with a 6-node Redshift cluster with 96 TB of storage

Q84) Suppose you have an application that has to render images and also do some general computing. Which service will best fit your needs?

Ans: Use an Application Load Balancer (it can route each workload to a different target group)

Q85) How will you change the instance type for instances running in your application tier that are using Auto Scaling? Where will you change it?

Ans: In the Auto Scaling launch configuration

Q86) You have a content management system running on an Amazon EC2 instance that is approaching 100% CPU utilization. Which option will reduce the load on the Amazon EC2 instance?

Ans: Create a load balancer and register additional Amazon EC2 instances with it

Q87) What does connection draining do?

Ans: It re-routes traffic away from instances that are due to be updated or have failed a health check, letting in-flight requests complete first

Q88) When an instance is unhealthy it is terminated and replaced with a new one. Which service does that?

Ans: Auto Scaling (this is how it provides fault tolerance)

Q89) What are lifecycle hooks used for in Auto Scaling?

Ans: They are used to add extra wait time before scale-in or scale-out events complete, so custom actions can run

Q90) A user has set up an Auto Scaling group. Due to some issue, the group has failed to launch a single instance for more than 24 hours. What will happen to Auto Scaling in this condition?

Ans: Auto Scaling will suspend the scaling process

Q91) You have an EC2 Security Group with several running EC2 instances. You changed the Security Group rules to allow inbound traffic on a new port and protocol, and then launched several new instances in the same Security Group. When do the new rules apply?

Ans: Immediately, to all the instances in the Security Group.

Wednesday, 21 November 2018

The best android app for book readers



The best Android app for book readers on mobile devices is Moon+ Reader.

It has features that no other app can match, and supports epub, pdf, mobi, chm, cbr, cbz, umd, fb2, txt, html, rar, zip and OPDS. Key features:

✔ Full visual options: line space, font scale, bold, italic, shadow, justified alignment, alpha colors, fading edge etc.
✔ 10+ themes embedded, includes Day & Night mode switcher.
✔ Various types of paging: touch screen, volume keys or even camera, search or back keys.
✔ 24 customized operations (screen click, swipe gesture, hardware keys), apply to 15 customized events: search, bookmark, themes, navigation, font size and more.
✔ 5 auto-scroll modes: rolling blind mode; by pixel, by line or by page. Real-time speed control.
✔ Adjust the brightness by sliding your finger along the left edge of the screen, gesture commands supported.
✔ Intelligent paragraph; indent paragraph; trim unwanted blank spaces options.
✔ “Keep your eyes health” options for long-time reading.
✔ Real page turning effect with customized speed/color/transparent; 5 page flip animations;
✔ My Bookshelf design: Favorites, Downloads, Authors, Tags; self bookcover, search, import supported.
✔ Justified text alignment, hyphenation mode supported.
✔ Dual page mode for landscape screen.
✔ Support all four screen orientations.
✔ EPUB3 multimedia content support (video and audio)
✔ Backup/Restore options to cloud via DropBox, sync reading positions between phones and tablets.
✔ Highlight, Annotation, Dictionary (Offline or Online, support ColorDict, GoldenDict, Fora, ABBYY Lingvo, etc.), Translation, Share functions all in this ebook reader.

-Localized in 40 languages.

Download link https://play.google.com/store/apps/details?id=com.flyersoft.moonreader

For Apple users, the official Books app is still the best choice.

Wednesday, 31 October 2018

Introduction to Linux

Linux

Open source operating system
Windows and OS X are mainly used on Desktops
Used not only on personal computers, but web servers, electronics
Android is a variation of Linux (they removed some stuff and customized it for mobile)

      Tired of privacy issues in Windows/OS X?
      Learn to manage web servers or build electronics/robots?

Different “versions” of Linux, one of the most popular is Ubuntu

1.    Download Ubuntu
2.    Install VMware Player for Windows 64-bit
a.    Remove all USB drives from computer
3.    If you are installing on other computer, burn ISO to DVD
4.    Create a new virtual machine
a.    Select ISO
                    i.     Ubuntu
                   ii.     bucky
                 iii.     bacon123
b.    Store virtual disk into a single file
c.     Customize Hardware > Memory: 2-4 GB (2048 MB)
                                           i.        too much will take away from your host OS
5.    Finish & Play

Ctrl+Alt to switch back to the main (host) OS



Basic architecture of Linux
simpleDiagram.png

Bottom layer is hardware
      1’s and 0’s

Next layer is the kernel
      Core operating system
      Software that tells hardware directly what to do

Top layer are Applications or user processes
      Programs that you use and make
      Browser, games, Text editor…
      When making programs, you don’t need to tell CPU how to work



Kernel decides which app is allowed to use the CPU at any time
Manages memory for each app
Computer with a single core CPU, it can appear several apps are running at the same time
What’s actually happening is each app uses CPU for small amount of time, process pauses, and then another one does
CPU can switch between processes so fast that, to humans, they appear to run simultaneously
Kernel manages all of these operations (also for multi-core CPU’s)

Kernel also gives each app its own chunk of memory
That way, programs don’t mess with each other when running



Basic Commands

Shell window (or Terminal) is a program where you can type commands

Search computer > Terminal > Open
Right Click > Lock to Launcher

Right click > Profile > Profile Preferences
Cursor shape > I-beam

 

You can also right click and unlock things you don’t want


name@host:path$


      Make a file called Story on the Desktop
      Add some text in it

Display current working directory

   pwd


Display the contents of a file

   cat Desktop/Story

 

List all contents in directory (and detailed list)

-a shows dot files (hidden files) usually used as configuration files

   ls

  ls -l

  ls -a

   ls -la




Navigating

Move directories (parent, child, home)

  cd ..

  cd Other

   cd


Create and delete directories

  mkdir Other

  mkdir Tuna

   rmdir Tuna




Working with Files

Create a file. If it already exists, its contents are unchanged (only the timestamp is updated)

   touch Story

   touch Bacon


Delete (remove) a file

   rm Bacon


Copy a file

   cp Story Story2


Move a file

   mv Story2 Other/Story2


You can also rename a file using

   mv Story Tuna


Display content

   echo hey now




Other Commands

Search - print lines from Story that include the word bacon

   grep bacon Story
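For instance, assuming Story contains these made-up lines:

```shell
# Create a sample file (contents invented for the demo)
printf 'I like bacon\nI like tuna\nbacon is tasty\n' > Story

# grep prints only the lines containing "bacon"
grep bacon Story
```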


See the difference between two files
   diff file1 file2

To change password
   passwd

You can use variables to store info

  NAME=Bucky

   echo $NAME


Clear the terminal screen (shell variables and command history are still kept)

   clear

   echo $NAME


Learn more about a command

   info echo




Shell Tips
You can use the up arrow to cycle through previous commands

Editors
When working on servers or over a network, you usually don’t have GUI text editors, so you use terminal editors such as nano or vim



You can redirect the output of a command to a file using

   ls > Crap


This overwrites the file if it already exists

   pwd > Crap


To append

   pwd >> Crap
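
A quick sketch of the difference between `>` and `>>` (using a throwaway file in /tmp):

```shell
# ">" truncates and overwrites; ">>" appends
echo first  > /tmp/Crap
echo second > /tmp/Crap     # file now contains only "second"
echo third >> /tmp/Crap     # "third" is added after "second"
cat /tmp/Crap               # prints: second, then third
```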


For a list of programs (processes) running

   ps




Permissions

File permissions

   ls -l


permissions.jpg

r = readable
w = writable
x = executable

First is user permissions (who owns the file)
Second is permissions for group members (more on groups later)
Last are permissions for anyone (global)

To change permissions

chmod options permissions filename


Give group permission to read the file (to remove use - instead of +)

   chmod g+r Tuna


Set permissions for:
      u - user
      g - group members
      o - other people (from the outside world)

chmod u=rwx,g=rx,o=r myfile


Easier format

   chmod 754 filename


Here the digits 7, 5, and 4 each individually represent the permissions for the user, group, and others, in that order.
      4 stands for "read"
      2 stands for "write"
      1 stands for "execute"
      0 stands for "no permission"

7 is the combination of permissions 4+2+1 (read, write, and execute)
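
You can check the arithmetic with `stat`, which prints the octal mode back (the `-c` format flag is GNU coreutils; on macOS/BSD the equivalent is `stat -f '%Lp'`):

```shell
# Set mode 754 (user rwx = 4+2+1, group rx = 4+1, others r = 4)
# and verify it with stat
touch /tmp/myfile
chmod 754 /tmp/myfile
stat -c '%a %A' /tmp/myfile    # prints: 754 -rwxr-xr--
```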



Compress files

   gzip Crap


Decompress them

   gunzip Crap.gz


For multiple files

tar cvf archive.tar file1 file2

   tar cvf Sample.tar Story Crap


      c = create mode
      v = displays output in terminal (leave it out if you don’t want to display file names)
      f = file options (argument after this must be file name of tar)

To extract files

   tar xvf Sample.tar


x = extract mode
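
The whole cycle can be sketched end to end in a temporary directory: archive two files, delete the originals, then restore them from the tar:

```shell
# Round trip: create, archive, delete, extract
cd "$(mktemp -d)"
echo hello > Story
echo world > Crap
tar cf Sample.tar Story Crap    # archive both files
rm Story Crap                   # originals are gone...
tar xf Sample.tar               # ...and restored from the archive
cat Story Crap                  # prints: hello, then world
```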



Linux Directory Overview

      Click “Files” icon on launcher bar
      Choose “Computer” on left

Verify Checksum

      Download file
      Navigate to that directory in Terminal

    algorithm filename

   md5sum ubuntu-11.10-dvd-i386.iso

   sha1sum ubuntu-14.04-server-amd64.iso

      Compare this against the official checksum (check it from more than one source or computer)

   sha256sum ubuntu-14.04-server-amd64.iso | grep c7bf55250ca7a7ad897fd219af6ef3d4768be54fb3e2537abb3da8f7f4ed8913

      If grep prints the line with the hash highlighted (in red/color), the checksums match; no output means they differ
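
The same pipe trick can be tried on a known input: the SHA-256 of the string "hello" is well known, so grep finds it and echoes the matching line:

```shell
# Hash the string "hello" (printf avoids adding a trailing newline)
# and grep for the expected digest
printf 'hello' | sha256sum | grep 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
# grep prints the line (match highlighted) if the hashes agree,
# and prints nothing if they don't
```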



      Make new profiles
      Change background transparency




Installing Stuff

apt - Advanced Package Tool (the built-in package manager on Debian/Ubuntu systems)

# Refresh the package lists

sudo apt-get update

# Check whether Java is already installed

java -version

sudo apt-get install default-jre


When you download something

      Download from internet
      Extract file

# Navigate to bin and run shell script

cd

cd pycharm-xxx/bin

bash pycharm.sh




Users

      A user is anyone who uses the computer
      Most users have restricted access to what they can do
      This is usually a good thing, keeps them from deleting important system files
      There is always a super user named “root” that can do anything
      Regular users are sometimes able to perform commands they usually couldn’t by running the command as the super user

    sudo cat /etc/shadow


Note: Even if you are the owner of a server, you usually log in as a non-super user. This helps prevent you from accidentally doing anything bad.

Every file on Linux is owned by a User and a Group

cd

cd Desktop

ls -la


permissions | owned by user | group


# To add a new user

sudo useradd mom

sudo passwd mom




Groups

By default, whenever a user is created they belong to a group with the same name

# Create a new group

sudo groupadd girls

sudo usermod -a -G girls mom


-g will edit their primary group
-a -G just adds (appends) them to another group (keeps their primary)
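
You can confirm a user's group memberships with `id -Gn USER`; after the usermod above, `id -Gn mom` would include `girls`. Sketched here with the current user, since useradd/usermod need root:

```shell
# "id -Gn" lists every group the given (or current) user belongs to
id -Gn "$(whoami)"
```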

To delete a user

sudo userdel mom




Linux Directory Overview

# Go to home

cd


# Go up into the root directory

cd ../..

 

ls

 

# Most of your configuration files are in here

cd etc

cat passwd


These are all the users for the system

login name | password | user ID | group ID | real name | home directory | shell path


Password symbols
x  - password stored in the separate shadow file (hashed, never stored in clear text)
*  - can’t log in
::  - empty password field, no password required to log in
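
Since the fields are colon-separated, awk can pull out individual columns. A sketch using a sample line rather than the real /etc/passwd:

```shell
# Print the login name, user ID, and shell (fields 1, 3, and 7)
# from a passwd-style line, splitting on ":"
echo 'root:x:0:0:root:/root:/bin/bash' | awk -F: '{print $1, $3, $7}'
# prints: root 0 /bin/bash
```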

# won’t work

cat shadow


sudo cat shadow

cat passwd


      If you ever want to add a new user manually, just add a new line (or delete to remove user) from this file
      Need to be super user (sudo) to edit
      There are easier ways to do this, and you can just add users from the command line, but this is what happens behind the scenes



Groups

Groups are a handy way you can set permissions for a bunch of users at once

cat group


group name | password | group ID | additional members
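
The group file is colon-separated too, so `cut` can extract fields; a sketch with a sample line matching the group created earlier:

```shell
# Pull out the group name and member list (fields 1 and 4)
echo 'girls:x:1001:mom' | cut -d: -f1,4    # prints: girls:mom
```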




SSH

      If you want to connect to a computer remotely (from another computer) then you can use SSH
      Buy a server online and they will send you default login information
      By default we log in with a password, but passwords can be guessed or brute-forced
      Instead we can use an SSH key (practically impossible to crack)

First, we need to login to the server with the default credentials:

ssh root@host


It will make us change the default password; just pick a normal password for now

Close the terminal; that’s all we needed to do for now



On our own computer, generate our keys (we will have private key on our computer, and put public key on server)

ssh-keygen -t rsa

Press Enter twice (you can protect the key with a passphrase if you want)

# Add our public key to the server

ssh-copy-id root@host


      Type your password to verify it’s you and then your public key will be added
      Close the Terminal and log back in using

ssh root@host


We don’t need a password anymore because the private key saved on our computer is verified against the public key stored on the server



For extra security, disable the password login (only our SSH key has access)

sudo nano /etc/ssh/sshd_config


# Change this line

PermitRootLogin yes

# to

PermitRootLogin without-password


CTRL+X > y > Enter

# Put changes into effect

sudo service ssh reload


If anyone else tries to log in now (even if they guess the password right) it will say access denied



SFTP

sftp root@104.236.7.12


# Go to the root directory on the server

cd ../..

ls -la


# The homepage for your website is in this directory

cd var/www/html

ls -la




To upload files to the server

# Move back into html

cd html

put Desktop/index.html

Now open a browser and visit the server’s IP address to view the page

# To upload a whole directory

mkdir Books

put -r Desktop/Books





To Download Files

# Download a file from the server

get remoteFile localFile

get index.html Desktop/index.html


# To download an entire directory (and all contents)

cd ..

ls

get -r html Desktop





