Quadratic discriminant analysis (QDA) is a machine learning classification technique and an extension of linear discriminant analysis (LDA). Like LDA, the QDA classifier results from assuming that the observations from each class are drawn from a Gaussian distribution, and plugging estimates for the parameters into Bayes' theorem in order to perform prediction. The setting is the usual one: we have a set of samples, called the training set, with observations x (also called features, attributes, variables or measurements) and known class labels y, and the classification problem is to find a good predictor for the class y of any sample of the same distribution (not necessarily from the training set) given only an observation x. Discriminant analysis in general is used to determine which variables discriminate between two or more naturally occurring groups, and it may have a descriptive or a predictive objective.

LDA assumes that the groups have equal covariance matrices: the covariance matrix is common to all K classes, Cov(X) = Σ of shape p×p. Since x follows a multivariate Gaussian distribution within each class, the probability \(p(X=x \mid Y=k)\) is given by

\(f_k(x)=\frac{1}{(2\pi)^{p/2}|\Sigma|^{1/2}}\exp\left(-\frac{1}{2}(x-\mu_k)^T\Sigma^{-1}(x-\mu_k)\right)\)

where \(\mu_k\) is the mean of the inputs for category k. QDA is a modification of LDA that does not assume equal covariance matrices amongst the groups \((\Sigma_1, \Sigma_2, \cdots, \Sigma_K)\). Consequently, the probability distribution of each class is described by its own mean vector and its own variance-covariance matrix, and the covariance matrix \(\Sigma_k\) is estimated separately for each class k = 1, 2, ..., K. As in LDA, an observation is classified into the group to which it has the least squared distance, but because the class covariances differ, the decision boundary resulting from the QDA method is a curved, quadratic surface rather than a straight line.
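Several of the fragments on this page reference fitting QDA to the Fisher iris data; as a minimal, self-contained sketch of that workflow with scikit-learn's QuadraticDiscriminantAnalysis (the estimator and loader are real scikit-learn APIs; the variable names are ours):

    from sklearn.datasets import load_iris
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

    # 150 iris samples, p = 4 features, K = 3 classes
    X, y = load_iris(return_X_y=True)

    # One Gaussian per class, each with its own covariance matrix Sigma_k
    qda = QuadraticDiscriminantAnalysis().fit(X, y)

    print(qda.predict(X[:5]))  # predicted class labels for the first five rows
    print(qda.score(X, y))     # training accuracy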
How do we estimate the covariance matrices separately? For each class k we use only the training observations belonging to that class: the class proportion estimates the prior \(\pi_k\), the class sample mean estimates \(\mu_k\), and the class sample covariance estimates \(\Sigma_k\). This is the key difference from LDA: there, once we had the summation over the data points in every class, we had to pool all the classes together to estimate a single covariance matrix. In QDA we don't do this, and that pooling is exactly what made LDA linear. When the variances of the X's are different in each class, the magic of cancellation doesn't occur: the quadratic terms in the discriminant no longer cancel between classes, and because there is no cancellation of variances, the discriminant functions retain a quadratic distance term as well as a determinant term that comes from the class covariance matrix.
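A minimal numpy sketch of these per-class plug-in estimates; the function name and the dictionary layout are our own choices, not any library's API:

    import numpy as np

    def estimate_qda_params(X, y):
        """Per-class plug-in estimates pi_k, mu_k, Sigma_k."""
        params, n = {}, len(y)
        for k in np.unique(y):
            Xk = X[y == k]
            params[k] = {
                "prior": len(Xk) / n,             # pi_k: class proportion
                "mean": Xk.mean(axis=0),          # mu_k: class mean vector
                "cov": np.cov(Xk, rowvar=False),  # Sigma_k: class sample covariance
            }
        return params

With K classes and p features this estimates K times p(p+1)/2 covariance parameters, which is the parameter growth discussed further below.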
More specifically, for quadratic discriminant analysis the class-conditional density \(P(x \mid y)\) is modeled as a multivariate Gaussian:

\(P(x \mid y=k)=\frac{1}{(2\pi)^{d/2}|\Sigma_k|^{1/2}}\exp\left(-\frac{1}{2}(x-\mu_k)^T\Sigma_k^{-1}(x-\mu_k)\right)\)

where d is the number of features. Taking logs and dropping terms that do not depend on the class yields the quadratic discriminant function for the kth class,

\(\delta_k(x)= -\frac{1}{2}\log|\Sigma_k|-\frac{1}{2}(x-\mu_{k})^{T}\Sigma_{k}^{-1}(x-\mu_{k})+\log\pi_k\)

and the classification rule is similar to LDA's: you just find the class k which maximizes the quadratic discriminant function,

\(\hat{G}(x)=\text{arg }\underset{k}{\text{max }}\delta_k(x)\).

This quadratic discriminant function is very much like the linear discriminant function except that, because the covariance matrix \(\Sigma_k\) is not identical across classes, you cannot throw away the quadratic terms, and a determinant term \(-\frac{1}{2}\log|\Sigma_k|\) appears that was constant in LDA. Quadratic discriminant analysis was introduced by Smith (1947). LDA and QDA are otherwise quite similar: both assume that the observations within each class of Y are drawn from a multivariate normal distribution, but QDA admits different dispersions for the different classes, so there is no assumption that the covariance of each of the classes is identical.
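With those estimates in hand, \(\hat{G}(x)\) can be implemented directly; the sketch below reuses the hypothetical estimate_qda_params output from above and is an illustration, not any library's implementation:

    import numpy as np
    from numpy.linalg import slogdet, solve

    def qda_predict(X, params):
        """Assign each row of X to arg max_k delta_k(x)."""
        classes, scores = sorted(params), []
        for k in classes:
            p = params[k]
            diff = X - p["mean"]
            # Row-wise Mahalanobis term (x - mu_k)^T Sigma_k^{-1} (x - mu_k)
            maha = np.sum(diff * solve(p["cov"], diff.T).T, axis=1)
            _, logdet = slogdet(p["cov"])  # log |Sigma_k|
            scores.append(-0.5 * logdet - 0.5 * maha + np.log(p["prior"]))
        return np.asarray(classes)[np.argmax(scores, axis=0)]

On the iris arrays from the first sketch, qda_predict(X, estimate_qda_params(X, y)) should agree closely with the scikit-learn predictions.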
The discriminant functions \(\delta_k\) are therefore quadratic functions of x. As a concrete two-class example, suppose the plug-in estimates from a training set are

\(\hat{\pi}_0=0.651, \hat{\pi}_1=0.349\)

\(\hat{\mu}_0=(-0.4038, -0.1937)^T, \hat{\mu}_1=(0.7533, 0.3613)^T\)

together with separate covariance estimates \(\hat{\Sigma}_0\) and \(\hat{\Sigma}_1\), one per class. The curved line in the resulting plot is the decision boundary produced by the QDA method, the set of points on which the posteriors of the two classes are equal. Quadratic discriminant analysis predicts the same group membership as LDA for most of this data: the area where the two decision boundaries differ contains only a small percentage of the observations, because most of the data is massed on the left, so for most of the data the extra flexibility doesn't make any difference. The within-training-data classification error rate here is 29.04%, and the difference in error rate between LDA and QDA is very small; sensitivity for QDA is the same as that obtained by LDA, but specificity is slightly lower.

In R, the MASS function qda returns, among other components: means, the group means; scaling, an array in which scaling[,,i] transforms the observations of group i so that the within-group covariance matrix is spherical; and ldet, a vector of half log determinants of the dispersion matrices.
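The error rate, sensitivity, and specificity quoted above can be computed from predictions with a few lines of numpy; this is a generic sketch, and the convention of which label counts as positive is an assumption:

    import numpy as np

    def error_rate(y_true, y_pred):
        """Fraction of misclassified observations."""
        return float(np.mean(y_true != y_pred))

    def sensitivity_specificity(y_true, y_pred, positive=1):
        """Two-class sensitivity and specificity for a chosen positive label."""
        is_pos = y_true == positive
        sensitivity = float(np.mean(y_pred[is_pos] == positive))
        specificity = float(np.mean(y_pred[~is_pos] != positive))
        return sensitivity, specificity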
QDA is a little bit more flexible than LDA, in the sense that it does not assume the equality of the variance/covariance matrices, and when its distributional assumptions hold it approximates the Bayes classifier very closely. The price is that the number of parameters increases significantly: each of the K class covariance matrices contributes p(p+1)/2 parameters, so if you have many classes and not so many sample points, estimation can be a problem, and quadratic discriminant analysis is most attractive when the number of variables is small. A simple model sometimes fits the data just as well as a complicated model; even if the simple model doesn't fit the training data as well as a complex model, it still might be better on the test data because it is more robust. For this reason LDA tends to be better than QDA when you have a small training set. Finally, regularized discriminant analysis (RDA) is a compromise between LDA and QDA, shrinking each class covariance estimate toward a pooled one.
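A sketch of that compromise in the style of Friedman's RDA, blending each class covariance with a pooled one (the helper name and arguments are hypothetical); scikit-learn's QuadraticDiscriminantAnalysis offers a related reg_param option that instead shrinks each class covariance toward a scaled identity:

    def regularize_covariances(params, pooled_cov, alpha):
        """RDA-style blend: alpha = 1 recovers QDA, alpha = 0 recovers LDA."""
        out = {}
        for k, p in params.items():
            out[k] = {**p, "cov": alpha * p["cov"] + (1 - alpha) * pooled_cov}
        return out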
The decision boundaries of QDA are quadratic equations in x; the model fits a Gaussian density to each class, and because it allows more flexibility for the covariance matrix it tends to fit the data better than LDA, at the cost of more parameters to estimate. Several tools implement the method directly. scikit-learn provides QuadraticDiscriminantAnalysis (new in version 0.17); in MATLAB you can interactively train a discriminant analysis model using the Classification Learner app, or use fitcdiscr in the command-line interface for greater flexibility, for example to perform linear and quadratic classification of the Fisher iris data; RapidMiner Studio Core offers a QDA operator for nominal labels and numerical attributes; and R's MASS package provides qda, whose results on the iris dataset can be plotted with ggplot2. There are also Discriminant Analysis data analysis tools in which prior probabilities must be inserted as an explicit range into the Priors Range field of the dialog box.

QDA also remains an active research topic. The Cross-view Quadratic Discriminant Analysis (XQDA) method shows some of the best performances in the person re-identification field, and it motivated Tensor Cross-view Quadratic Discriminant Analysis (TXQDA), which analyzes the multifactor structure of face images related to kinship, age, gender, expression, illumination and pose. Because estimation of the Gaussian parameters can be ill-posed in high dimensions, a procedure named DA-QDA has been proposed for QDA on high-dimensional data, and there are theoretical and algorithmic contributions to Bayesian estimation for quadratic discriminant analysis in which a distribution-based Bayesian classifier is derived using information geometry.
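One way to see the flexibility-versus-robustness trade-off in practice is to compare the two classifiers under cross-validation; a sketch with scikit-learn, where the choice of the iris data and of 5 folds is ours:

    from sklearn.datasets import load_iris
    from sklearn.discriminant_analysis import (
        LinearDiscriminantAnalysis,
        QuadraticDiscriminantAnalysis,
    )
    from sklearn.model_selection import cross_val_score

    X, y = load_iris(return_X_y=True)
    for model in (LinearDiscriminantAnalysis(), QuadraticDiscriminantAnalysis()):
        scores = cross_val_score(model, X, y, cv=5)
        print(type(model).__name__, round(scores.mean(), 3))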
In summary, LDA and QDA can both be derived for binary and for multiple classes, and both assume that the observations from each class come from a multivariate normal distribution, classifying a new observation into the class that maximizes its discriminant score. QDA's class-specific covariance matrices buy quadratic rather than linear decision boundaries, at the cost of many more parameters to estimate, so the choice between the two methods is a bias-variance trade-off governed largely by how much training data is available.
