Thursday, October 31, 2019
Search Engine Optimization Research Paper Example | Topics and Well Written Essays - 750 words
Search Engine Optimization - Research Paper Example

This means that white hats use terms appropriately to ensure that their websites are visible. Black hats, on the other hand, fill their websites with content or words that get plenty of hits from search engines. Compared with black hats, white hats have distinct characteristics that set the two apart, as one of them uses unscrupulous means and makes attractive promises (Creative Momentum 1).

White hat search engine optimization follows the regulations and guidelines provided by search engines to the letter, to ensure that websites are visible to everyone who needs information from them. This includes careful use of words that get numerous hits from search engines, so that the words used are not meant to bloat documents or articles on the website. Instead, the words used are informative and, at times, act as support words for the whole article rather than as a way of exploiting the abilities of search engines. As such, websites are designed with certain aspects in mind: rules and norms, as well as a code of ethics, to ensure that all internet activities are conducted with transparency. Following the guidelines of search engines for optimizing visibility therefore keeps a site transparent in the face of the ever-changing algorithms applied by search engines (Revenuewire 2). This shields legitimate sites from relegation, and such sites earn high ratings by meeting the terms of use on multiple levels. White hat optimizers and designers appeal to the intelligence and needs of internet users and deliver quality content that does not fool search engines with false information, including in the titles of the websites. In relation to domain naming, white hat optimizers let the actual intent of the website show rather than mislead the search engine user into visiting it for irrelevant content. These are the distinct characteristics of "white hat" search engine optimization: its compliant nature offers protection from removal and relegation.

Black hat search engine optimizers, on the other hand, have their own code of ethics, or lack thereof, that sees them dump the rulebook and carry the day, but only for a limited moment (Smith 5). Black hat optimization involves the use of hidden content, with numerous links, articles and documents spammed with keywords that serve as crucial gateway points for search engines. Search engines access the site because of its heavy saturation with the words that internet users most often use to query information. The information is usually hidden in script and non-script parts of a website's code, where search engines only expose it as their algorithms crawl the internet. In addition, black hat optimization tends to involve meta-keyword stuffing, where the description of the website, or even its headers, contains numerous keywords (Wilding). These words raise flags for search engines and thus improve the ranking of a website, especially for marketing and for sites that sell content to generate revenue. Through this, it is evident that black hat optimizers are only
Tuesday, October 29, 2019
Relationship Marketing Through a Number of Pathways Research Paper
Relationship Marketing Through a Number of Pathways - Research Paper Example

Citigroup's relationship managers have realized that the Company meets all three of these aspects. First, the Company offers both investment and commercial services, which means that clients have many services at their disposal. These include mortgages, priority banking for high-net-worth clients, loans, investment banking, telephone banking and card products. Secondly, clients in the banking sector are in continuous need of these services. Lastly, customers who decide to do business with the Company normally select one service or product. In the case of Citigroup, some customers come strictly to obtain loans, while others would like to save their money there yet access it conveniently when the need arises (commercial services and care facilities).

The organization has implemented relationship marketing through consumer tracking. Since Citigroup is B2B backed, it was able to use a comprehensive database to analyze consumer tastes and preferences. It has been a leader in business communication with the client. Citigroup has been asking its clients about their thoughts on the institution. It found that certain services were preferred over others. It also realized that consumers kept coming back for certain products during particular seasons. It was able to establish a pattern and focused its energies on products that gave it maximum returns. This was topped up by improving services that clients were dissatisfied with. One such service was the provision of housing loans: the Company found that many clients were unhappy with the rate of loan processing.
Sunday, October 27, 2019
Wavelet Packet Feature Extraction And Support Vector Machine Psychology Essay
Wavelet Packet Feature Extraction And Support Vector Machine Psychology Essay

ABSTRACT - The aim of this work is the automatic classification of electroencephalogram (EEG) signals using statistical feature extraction and a support vector machine. From a real database, two sets of EEG signals are used: EEG recorded from a healthy person and from an epileptic person during epileptic seizures. Three important statistical features are computed at different sub-bands of the discrete wavelet and wavelet packet decompositions of the EEG recordings. In this study, to select the best wavelet for our application, five wavelet basis functions are considered for processing the EEG signals. After reducing the dimension of the obtained data by linear discriminant analysis and principal component analysis, the feature vectors are used to model and train an efficient support vector machine classifier. In order to show the efficiency of this approach, the statistical classification performances are evaluated, and a rate of 100% for the best classification accuracy is obtained and compared with those obtained in other studies for the same data set.

Keywords - EEG; Discrete Wavelet Transform; Wavelet Packet Transform; Support Vector Machine; Statistical analysis; classification.

1. INTRODUCTION

In neurology, the electroencephalogram (EEG) is a non-invasive test of brain function that is mostly used for the diagnosis and classification of epilepsy. Epileptic episodes are the result of excessive electrical discharges in a group of brain cells. Epilepsy is a chronic neurological disorder of the brain that affects over 50 million people worldwide, and in developing countries three fourths of people with epilepsy may not receive the treatment they need [1]. In clinical decisions, the EEG is related to the initiation of therapy to improve the quality of life of epileptic patients. However, EEG signals occupy a huge volume, and the scoring of long-term EEG recordings by visual inspection, in order to classify epilepsy, is usually a time-consuming task. Therefore, many researchers have addressed the problem of automatic detection and classification of epileptic EEG signals [2, 3]. Different studies have shown that the EEG signal is a non-stationary process, and non-linear features are extracted from brain activity recordings in order to capture specific signal characteristics [2, 4, 5, 6]. These features are then used as input to classifiers [11]. Subasi in [7] used the discrete wavelet transform (DWT) coefficients of normal and epileptic EEG segments in a modular neural network called a mixture of experts. For the same EEG data set, Polat and Günes [8] used feature reduction methods including DWT, autoregressive modelling and the discrete Fourier transform. In Subasi and Gursoy [9], the dimensionality of the DWT features was reduced using principal component analysis (PCA), independent component analysis (ICA) and linear discriminant analysis (LDA). The resultant features were used to classify normal and epileptic EEG signals using a support vector machine. Jahankhani, Kodogiannis and Revett [10] obtained feature vectors from EEG signals by DWT and performed the classification with a multilayer perceptron (MLP) and a radial basis function network. The wavelet packet transform (WPT) appears to be one of the most promising methods, as shown by a great number of works in the literature [11], particularly for ECG signals and, relatively fewer, for EEG signals.
In [12], Wang, Miao and Xie used a wavelet packet entropy method to extract features and a K-nearest neighbor (K-NN) classifier. In this work, both DWT and WPT split the non-stationary EEG signals into frequency sub-bands. A set of statistical features (standard deviation, energy and entropy) from real database EEG recordings was then computed from each decomposition level to represent the time-frequency distribution of the wavelet coefficients. LDA and PCA are applied to these parameters, allowing a data reduction. These features were used as input to an efficient SVM classifier with two discrete outputs: normal person and epileptic subject. A measure of the performance of these methods is presented. The remainder of this paper is organized as follows: Section 2 describes the data set of EEG signals used in our work. In Section 3, preliminaries are presented for immediate reference. This is followed by the setup of our experiments and the results in Section 4. Finally, some concluding remarks are given in Section 5.

2. DATA SELECTION

We have used the EEG data taken from the artifact-free EEG time series database available at the Department of Epileptology, University of Bonn [23]. The complete dataset consists of five sets (denoted A-B-C-D-E). Each set contains 100 single-channel EEG signals of 23.6 s. The normal EEG data were obtained from five healthy volunteers who were in the relaxed awake state with their eyes open (set A). These signals were obtained from extra-cranial surface EEG recordings in accordance with a standardized electrode placement. Set E contains seizure activity, selected from all recording sites exhibiting ictal activity. All EEG signals were recorded with the same 128-channel amplifier system and digitized at a 173.61 Hz sampling rate; 12-bit analog-to-digital conversion and band-pass (0.53-40 Hz) filter settings were used. For a more detailed description, the reader can refer to [13]. In our study, we used set A and set E from the complete dataset.

Figure 1. The flow chart of the proposed system: raw EEG signal; feature extraction (energy, entropy and standard deviation from DWT and WPT decomposition coefficients); dimensionality reduction by LDA and PCA; classification and performance measure (healthy / epileptic).

3. METHODS

The proposed method consists of three main parts: (i) statistical feature extraction from DWT and from WPT decomposition coefficients, (ii) dimensionality reduction using PCA and LDA, and (iii) EEG classification using SVM. The flow chart of the proposed method is given in figure 1. Details of the pre-processing and classification steps are examined in the following subsections.

3.1 Analysis using DWT and WPT

Since the EEG is a highly non-stationary signal, the use of time-frequency domain methods has recently been recommended [14]. The wavelet transform can be used to decompose a signal into sub-bands with low frequency (approximation coefficients) and sub-bands with high frequency (detail coefficients) [15, 16, 17]. Under the discrete wavelet transform (DWT), only the approximation coefficients are decomposed iteratively by two filters and then down-sampled by 2. The first filter h[.] is a high-pass filter which is the mirror of the second, low-pass filter l[.]. The DWT gives a left-recursive binary tree structure. We processed 16 DWT coefficients. The wavelet packet transform (WPT) is an extension of the DWT that gives a more informative signal analysis. With the WPT, the lower as well as the higher frequency bands are decomposed, giving a balanced tree structure.
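As a rough illustration of the two decompositions just described, the sketch below builds a five-level DWT and a five-level WPT of one EEG segment. It is written in Python with the PyWavelets library rather than the MATLAB toolbox used by the authors, and the random signal is only a placeholder for a real 4096-sample recording from set A or set E.

```python
# Minimal sketch (not the authors' MATLAB code) of the DWT and WPT
# decompositions described above, using PyWavelets with the 'db2' wavelet
# and 5 levels as in the paper. The signal here is a stand-in.
import numpy as np
import pywt

fs = 173.61                      # sampling rate of the Bonn recordings (Hz)
x = np.random.randn(4096)        # placeholder for one EEG segment (set A or E)

# DWT: only the approximation branch is split further, giving one
# approximation plus five detail sub-bands at level 5.
dwt_coeffs = pywt.wavedec(x, 'db2', level=5)   # [cA5, cD5, cD4, cD3, cD2, cD1]

# WPT: both low- and high-frequency branches are split, giving a balanced
# tree with 2**5 = 32 terminal sub-bands at level 5.
wp = pywt.WaveletPacket(data=x, wavelet='db2', mode='symmetric', maxlevel=5)
wpt_nodes = wp.get_level(5, order='freq')      # nodes ordered by frequency band

print(len(dwt_coeffs), "DWT sub-bands,", len(wpt_nodes), "WPT sub-bands")
```

In the DWT only the approximation branch is refined, so level 5 yields six coefficient arrays, while the WPT refines every branch and yields the 32 terminal sub-bands whose frequency ranges are listed in Table 1 below.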
The wavelet packet transform generates a full decomposition tree, as shown in figure 2. In this work, we performed a five-level wavelet packet decomposition. The two wavelet packet orthogonal bases at a parent node (i, p) are obtained from the following recursive relationships, where l[n] and h[n] are the low-pass (scale) and high-pass (wavelet) filters, respectively, i is the index of the subspace depth and p is the number of the subspace [15]:

W_{i+1}^{2p}(t) = \sqrt{2} \sum_{n} l[n] \, W_i^{p}(2t - n)    (1)

W_{i+1}^{2p+1}(t) = \sqrt{2} \sum_{n} h[n] \, W_i^{p}(2t - n)    (2)

The wavelet packet coefficients corresponding to the signal x(t) can be obtained from

c_i^{p}(k) = \int x(t) \, W_i^{p}(t - k) \, dt    (3)

Figure 2. Third-level wavelet packet decomposition of an EEG signal: the root node (0,0) is split by the low-pass (l) and high-pass (h) filters into nodes (1,0) and (1,1), then into (2,0)-(2,3), and so on down to (3,0)-(3,7).

Table 1 gives the frequency bands for each level of WPT decomposition.

Table 1. Frequency band of each wavelet decomposition level (Hz).
Level 1: 0-86.8; 86.8-173.6
Level 2: 0-43.5; 43.5-86.8; 86.8-130.2; 130.2-173.6
Level 3: 0-21.75; 21.75-43.5; 43.5-54.375; 54.375-86.3; 86.3-108.05; 108.05-130.2; 130.2-151.95; 151.95-173.6
Level 4: 0-10.875; 10.875-21.75; 21.75-32.625; 32.625-43.5; 43.5-54.375; 54.375-65.25; 65.25-76.125; 76.125-87; 87-97.875; 97.875-108.75; 108.75-119.625; 119.625-130.5; 130.5-141.375; 141.375-152.25; 152.25-163.125; 163.125-173.6
Level 5: 0-5.44; 5.44-10.875; 10.875-16.31; 16.31-21.75; 21.75-27.19; 27.19-32.625; 32.625-38.06; 38.06-43.5; 43.5-48.94; 48.94-54.375; 54.375-59.81; 59.81-65.25; 65.25-70.69; 70.69-76.125; 76.125-81.56; 81.56-87; 87-92.44; 92.44-97.87; 97.87-103.3; 103.3-108.75; 108.75-114.19; 114.19-119.625; 119.625-125.06; 125.06-130.5; 130.5-135.94; 135.94-141.38; 141.38-146.81; 146.81-152.25; 152.25-157.69; 157.69-163.125; 163.125-168.56; 168.56-173.6

Figures 3 and 4 show the fifth-level wavelet packet decomposition of EEG segments, according to figure 2. We processed 32 WPT coefficients. In this study, three statistical parameters are therefore computed for each sub-band j with coefficients c_j(k), k = 1, ..., N_j: the energy feature (En), the Shannon entropy (Ent) and the standard deviation (Std):

En_j = \sum_{k} c_j(k)^2    (4)

Ent_j = -\sum_{k} c_j(k)^2 \log\!\left(c_j(k)^2\right)    (5)

Std_j = \sqrt{\frac{1}{N_j} \sum_{k} \left(c_j(k) - \mu_j\right)^2}, \quad \mu_j \text{ the mean of sub-band } j    (6)

3.2 Principal component analysis

To make the classifier system more effective, we use principal component analysis (PCA) for dimensionality reduction. The purpose of its implementation is to derive a small number of uncorrelated principal components from a larger set of zero-mean variables, retaining the maximum possible amount of information from the original data. Formally, the most common derivation of PCA is in terms of a standardized linear projection which maximizes the variance in the projected space [18, 19]. For a given p-dimensional data set X, the m principal axes W_1, ..., W_m, where 1 ≤ m ≤ p, are orthogonal axes onto which the retained variance is maximum in the projected space. Generally, W_1, ..., W_m are given by the m leading eigenvectors of the sample covariance matrix

S = \frac{1}{N} \sum_{i=1}^{N} (x_i - \mu)(x_i - \mu)^T,

where \mu is the sample mean and N is the number of samples, so that S W_i = \lambda_i W_i, where \lambda_i is the i-th largest eigenvalue of S. The m principal components of a given observation vector x_i are given by the reduced feature vector y_i = [W_1, ..., W_m]^T x_i.
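The short sketch below shows how the three statistical features and a PCA reduction of this kind might be computed in practice. It uses Python with PyWavelets and scikit-learn as a stand-in for the authors' MATLAB implementation; the `segments` array is a hypothetical (n_segments x 4096) matrix of EEG epochs, and the choice of 10 retained components is illustrative, not taken from the paper.

```python
# Hedged sketch: energy, Shannon-style entropy and standard deviation from the
# 32 level-5 WPT sub-bands (Eqs. 4-6 above), followed by a PCA reduction.
# `segments` is a hypothetical (n_segments, 4096) array of EEG epochs.
import numpy as np
import pywt
from sklearn.decomposition import PCA

def wpt_features(segment, wavelet="db2", level=5):
    wp = pywt.WaveletPacket(data=segment, wavelet=wavelet,
                            mode="symmetric", maxlevel=level)
    feats = []
    for node in wp.get_level(level, order="freq"):   # 32 sub-bands at level 5
        c = np.asarray(node.data)
        energy = np.sum(c ** 2)                      # Eq. (4)
        p = c ** 2 + 1e-12                           # small offset avoids log(0)
        entropy = -np.sum(p * np.log(p))             # Eq. (5)
        std = np.std(c)                              # Eq. (6)
        feats.extend([energy, entropy, std])
    return np.array(feats)

segments = np.random.randn(200, 4096)      # placeholder for the 200 segments of sets A and E
X = np.vstack([wpt_features(s) for s in segments])

pca = PCA(n_components=10)                 # the paper does not fix m; 10 is illustrative
X_reduced = pca.fit_transform(X)           # projection on the leading eigenvectors
print(X.shape, "->", X_reduced.shape)
```

An LDA reduction would be applied in the same way (scikit-learn's LinearDiscriminantAnalysis has the same fit_transform interface), using the class labels of the segments.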
3.3 Linear discriminant analysis

Linear discriminant analysis (LDA) projects high-dimensional data onto a low-dimensional space where the data can achieve maximum class separability [19]. The aim of LDA is to create a new variable that is a combination of the original predictors; that is, the derived features in LDA are linear combinations of the original variables, with the coefficients taken from a transformation matrix. LDA utilizes a transformation matrix W which maximizes the ratio of the between-class scatter matrix S_B to the within-class scatter matrix S_W, in order to transform the original feature vectors into a lower-dimensional feature space by a linear transformation. The linear function y = W^T x maximizes the Fisher criterion

J(W) = \frac{|W^T S_B W|}{|W^T S_W W|},

where, writing x_j^{(i)} for the j-th sample of the i-th of c classes, k for the dimension of the feature space, \mu_i for the mean of the i-th class and M_i for the number of samples within class i,

S_B = \sum_{i=1}^{c} M_i (\mu_i - \mu)(\mu_i - \mu)^T, \qquad S_W = \sum_{i=1}^{c} \sum_{j=1}^{M_i} (x_j^{(i)} - \mu_i)(x_j^{(i)} - \mu_i)^T,

and \mu is the mean of the entire data set. As a dimensionality reduction method, LDA has also been adopted in this work.

Figure 3. Fifth-level wavelet packet decomposition of a healthy EEG signal (set A).
Figure 4. Fifth-level wavelet packet decomposition of an epileptic EEG signal (set E).

3.4 SVM classifier

In this work, the SVM [20] has been employed as the learning algorithm due to its superior classification ability. Consider n examples S = {x_i, y_i}_{i=1}^{n}, y_i ∈ {-1, +1}, where the x_i are the input vectors and y_i is the class label. The decision hyperplane of the SVM is defined by a pair (w, b), where w is a weight vector and b a bias. The optimal hyperplane can be written as

w_0^T x + b_0 = 0,

where w_0 and b_0 denote the optimal values of the weight vector and bias. After training, a test vector x is classified by the decision function

f(x) = \mathrm{sign}(w_0^T x + b_0).

To find the optimum values of w and b, the following optimization problem must be solved:

\min_{w, b, \xi} \; \frac{1}{2}\|w\|^2 + C \sum_{i=1}^{n} \xi_i \quad \text{subject to} \quad y_i\left(w^T \phi(x_i) + b\right) \ge 1 - \xi_i, \;\; \xi_i \ge 0,

where the \xi_i are slack variables, C is the user-specified penalty parameter of the error term (C > 0), and \phi is the mapping associated with the kernel function [21]. A radial basis function (RBF) kernel, defined as

K(x_i, x_j) = \exp\!\left(-\frac{\|x_i - x_j\|^2}{2\sigma^2}\right),

was used, where \sigma is a kernel parameter defined by the user.

4. RESULTS AND DISCUSSION

Before we give the experimental results and discuss our observations, we present the three performance measures used to evaluate the proposed classification method: (i) sensitivity, represented by the true positive ratio, TPR = TP / (TP + FN); (ii) specificity, represented by the true negative ratio, TNR = TN / (TN + FP); and (iii) the average classification accuracy, defined as (TP + TN) / (TP + TN + FP + FN) (16), where TP, TN, FP and FN represent true positives, true negatives, false positives and false negatives, respectively.

All the experiments in this work were undertaken over 100 EEG time series segments of 4096 samples for each class (set A and set E). There were two diagnosis classes: normal person and epileptic patient. To estimate the reliability of the proposed model, we use the ten-fold cross-validation method. The data are split into ten parts such that each part contains approximately the same proportion of class samples as the full classification dataset. Nine parts (i.e. 90%) are used for training the classifier and the remaining part (i.e. 10%) for testing. This procedure is repeated ten times, using a different part for testing in each case. As illustrated in figures 3 and 4, feature vectors were computed from the coefficients of the EEG signals. Taking energy as the feature vector, figure 5 shows that the features of normal and epileptic EEG signals are mixed. The proposed wavelet analysis was carried out using MATLAB R2011b. In the literature, there is no common suggestion for selecting a particular wavelet. Therefore, a very important step before classifying EEG signals is to select an appropriate wavelet for the application. Five wavelet families, namely Daubechies, Coiflets, Biorthogonal, Symlets and the discrete Meyer wavelet, are therefore examined and compared in order to evaluate the performance of the various types of wavelets.
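The sketch below, again in Python with scikit-learn rather than the authors' MATLAB code, shows how an RBF-kernel SVM with stratified ten-fold cross-validation and the sensitivity, specificity and accuracy measures above could be wired together. The feature matrix is filled with random placeholders so the snippet runs on its own; in the actual pipeline it would be the LDA- or PCA-reduced feature matrix from the previous step, and the C and gamma values are illustrative, not the paper's tuned parameters.

```python
# Hedged sketch: RBF-kernel SVM evaluated with stratified 10-fold cross-
# validation, reporting accuracy, sensitivity (TPR) and specificity (TNR).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import confusion_matrix

X_reduced = np.random.randn(200, 10)       # placeholder for the reduced WPT features
y = np.array([0] * 100 + [1] * 100)        # 0 = normal (set A), 1 = epileptic (set E)

accs, sens, specs = [], [], []
skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for train_idx, test_idx in skf.split(X_reduced, y):
    clf = SVC(kernel="rbf", C=1.0, gamma="scale")   # illustrative hyper-parameters
    clf.fit(X_reduced[train_idx], y[train_idx])
    y_pred = clf.predict(X_reduced[test_idx])
    tn, fp, fn, tp = confusion_matrix(y[test_idx], y_pred, labels=[0, 1]).ravel()
    accs.append((tp + tn) / (tp + tn + fp + fn))    # average accuracy
    sens.append(tp / (tp + fn))                     # sensitivity / TPR
    specs.append(tn / (tn + fp))                    # specificity / TNR

print("accuracy %.3f  sensitivity %.3f  specificity %.3f"
      % (np.mean(accs), np.mean(sens), np.mean(specs)))
```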
Figure 6 shows the accuracy, sensitivity and specificity obtained with the different wavelets. We see that the wavelets giving the best correct rates are Db2, Db4, coif3 and Bior1.1. The choice of mother wavelet focuses on the Daubechies family, whose filter length is 2N, while the Coiflet wavelet filter has length 6N and the biorthogonal wavelet filter has length 2N + 2. After Db2 wavelet decomposition of the EEG signals and dimensionality reduction, the correct classification rates are shown in Table 2. The classification accuracy varies from the optimum value (100%) to a lowest value of 87%. The results using the standard deviation are the best obtained, and using entropy is better than using energy for EEG signal classification. In this study, the experimental results show that linear discriminant analysis based on wavelet packet decomposition improves classification, and the optimum SVM results are obtained by using the standard deviation feature computed from the wavelet packet coefficients together with the LDA reduction method. For this proposed scheme, the classification accuracy is 100%. This method presents a novel contribution and has not yet been presented in the literature.

Figure 7 shows the average classification rates (accuracy, sensitivity, specificity) obtained with the two decomposition methods (DWT or WPT), the two reduction methods (LDA or PCA) and the three characteristic features (standard deviation, energy, entropy) using the four best wavelets (Db2, Db4, coif3 and Bior1.1). We see that the combination of LDA with the standard deviation has an optimum average accuracy rate of 99.90%, while the combination of the standard deviation with PCA reaches 99.50%.

Figure 5. Energy feature vector: coefficient D3 versus D2 (adapted from [22]).

Table 3 gives a summary of the accuracy results obtained by other studies on the same dataset (set A and set E) using feature extraction from the EEG signal and classification.

Table 3. Epilepsy classification accuracies obtained in the literature from the same data sets.
Authors | Method | Accuracy (%)
[7] Subasi | DWT + mixture of experts | 94.50
[8] Polat and Günes | DWT + DFT + autoregressive model + decision tree | 99.32
[9] Subasi and Gursoy | DWT + PCA/LDA/ICA + SVM | 98.75 (PCA), 100 (LDA), 99.5 (ICA)
[12] Wang, Miao and Xie | WPT + entropy, hierarchical K-NN classification | 99.44
[14] Übeyli | Burg autoregressive + LS-SVM | 99.56
Our method | WPT + standard deviation + LDA + SVM | 100

5. CONCLUSION

In this paper, EEG signals were decomposed into time-frequency representations using the discrete wavelet transform and the wavelet packet transform, and statistical features were computed to represent their distribution. The most suitable mother wavelets for feature extraction and classification were found. The selection of a suitable mother wavelet and the use of reduction methods lead to an improvement in the performance of EEG signal classification. It has been shown experimentally that, for the SVM, the combination of the standard deviation with LDA has the highest correct classification rate, 100%, in comparison with the other techniques. The interest in expert systems for the detection and classification of epileptic EEG signals is expected to grow more and more in order to assist and strengthen the neurologist in numerous tasks, especially to reduce the effort required to reach a reliable classification. These promising results encourage us to continue our study in more depth and to apply it to other databases recorded for other diseases.
Friday, October 25, 2019
Hertzsprung-Russell Diagram :: essays research papers
The Hertzsprung-Russell Diagram, or H-R Diagram for short, is a graph that plots stars according to their temperature and absolute magnitude. This graph reveals a pattern which is in fact quite interesting. The H-R Diagram is named for the two astronomers, Ejnar Hertzsprung and Henry Russell, who discovered this pattern of stars. These two astronomers independently discovered that comparing the magnitudes and spectral classes (colors) of stars yielded a lot of information about them.

One key purpose of the H-R Diagram is to show the relationship between the temperature and the absolute magnitude of stars. Temperature is measured in kelvins, a scale whose zero point is equal to -273.15 C. On the H-R Diagram, the temperature ranges from 3,000 to 30,000 kelvins. The absolute magnitude of stars on the H-R Diagram ranges from +15 to -10. Absolute magnitude is how bright stars would appear if they were positioned 32.6 light years away from Earth. On this scale, the lower the number, the brighter the star. Thus, a star with an absolute magnitude of -10 would be much brighter than a star with an absolute magnitude of +15.

The two astronomers found many patterns after developing their graph. They found that 90% of the stars graphed fell within a band that runs through the middle of the graph. These stars range from cool, dim, red stars at the lower right of the H-R Diagram to hot, bright, blue stars at the upper left corner. The stars that fall into this band are known as main-sequence stars. Stars such as the Sun, and almost every star visible in the night sky, fall within this band of main-sequence stars.

There is another group of stars, cool yet bright, that appears near the upper right corner of the H-R Diagram. These stars are very large and therefore have very big surface areas. These large surface areas give off large amounts of light, and this makes the stars bright. Most of these stars are known as red giants. Some are so large, however, that they are referred to as supergiants. Red giants have a temperature of about 3,500 kelvins and an absolute magnitude of around 0. Supergiants have a temperature of around 3,000 kelvins and an absolute magnitude of about -7. Another group of stars, which are rather small, is found near the bottom left of the H-R Diagram.
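As a rough illustration of how such a diagram is constructed, the Python sketch below plots a handful of representative stars by temperature and absolute magnitude, with both axes oriented the way the essay describes: hotter stars to the left, brighter (more negative magnitude) stars toward the top. The sample values are illustrative round numbers, not catalog data.

```python
# Illustrative sketch of an H-R style plot: temperature (K) on a reversed
# x-axis and absolute magnitude on a reversed y-axis (lower magnitude = brighter).
# The star values below are rough, illustrative figures, not measured data.
import matplotlib.pyplot as plt

stars = {
    "Sun (main sequence)":     (5800, 4.8),
    "Hot blue main-sequence":  (25000, -5),
    "Cool red main-sequence":  (3200, 12),
    "Red giant":               (3500, 0),
    "Supergiant":              (3000, -7),
    "White dwarf":             (10000, 13),
}

fig, ax = plt.subplots()
for name, (temp_k, abs_mag) in stars.items():
    ax.scatter(temp_k, abs_mag)
    ax.annotate(name, (temp_k, abs_mag), fontsize=8)

ax.set_xlim(30000, 3000)        # temperature decreases to the right
ax.set_ylim(15, -10)            # brighter (more negative) magnitudes at the top
ax.set_xlabel("Temperature (K)")
ax.set_ylabel("Absolute magnitude")
ax.set_title("Toy Hertzsprung-Russell diagram")
plt.show()
```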
Thursday, October 24, 2019
The Methods of American Business in Early 20th Century
A little more than a hundred years earlier, the United States was an isolated country, and that attitude kept developing during the period. The Senate did not want to ratify the Versailles Peace Treaty that ended the First World War, and went so far that the country did not even join the League of Nations. The free migration into the country that had existed earlier was stopped, tariffs on imports were increased, and migration from Asia was practically halted. Yet the country had some special capabilities, and one of them was the regularity of its modes of production. American production turned out simpler, rougher goods and used much less skilled labor, since machines and organizations took over a lot of those responsibilities (Delong 1997). Thus some methods of American business had been developed even before the start of the twentieth century, and this may have given the country its lead.

It is difficult to ascribe to any particular reason the changes that took place in the American economy in the period before the Second World War. America was not one of the leaders of the Western world then, as can be seen from the fact that both World Wars were started by European countries and fought for quite some time by those countries, and America only entered the conflicts when it was felt that the tradition of democracy was about to be lost. Yet there were presidents like Hoover who felt that the decline of the economy was hurting American labor. This view had also been taken in 1917, when the government decided to nationalize the single largest American industry of the time, the railroads (Vedder 1997).

To a certain extent, politics and industrial change were related, and the clearest connection is found in the election of one of the most charismatic American presidents, Franklin D. Roosevelt. One reason for his victory in the 1936 election is said to be the support given to him by American labor. This encouraged the labor unions, under the CIO, to seek more power and even challenge the authority of one of the country's most powerful companies, General Motors. During this period, the company was one of the most profitable and probably the largest organization in the country, as the magazines of the time also recognized. The company had 110 manufacturing plants across the country, employed over 250,000 people and was owned by more than 500,000 shareholders. Yet the attitude of the labor unions irritated the management and made it hostile to both the unions and the New Deal. At the same time, the Second World War was in progress (Lichtenstein 2003). Thus, to an extent, the development of industry was being hindered by political ideas, but it is difficult to say who won. After Roosevelt passed away, attitudes changed and the unions ended up losing most of their power. That was also probably due to the conflict of the American system with the Russians, who had emerged as the most powerful country on the continent of Europe.

The growth of the country's industries was in industrial products, chief among them the automobile, followed closely by radios, consumer appliances and the development of suburbs. The situation can be appreciated from the fact that the country had enough vehicles on the road to claim more than one vehicle for every five people in the population.
This is an achievement many countries cannot claim even today. The reason for the country's development was mass production, which also made it the richest society in the world (Delong 1997). Though not all the inventions were made in the country, it made sure that large numbers were produced there, since it had both the capacity to produce and the purchasers for the goods. It would be wrong to say that only production for the War mattered; production had started growing earlier. The thought behind this development came from the major leaders of American business, such as Henry Ford, Thomas A. Edison, Edward Filene and Gerard Swope of General Electric, among others. The depression following the stock market crash of 1929 was hurting the people of the country, and the president, who was viewed as a successful businessman, was trying to persuade business to help the people of the country through more employment (Vedder 1997). Thus it is difficult to say that production for war had any major impact on the development of business before the War started. At the same time, once production capacities had been built up for the war, one of the major questions was how to utilize this capacity after the War. This problem had also been seen in 1929, when part of the reason for the crash was a drop in the demand for goods and services, apart from the rather insane growth of the stock market. It was solved through the Marshall Plan and other measures, which helped America keep producing against loans to be repaid by much poorer countries, some of whom were never able to repay.

The changes in the domestic economy over the century have not been remarkable, and the practices of American consumers have not changed much. There are many realistic impulses within the American consumer that are tied up with the idealism being promoted. They contain longings for freedom and self-fulfillment, as the country started with that dream; but, at the same time, many Americans found it difficult to pay for those dreams from their savings and yet required the items, which were felt to be the basis of their identity. This was not accepted in many of the religions the country started with, but even before the start of the twenty-first century it was estimated that the country's citizens carried eleven trillion dollars in private debt (Horowitz 2003). This was spread among different groups, such as the urban working class with loans from pawnbrokers, small-loan agencies and retailers selling goods on installment. Even for building houses, loans were taken from building and loan associations and had to be repaid over a period of five years. Despite the fact that this availability of loans helped individuals make purchases, many traditional social workers, economists, clergy, bankers, retailers and newspapers did not like them (Horowitz 2003). The tradition continues: people still keep taking loans through credit cards and many other channels, and houses are pledged repeatedly so that increases in prices can be taken advantage of.

Post WWI Business (New Industries): This is very difficult to pin down, as many industries are now broken up into small parts. On the other hand, in America individuals decided to concentrate on their private lives.
The effort was to separate their lives from others by building walls and more prestigious houses with lawns, and by owning a large number of machines: washing machines for clothes, refrigerators to store food, and stoves that could be devoted to individual dishes (Delong 1997). Thus one could say that the effort went more into showing off individuality than into development as a social group.

It is clear that after the Second World War the international position of the American economy was viewed differently by the powerful bureaucrats and politicians within the country. There was the history of two major wars, started by other countries, which were not resolved without the intervention of America, even though the entire course of those wars was fought on other continents. This helped America, as her countryside and people were not directly ravaged. The situation was seen clearly in Germany, which lost a great many of its able-bodied men, as did France. This led those countries to allow a large number of immigrants to come in, and some of the resulting problems are being seen in France today. Even during the balance of the twentieth century, when the Soviet Union collapsed, the suffering was felt more directly in Russia and other countries. The entire planning of America also reflected this attitude, and new organizations like the CIA were developed to deal with educationists and others (Arndt 2005). This clearly shows a feeling that mistakes in the area of political and social thought could be resolved through planning. Thus, when America got into wars like those in Vietnam, Korea and Iraq, the people and the culture of the country could not adjust to the losses for a long time. The realization that a war had been lost took time to sink in. At the same time, it should also be understood that for all these wars there have been allegations that business was interested in starting some of them.

As the country has changed, its policies have also changed, and one of the biggest trends now is toward very large corporations. This exists even in the field of health care, where there was a merger of the Hospital Corporation of America, which had the highest turnover, with American Hospital Supply, the biggest supplier of goods to the hospital industry. When this took place in 1985, it was the largest merger in America by organizations outside the oil industry (Time 1985). The point to consider here is that combining companies leads to an increase in profits so that additional benefits can be paid to shareholders, but at the same time it leads to an increase in costs for patients. Does it serve the average citizen of the country? Other difficulties are created by systems within the country, such as the requirement of governmental licenses, privileges, subsidies, laws and other such advantages. On top of that, there are laws which hinder free trade, like the anti-trust cases (Wright 2002). Yet the greed among businessmen does not seem to stop, and one of the famous cases concerned the stock speculator Ivan Boesky, who stated in 1985 that "greed is healthy" (James 2002). Are the many changes in the attitudes of businessmen in this regard correct?
Wednesday, October 23, 2019
The Silk Road: Recording the Journey
The Silk Road was a very interesting period in history. The Silk Road was the world's first superhighway: not literally a single road, it consisted of a network of trade routes connecting China with Central Asia and lands beyond, all the way to Rome. Goods were usually transported by large caravans made up of guides, soldiers, religious pilgrims, merchants and hundreds of freight-bearing camels. The Silk Road flourished for more than 3,000 years and had a major influence on the cultures of Asia, Europe and Africa. This Mini-Q asks you to become a traveler on the Silk Road and to record your experiences at different points on your journey. In this Mini-Q you will examine several documents and then write five journal entries in the voice of a fictitious person traveling the Silk Road. First, choose the type of traveler you wish to be from the collection of traveler descriptions on the following page, give your traveler a name from the list below, and fill out the profile of this person using the biographical information and your historical imagination.

Goods and ideas spread across the Silk Road for centuries; this process of sharing is called cultural diffusion by historians. Below are a few examples of goods and ideas that moved by way of the Silk Road: from China, silk, iron, bronze, silver, orange trees, paper and gunpowder; from Central Asia, Ferghana horses; from Africa, ivory and rhinoceros horn; from India, spices and Buddhism; from Europe, music and glassware.

For travelers heading west, the oasis town of Dunhuang was a place to rest and resupply before braving the western Gobi and the Taklamakan deserts. Soon after the fall of the Han dynasty, Buddhist monks began to dig caves just ten miles outside of Dunhuang, and in many of the caves they built Buddhist shrines. Over the centuries these caves also became storage vaults for many items brought to Dunhuang by Silk Road travelers. In ancient times the Taklamakan desert was sometimes referred to as the Gobi; today, as then, the temperature in the desert reaches over 100 °F and rainfall is minimal.