Search results for: statistical features
6793 Visual Thing Recognition with Binary Scale-Invariant Feature Transform and Support Vector Machine Classifiers Using Color Information
Authors: Wei-Jong Yang, Wei-Hau Du, Pau-Choo Chang, Jar-Ferr Yang, Pi-Hsia Hung
Abstract:
The demand for smart visual thing recognition across devices has increased rapidly in recent years for smart production, living, and learning systems. This paper proposes a visual thing recognition system that combines the binary scale-invariant feature transform (SIFT), a bag-of-words (BoW) model, and support vector machine (SVM) classifiers using color information. Since traditional SIFT features and SVM classifiers use only grayscale information, color remains an important feature for visual thing recognition. With color-based SIFT features and SVM, we can discard unreliable matching pairs and increase the robustness of matching tasks. The experimental results show that the proposed object recognition system with the color-assisted SIFT-SVM classifier achieves a higher recognition rate than the traditional gray SIFT and SVM classification in various situations.
Keywords: color moments, visual thing recognition system, SIFT, color SIFT
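The following sketch illustrates the kind of color SIFT + BoW + SVM pipeline the abstract describes, using OpenCV's standard (non-binary) SIFT as a stand-in; the per-channel descriptor strategy, vocabulary size, and data layout are assumptions, not the authors' implementation.

```python
# Hypothetical color-SIFT + bag-of-words + SVM pipeline (illustrative, not the paper's code).
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

sift = cv2.SIFT_create()  # assumes opencv-python >= 4.4

def color_sift_descriptors(image_bgr):
    """Concatenate SIFT descriptors computed on each color channel."""
    per_channel = []
    for ch in cv2.split(image_bgr):                     # B, G, R channels
        _, desc = sift.detectAndCompute(ch, None)
        if desc is not None:
            per_channel.append(desc)
    return np.vstack(per_channel) if per_channel else np.empty((0, 128))

def bow_histogram(descriptors, vocabulary):
    """Quantize descriptors against the visual vocabulary and return a normalized histogram."""
    words = vocabulary.predict(descriptors)
    hist, _ = np.histogram(words, bins=np.arange(vocabulary.n_clusters + 1))
    return hist / max(hist.sum(), 1)

def train(images, labels, vocab_size=200):
    """images: list of BGR arrays; labels: class ids (placeholders)."""
    all_desc = np.vstack([color_sift_descriptors(im) for im in images])
    vocabulary = KMeans(n_clusters=vocab_size, n_init=4).fit(all_desc)
    X = np.array([bow_histogram(color_sift_descriptors(im), vocabulary) for im in images])
    return vocabulary, SVC(kernel="rbf").fit(X, labels)
```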
Procedia PDF Downloads 472
6792 Surface Quality Improvement of Abrasive Waterjet Cutting for Spacecraft Structure
Authors: Tarek M. Ahmed, Ahmed S. El Mesalamy, Amro M. Youssef, Tawfik T. El Midany
Abstract:
Abrasive waterjet (AWJ) machining is considered one of the most powerful cutting processes. It can be used for cutting heat-sensitive, hard, and reflective materials. Aluminum 2024 is a high-strength alloy that is widely used in the aerospace and aviation industries. This paper aims to improve the quality of AWJ cutting of this aluminum alloy and to investigate the effect of AWJ control parameters on surface geometry quality. Design of experiments (DoE) is used for establishing an experimental matrix. Statistical modeling is used to present a relation between the cutting parameters (pressure, speed, and distance between the nozzle and cut surface) and the responses (taper angle and surface roughness). The results revealed a tangible improvement in productivity by using AWJ processing. The kerf taper angle can be improved by decreasing the standoff distance and speed and increasing the water pressure, while decreasing the cutting speed, pressure, and distance between the nozzle and cut surface improves the surface roughness within the operating window of cutting parameters.
Keywords: abrasive waterjet machining, machining of aluminum alloy, non-traditional cutting, statistical modeling
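A minimal sketch of the kind of statistical model such a DoE could feed, regressing the two responses on the three cutting parameters; the run matrix and values below are illustrative placeholders, not the study's measurements.

```python
# Illustrative regression of AWJ responses on cutting parameters (hypothetical data layout).
import pandas as pd
import statsmodels.api as sm

# One row per DoE run; numbers are placeholders, not the study's measurements.
runs = pd.DataFrame({
    "pressure_MPa":  [200, 200, 300, 300, 400, 400],
    "speed_mm_min":  [100, 300, 100, 300, 100, 300],
    "standoff_mm":   [2, 4, 2, 4, 2, 4],
    "taper_deg":     [1.8, 2.6, 1.3, 2.1, 0.9, 1.7],
    "roughness_um":  [4.2, 3.5, 4.9, 4.1, 5.6, 4.8],
})

X = sm.add_constant(runs[["pressure_MPa", "speed_mm_min", "standoff_mm"]])
for response in ("taper_deg", "roughness_um"):
    fit = sm.OLS(runs[response], X).fit()
    # Coefficient signs indicate which parameter settings improve each response.
    print(response, fit.params.round(4).to_dict())
```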
Procedia PDF Downloads 250
6791 Prevalence of Breast Cancer Molecular Subtypes at a Tertiary Cancer Institute
Authors: Nahush Modak, Meena Pangarkar, Anand Pathak, Ankita Tamhane
Abstract:
Background: Breast cancer is a prominent cause of cancer and mortality among women. This study presents the statistical analysis of a cohort of over 250 patients with breast cancer diagnosed by oncologists using immunohistochemistry (IHC). IHC was performed using ER, PR, HER2, and Ki-67 antibodies. Materials and methods: Formalin-fixed, paraffin-embedded tissue samples were obtained surgically, and the standard protocol was followed for fixation, grossing, tissue processing, embedding, cutting, and IHC. The Ventana Benchmark XT machine was used for automated IHC of the samples. Antibodies were supplied by F. Hoffmann-La Roche Ltd. Statistical analysis was performed using SPSS for Windows; the tests performed were the chi-squared test and correlation tests at p < .01. The raw data were collected and provided by the National Cancer Institute, Jamtha, India. Result: Luminal B was the most prevalent molecular subtype of breast cancer at our institute; a chi-squared test of homogeneity was performed to test equality of distribution. A worse prognosis in breast cancer depends upon the expression of Ki-67 and HER2 protein in cancerous cells, and our analysis at p < .01 showed significant dependence. There is no dependence of molecular subtype on age, and age is likewise an independent variable with respect to Ki-67 expression. A chi-squared test performed on patients' HER2 statuses showed strong dependence between the percentage of Ki-67 expression and HER2 (+/-) status, which indicates that the Ki-67 value depends upon HER2 expression in cancerous cells (p < .01). Surprisingly, dependence was also observed between Ki-67 and PR at p < .01, which suggests that progesterone receptor (PR) proteins are over-expressed when there is an elevation in Ki-67 expression. Conclusion: We conclude that Luminal B is the most prevalent molecular subtype at the National Cancer Institute, Jamtha, India. No significant correlation was found between age and Ki-67 expression in any molecular subtype, and no dependence or correlation exists between patients' age and molecular subtype. We also found that, in the cohort of 257 patients, no patient diagnosed with Luminal A showed a Ki-67 value >14%. Statistically, extremely significant values were observed for the dependence of PR+HER2- and PR-HER2+ scores on Ki-67 expression (p < .01). HER2 is an important prognostic factor in breast cancer; the chi-squared test for HER2 and Ki-67 shows that Ki-67 expression depends upon HER2 status. Moreover, Ki-67 cannot be used as a standalone prognostic factor for determining breast cancer.
Keywords: breast cancer molecular subtypes, correlation, immunohistochemistry, Ki-67 and HR, statistical analysis
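As a concrete illustration of the chi-squared tests of dependence reported above, the snippet below tests whether Ki-67 level (dichotomized at 14%) is independent of HER2 status; the contingency counts are placeholders, not the institute's data.

```python
# Illustrative chi-squared test of independence between HER2 status and Ki-67 level.
import numpy as np
from scipy.stats import chi2_contingency

#                  Ki-67 <= 14%   Ki-67 > 14%   (placeholder counts)
table = np.array([[60,            35],          # HER2-negative
                  [20,            70]])         # HER2-positive

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
if p < 0.01:
    print("Reject independence: Ki-67 level depends on HER2 status at the 1% level.")
```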
Procedia PDF Downloads 123
6790 Fuzzy-Machine Learning Models for the Prediction of Fire Outbreak: A Comparative Analysis
Authors: Uduak Umoh, Imo Eyoh, Emmauel Nyoho
Abstract:
This paper compares fuzzy-machine learning algorithms such as Support Vector Machine (SVM) and K-Nearest Neighbor (KNN) for predicting cases of fire outbreak. The paper uses a fire outbreak dataset with three features (temperature, smoke, and flame). The data are pre-processed using the Interval Type-2 Fuzzy Logic (IT2FL) algorithm, Min-Max normalization, and Principal Component Analysis (PCA), which are used to predict feature labels, normalize the dataset, and select relevant features, respectively. The output of the pre-processing is a dataset with two principal components (PC1 and PC2). The pre-processed dataset is then used in the training of the aforementioned machine learning models. The k-fold (with k = 10) cross-validation method is used to evaluate the performance of the models using the metrics ROC (Receiver Operating Characteristic), specificity, and sensitivity. The model is also tested with 20% of the dataset. The validation result shows that KNN is the better model for fire outbreak detection, with an ROC value of 0.99878, followed by SVM with an ROC value of 0.99753.
Keywords: machine learning algorithms, Interval Type-2 Fuzzy Logic, fire outbreak, Support Vector Machine, K-Nearest Neighbour, Principal Component Analysis
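A brief sketch of the normalization, PCA, and 10-fold cross-validated ROC comparison described above; the synthetic temperature/smoke/flame matrix stands in for the real dataset, and the IT2FL labeling step is not reproduced here.

```python
# Sketch of the PCA -> classifier comparison with 10-fold cross-validated ROC AUC.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# X: rows of (temperature, smoke, flame) readings; y: fire / no-fire labels (placeholders).
rng = np.random.default_rng(0)
X = rng.random((200, 3))
y = (X @ np.array([0.5, 0.3, 0.2]) > 0.5).astype(int)

models = {"KNN": KNeighborsClassifier(n_neighbors=5), "SVM": SVC(probability=True)}
for name, clf in models.items():
    # Keep two principal components (PC1, PC2) as in the described pre-processing.
    pipe = make_pipeline(MinMaxScaler(), PCA(n_components=2), clf)
    auc = cross_val_score(pipe, X, y, cv=10, scoring="roc_auc").mean()
    print(f"{name}: mean 10-fold ROC AUC = {auc:.3f}")
```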
Procedia PDF Downloads 185
6789 Characterization of Climatic Drought in the Saiss Plateau (Morocco) Using Statistical Indices
Authors: Abdeghani Qadem
Abstract:
Climate change is now an undeniable reality with increasing impacts on water systems worldwide, especially through severe drought episodes. The Southern Mediterranean region is particularly affected by drought, which can have devastating consequences for water resources. Morocco, due to its geographical location in North Africa and the Southern Mediterranean, is especially vulnerable to these effects of climate change, particularly drought. In this context, this article focuses on the study of climate variability and drought characteristics in the Saiss Plateau region and its adjacent areas of the Middle Atlas, using specific statistical indices. The study begins by analyzing the annual precipitation variation, with a particular emphasis on data homogenization and gap filling using a regional vector. The analysis then delves into drought episodes in the region, using the Standardized Precipitation Index (SPI) over a 12-month period. The central objective is to accurately assess significant drought changes between 1980 and 2015, based on data collected from nine meteorological stations located in the study area.
Keywords: climate variability, regional vector, drought, standardized precipitation index, Saiss Plateau, Middle Atlas
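For readers unfamiliar with the index, a rough sketch of an SPI-12 computation is shown below: 12-month precipitation totals are fitted with a gamma distribution and mapped to standard normal deviates. The synthetic series and the simple gamma fit are assumptions; station handling, homogenization, and gap filling are not reproduced.

```python
# Rough sketch of a 12-month Standardized Precipitation Index (SPI-12) computation.
import numpy as np
from scipy import stats

def spi_12(monthly_precip):
    """monthly_precip: 1-D array of monthly totals; returns SPI on a 12-month accumulation."""
    acc = np.convolve(monthly_precip, np.ones(12), mode="valid")   # rolling 12-month sums
    shape, loc, scale = stats.gamma.fit(acc[acc > 0], floc=0)      # fit gamma to accumulations
    cdf = stats.gamma.cdf(acc, shape, loc=loc, scale=scale)
    q = np.mean(acc == 0)                                          # probability of a zero total
    return stats.norm.ppf(q + (1 - q) * cdf)                       # map to standard normal deviates

# Synthetic monthly series standing in for a 1980-2015 station record.
rng = np.random.default_rng(1)
precip = rng.gamma(shape=2.0, scale=30.0, size=12 * 36)
print(np.round(spi_12(precip)[:12], 2))   # SPI < -1 is commonly read as moderate drought or worse
```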
Procedia PDF Downloads 69
6788 Impact of Instagram Food Bloggers on Consumer (Generation Z) Decision Making Process in Islamabad, Pakistan
Authors: Tabinda Sadiq, Tehmina Ashfaq Qazi, Hoor Shumail
Abstract:
Recently, the advent of emerging technology has created a new generation of restaurant marketing. This study explores the aspects that influence customers' decision-making process in selecting a restaurant after reading food bloggers' reviews online. The motivation behind this research is to investigate the correlation between the credibility of the source and consumers' attitude toward restaurant visits. The researcher collected the data by distributing a survey questionnaire through Google Forms, employing source credibility theory. A non-probability purposive sampling technique was used to collect data. The questionnaire used a pre-developed and validated scale by Ohanian to measure the relationship. The researcher collected data from 250 respondents in order to investigate the influence of food bloggers on Generation Z's decision-making process. SPSS version 26 was used for statistical testing and analysis of the data. The findings of the survey revealed that there is a moderate positive correlation between the variables, so it can be concluded that food bloggers do have an impact on Generation Z's decision-making process.
Keywords: credibility, decision making, food bloggers, generation z, e-wom
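The correlation test described above amounts to a Pearson correlation between two survey scores; a minimal sketch with synthetic scores (not the study's data) is given below.

```python
# Minimal sketch of the source-credibility vs. attitude correlation (synthetic scores).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
credibility = rng.normal(3.5, 0.8, 250)                   # hypothetical Ohanian-scale composites
attitude = 0.5 * credibility + rng.normal(0, 0.9, 250)    # hypothetical attitude-toward-visit scores

r, p = pearsonr(credibility, attitude)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")   # r around 0.3-0.5 reads as a moderate positive correlation
```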
Procedia PDF Downloads 74
6787 Histopathological Features of Infections Caused by Fusarium equiseti (Mart.) Sacc. in Onion Plants from Kebbi State, Northern Nigeria
Authors: Wadzani Dauda Palnam, Alao S. Emmanuel Laykay, Afiniki Bawa Zarafi, Olufunmilola Alabi, Dora N. Iortsuun
Abstract:
Onion production is affected by several diseases, including fusariosis. A study was conducted to investigate the histopathological features of different onion tissues infected with Fusarium equiseti by inoculation with soil drench, root dip, and mycelial paste methods. This was carried out by fixation, dehydration, clearing, wax embedding, sectioning, staining, and mounting of leaf and root sections for microscopic examination at 400x. Once infection occurred in the roots, the pathogen moved through the vascular system to colonize the whole plant. At first, it grew in the intercellular spaces of the root cortex but soon invaded the cells, followed by colonization of the cells by its hyphae and microconidia. At later stages of infection, the cortex tissue became completely disorganized and decomposed as the pathogen advanced to the shoot system via the vessel elements; this may be responsible for the early wilting symptom of infected plants, arising from severe water stress due to blockage of the xylem tissues.
Keywords: onion, histopathology, infection, fusaria, inoculation
Procedia PDF Downloads 279
6786 Recent Developments in the Application of Deep Learning to Stock Market Prediction
Authors: Shraddha Jain Sharma, Ratnalata Gupta
Abstract:
Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy than traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area with its prominent features, and the significant problems or issue domains that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to predict correctly based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high-accuracy prediction, with a focus on the most important characteristics.
Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume
Procedia PDF Downloads 93
6785 An Investigation of Surface Water Quality in an Industrial Area Using Integrated Approaches
Authors: Priti Saha, Biswajit Paul
Abstract:
Rapid urbanization and industrialization have increased the pollution load in surface water bodies. However, these water bodies are a major source of water for drinking, irrigation, industrial activities, and fishery. Therefore, water quality assessment is of paramount importance in evaluating suitability for all these purposes. This study evaluates the surface water quality of an industrial city in eastern India by integrating interdisciplinary techniques. The multi-purpose Water Quality Index (WQI) assesses the suitability of forty-eight sampling locations for drinking, irrigation, and fishery: 8.33% have excellent water quality (WQI: 0-25) for fishery, while 10.42%, 20.83%, and 45.83% have good quality (WQI: 25-50) for drinking, irrigation, and fishery, respectively. The industrial water quality was assessed through the Ryznar Stability Index (RSI), which showed that only 6.25% of sampling locations are neither corrosive nor scale-forming (RSI: 6.2-6.8). Integration of these statistical analyses with a geographic information system (GIS) supports spatial assessment and identifies the regions where the water quality is suitable for use in drinking, irrigation, fishery, and industrial activities. This research demonstrates the effectiveness of statistical and GIS techniques for water quality assessment.
Keywords: surface water, water quality assessment, water quality index, spatial assessment
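A hedged sketch of the two indices is shown below, using one common weighted-arithmetic WQI formulation and the standard RSI expression; the parameters, permissible limits, and pH values are placeholders rather than the study's standards.

```python
# Hedged sketch of a weighted-arithmetic WQI and the Ryznar Stability Index (RSI).

def weighted_arithmetic_wqi(measured, standards, ideal=None):
    """measured/standards: dicts of parameter -> value; lower WQI reads as better quality."""
    ideal = ideal or {p: 0.0 for p in measured}           # ideal value, taken as 0 for most parameters
    weights = {p: 1.0 / standards[p] for p in measured}   # weight inversely proportional to the limit
    num = sum(weights[p] * 100.0 * (measured[p] - ideal[p]) / (standards[p] - ideal[p])
              for p in measured)
    return num / sum(weights.values())

def ryznar_index(ph_measured, ph_saturation):
    """RSI = 2*pHs - pH; roughly 6.2-6.8 is read as neither corrosive nor scale-forming."""
    return 2.0 * ph_saturation - ph_measured

sample = {"TDS_mg_L": 420.0, "chloride_mg_L": 180.0, "nitrate_mg_L": 32.0}   # placeholder values
limits = {"TDS_mg_L": 500.0, "chloride_mg_L": 250.0, "nitrate_mg_L": 45.0}   # placeholder limits
print(f"WQI = {weighted_arithmetic_wqi(sample, limits):.1f}")
print(f"RSI = {ryznar_index(ph_measured=7.6, ph_saturation=7.1):.2f}")
```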
Procedia PDF Downloads 182
6784 An Appraisal of Maintenance Management Practices in Federal University Dutse and Jigawa State Polytechnic Dutse, Nigeria
Authors: Aminu Mubarak Sadis
Abstract:
This study appraised the maintenance management practices in Federal University Dutse and Jigawa State Polytechnic Dutse, Nigeria. The Physical Planning, Works and Maintenance Departments of the two higher institutions are responsible for the production and maintenance management of their physical assets. Over-enrollment has been a common problem in higher institutions in Nigeria. Data were collected through administered questionnaires and subsequent oral interviews to authenticate the completed questionnaires. A random sampling technique was used in selecting 150 respondents across the two institutions. The data collected were analyzed using the Statistical Package for Social Science (SPSS) and t-test statistical techniques. The conclusion was that maintenance management activities are yet to be given appropriate attention in the functions of the university and polytechnic, which are crucial to improving teaching, learning, and research. The unit responsible for maintenance and facility management should focus on its stated functions and effect changes where possible.
Keywords: appraisal, maintenance management, university, polytechnic, practices
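The t-test comparison referred to above can be sketched as an independent-samples test between the two institutions' questionnaire scores; the Likert-style scores below are synthetic placeholders, not the survey data.

```python
# Illustrative independent-samples t-test comparing maintenance-practice ratings.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(3)
university_scores = rng.normal(3.2, 0.7, 80)     # hypothetical mean ratings, Federal University Dutse
polytechnic_scores = rng.normal(2.9, 0.7, 70)    # hypothetical mean ratings, Jigawa State Polytechnic

t, p = ttest_ind(university_scores, polytechnic_scores, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.3f}")
```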
Procedia PDF Downloads 253
6783 Automated Heart Sound Classification from Unsegmented Phonocardiogram Signals Using Time Frequency Features
Authors: Nadia Masood Khan, Muhammad Salman Khan, Gul Muhammad Khan
Abstract:
Cardiologists perform cardiac auscultation to detect abnormalities in heart sounds. Since accurate auscultation is a crucial first step in screening patients with heart diseases, there is a need to develop computer-aided detection/diagnosis (CAD) systems to assist cardiologists in interpreting heart sounds and provide second opinions. In this paper, different algorithms are implemented for automated heart sound classification using unsegmented phonocardiogram (PCG) signals. Support vector machine (SVM), artificial neural network (ANN), and cartesian genetic programming evolved artificial neural network (CGPANN) are explored in this study without the application of any segmentation algorithm. The signals are first pre-processed to remove any unwanted frequencies. Both time- and frequency-domain features are then extracted for training the different models. The different algorithms are tested in multiple scenarios, and their strengths and weaknesses are discussed. Results indicate that SVM outperforms the rest with an accuracy of 73.64%.
Keywords: pattern recognition, machine learning, computer aided diagnosis, heart sound classification, feature extraction
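A sketch of extracting simple time- and frequency-domain features from an unsegmented PCG recording and feeding them to an SVM is shown below; the sampling rate, filter band, and chosen features are assumptions rather than the paper's exact pipeline.

```python
# Sketch of time/frequency features from unsegmented PCG signals feeding an SVM.
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.svm import SVC

FS = 2000  # assumed sampling rate in Hz

def pcg_features(signal):
    b, a = butter(4, [25 / (FS / 2), 400 / (FS / 2)], btype="band")  # keep a typical heart-sound band
    x = filtfilt(b, a, signal)
    freqs, psd = welch(x, fs=FS, nperseg=1024)
    return np.array([
        x.mean(), x.std(),                  # time-domain statistics
        np.mean(np.abs(np.diff(x))),        # average absolute slope
        freqs[np.argmax(psd)],              # dominant frequency
        psd.sum(),                          # total spectral power
    ])

def train(recordings, labels):
    """recordings: list of 1-D arrays; labels: 0 = normal, 1 = abnormal (placeholders)."""
    X = np.vstack([pcg_features(r) for r in recordings])
    return SVC(kernel="rbf").fit(X, labels)
```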
Procedia PDF Downloads 264
6782 The Concept of Neurostatistics as a Neuroscience
Authors: Igwenagu Chinelo Mercy
Abstract:
This study is on the concept of Neurostatistics in relation to neuroscience. Neuroscience, also known as neurobiology, is the scientific study of the nervous system. In the study of neuroscience, it has been noted that brain function and its relation to the processes of acquiring knowledge and behaviour can be better explained by the use of various interrelated methods. The scope of neuroscience has broadened over time to include different approaches used to study the nervous system at different scales. In this study, Neurostatistics is viewed as a statistical concept that uses techniques analogous to neuron mechanisms to solve problems, especially in the field of life science. This study is imperative in this era of artificial intelligence and machine learning in the sense that a clear understanding of the technique and its proper application could assist in addressing medical disorders that are mainly associated with the nervous system. It will also help the layman understand the workings of the nervous system in order to overcome some of the health challenges associated with it. For this concept to be well understood, an illustrative example using a brain-associated disorder was used for demonstration. Structural equation modelling was adopted in the analysis. The results clearly show the link between the techniques of statistical modelling and the nervous system. Hence, based on this study, the appropriate application of Neurostatistics in relation to neuroscience rests on understanding the behavioural pattern of both concepts.
Keywords: brain, neurons, neuroscience, neurostatistics, structural equation modeling
Procedia PDF Downloads 72
6781 Climate Trends, Variability, and Impacts of El Niño-Southern Oscillation on Rainfall Amount in Ethiopia
Authors: Zerihun Yohannes Amare, Belayneh Birku Geremew, Nigatu Melise Kebede, Sisaynew Getahun Amera
Abstract:
In Ethiopia, agricultural production is predominantly rainfed. The El Niño Southern Oscillation (ENSO) is the driver of climate variability that affects the agricultural production system in the country. This paper aims to study trends and variability of rainfall and the impacts of ENSO on rainfall amount. The study was carried out in Ethiopia's Western Amhara National Regional State, which features a variety of the seasons that characterize the nation. Monthly rainfall data were collected from fifteen meteorological stations of Western Amhara. Selected El Niño and La Niña years were also extracted from the National Oceanic and Atmospheric Administration (NOAA) records for 1986 to 2015. Once the data quality was checked and inspected, the monthly rainfall data of the selected stations were arranged in a Microsoft Excel spreadsheet and analyzed using XLSTAT software. The coefficient of variation and the Mann-Kendall non-parametric statistical test were employed to analyze trends and variability of rainfall and temperature. The long-term recorded annual rainfall data indicated a statistically insignificant increasing trend from 1986 to 2015. Rainfall variability was low (coefficient of variation, CV = 8.6%); also, the mean monthly rainfall of Western Amhara decreased during El Niño years and increased during La Niña years, especially in the rainy season (JJAS), over the 30 years. This finding will be useful for suggesting possible adaptation strategies and efficient use of resources during planning and implementation.
Keywords: rainfall, Mann-Kendall test, El Niño, La Niña, Western Amhara, Ethiopia
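The two statistics named above are simple to compute; the sketch below implements the Mann-Kendall trend test (without tie correction) and the coefficient of variation on a synthetic annual rainfall series standing in for the 1986-2015 station records.

```python
# Sketch of the Mann-Kendall trend test and coefficient of variation on annual rainfall.
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0               # variance of S, ignoring ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))                         # two-sided p-value
    return s, z, p

rng = np.random.default_rng(5)
annual_rain = 1200 + 1.5 * np.arange(30) + rng.normal(0, 110, 30)  # placeholder totals (mm), 30 years

cv = 100 * annual_rain.std(ddof=1) / annual_rain.mean()
s, z, p = mann_kendall(annual_rain)
print(f"CV = {cv:.1f}% | Mann-Kendall S = {s:.0f}, Z = {z:.2f}, p = {p:.3f}")
```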
Procedia PDF Downloads 98
6780 Development of a Robust Procedure for Generating Structural Models of Calcium Aluminosilicate Glass Surfaces
Authors: S. Perera, T. R. Walsh, M. Solvang
Abstract:
The structure-property relationships of calcium aluminosilicate (CAS) glass surfaces are of scientific and technological interest regarding dissolution phenomena. Molecular dynamics (MD) simulations can provide atomic-scale insights into the structure and properties of the CAS interfaces in vacuo as the first step to conducting computational dissolution studies on CAS surfaces. However, one limitation to date is that although the bulk properties of CAS glasses have been well studied by MD simulation, corresponding efforts on CAS surface properties are relatively few in number (both theoretical and experimental). Here, a systematic computational protocol to create CAS surfaces in vacuo is developed by evaluating the sensitivity of the resultant surface structure with respect to different factors. Factors such as the relative thickness of the surface layer, the relative thickness of the bulk region, the cooling rate, and the annealing schedule (time and temperature) are explored. Structural features such as ring size distribution, defect concentrations (five-coordinated aluminium (AlV), non-bridging oxygen (NBO), and tri-cluster oxygen (TBO)), and linkage distribution are identified as significant features in dissolution studies.
Keywords: MD simulation, CAS glasses, surface structure, structure-property, CAS interface
Procedia PDF Downloads 99
6779 The Trend of Injuries in Building Fire in Tehran from 2002 to 2012
Authors: Mohammadreza Ashouri, Majid Bayatian
Abstract:
Analysis of fire data is a basis for the implementation of any plan to improve the level of safety in cities. Such an analysis can reveal signs of change in a given period and can be used as a measure of safety. Information on about 66,341 fires (from 2002 to 2012) released by the Tehran Safety Services and Fire-Fighting Organization, together with data on population and number of households provided by Tehran Municipality and the Statistical Yearbook of Iran, was extracted. Using these data, the changes in fires, the rate of injuries, and the mortality rate were determined and analyzed. The injury and mortality rates of fires were 59.58 and 86.12 per one million population of Tehran, respectively. During the study period, the number of fires and fire stations increased by 104.38% and 102.63%, respectively. Most fires (9.21%) happened in the 4th District of Tehran. The results showed that fire data recording has not been systematically planned for fire prevention, whereas one of the ways to reduce injuries caused by fires is to develop a systematic plan for the necessary actions in emergency situations. To establish a reliable basis for fire prevention, the stages and definitions of working processes and the cause-and-effect chains should be considered. Therefore, a comprehensive statistical system should be developed for reported and recorded fire data.
Keywords: fire statistics, fire analysis, accident prevention, Tehran
Procedia PDF Downloads 185
6778 Faster Pedestrian Recognition Using Deformable Part Models
Authors: Alessandro Preziosi, Antonio Prioletti, Luca Castangia
Abstract:
Deformable part models (DPMs) achieve high precision in pedestrian recognition, but all publicly available implementations are too slow for real-time applications. We implemented a deformable part model algorithm fast enough for real-time use by exploiting information about the camera position and orientation. This implementation is both faster and more precise than alternative DPM implementations. These results are obtained by computing convolutions in the frequency domain and using lookup tables to speed up feature computation. This approach is almost an order of magnitude faster than the reference DPM implementation, with no loss in precision. Knowing the position of the camera with respect to the horizon, it is also possible to prune many hypotheses based on their size and location. The range of acceptable sizes and positions is set by looking at the statistical distribution of bounding boxes in labelled images. With this approach, it is not necessary to compute the entire feature pyramid: for example, higher-resolution features are only needed near the horizon. This results in an increase in mean average precision of 5% and an increase in speed by a factor of two. Furthermore, to reduce misdetections involving small pedestrians near the horizon, input images are supersampled near the horizon. Supersampling the image at 1.5 times the original scale results in an increase in precision of about 4%. The implementation was tested against the public KITTI dataset, obtaining an 8% improvement in mean average precision over the best-performing DPM-based method. By allowing for a small loss in precision, computational time can easily be brought down to our target of 100 ms per image, reaching a solution that is faster and still more precise than all publicly available DPM implementations.
Keywords: autonomous vehicles, deformable part model, dpm, pedestrian detection, real time
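The frequency-domain convolution trick mentioned above can be sketched in a few lines: correlating a feature-map channel with a part filter via the FFT instead of a sliding window. The feature map and filter here are random placeholders, not HOG features from the actual pipeline.

```python
# Sketch of scoring a feature-map channel against a part filter in the frequency domain.
import numpy as np

def fft_correlate(feature_map, filt):
    """Cross-correlate a 2-D feature map with a filter via FFT; returns valid-placement scores."""
    H, W = feature_map.shape
    h, w = filt.shape
    F = np.fft.rfft2(feature_map, s=(H, W))
    G = np.fft.rfft2(filt[::-1, ::-1], s=(H, W))   # flipped kernel: correlation via convolution
    full = np.fft.irfft2(F * G, s=(H, W))
    return full[h - 1:, w - 1:]                    # keep only fully overlapping placements

rng = np.random.default_rng(0)
feature_map = rng.standard_normal((120, 160))      # one channel of one pyramid level (placeholder)
part_filter = rng.standard_normal((6, 6))
print(fft_correlate(feature_map, part_filter).shape)   # (115, 155) candidate placements
```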
Procedia PDF Downloads 282
6777 The Policia Internacional e de Defesa do Estado 1933–1969 and Valtiollinen Poliisi 1939–1948 on Screen: Comparing and Contrasting the Images of the Political Police in Portuguese and Finnish Films between the 1930s and the 1960s
Authors: Riikka Elina Kallio
Abstract:
The phrase “the walls have ears” defines the era of dictatorship in Portugal (1926–1974) and the decades of political unrest in Finland (1917–1948). The phrase refers to political policing by the secret police, the PIDE (Policia Internacional e de Defesa do Estado, 1933–1969) in Portugal and the VALPO (Valtiollinen Poliisi, 1939–1948) in Finland. Free speech in any public space, and even at private events, could be fatal: members of the PIDE/VALPO or informers/collaborators could be listening. Strict censorship under Salazar's regime controlled the media, for example newspapers, music, and the film industry. Similarly, politically influenced censorship shaped the media in Finland in those decades of unrest. This article examines the similarities and differences in the images of the political police in Finland and Portugal by analyzing Finnish and Portuguese films from the nineteen-thirties to the nineteen-sixties. The text addresses two main research questions: what are the common and different features in the representations of the Finnish and Portuguese political police in films between the 1930s and 1960s, and how did national censorship affect these representations? The study's approach is interdisciplinary, combining film studies and criminology. Close reading is a practical qualitative method for analyzing films, and in this study close reading emphasizes the features of the police officer. Criminology provides the methodological tools for the analysis of the universal features of the police and common European policies. The characterization of the police in this study is based on Robert Reiner's 1980s and Timo Korander's 2010s definitions of the police officer. The research material consisted of Portuguese films from online film archives and Finnish films from the Movie Making Finland project's metadata, which offered suitable material by data mining keywords such as poliisi, poliisipäällikkö and konstaapeli (police, police chief, police constable). The findings of this study suggest that even though there are common features in the images of the political police in Finland and Portugal, there are still national and cultural differences in the representations of the political police and policing.
Keywords: censorship, film studies, images, PIDE, political police, VALPO
Procedia PDF Downloads 74
6776 Analysing Modern City Heritage through Modernization Transformation: A Case of Wuhan, China
Authors: Ziwei Guo, Liangping Hong, Zhiguo Ye
Abstract:
The exogenous modernization process in China and other late-coming countries did not result from a gradual growth of their own modern features, but from a conscious response to external challenges. In this context, it was equally important for Chinese cities to make themselves 'Chinese' as well as 'modern'. Wuhan was the first inland treaty port opened in the late Qing Dynasty. Over the following one hundred years, Wuhan transformed from a feudal town into a modern industrial city. It is a good example for illustrating urban construction and cultural heritage through the process and impact of social transformation. An overall perspective on transformation will contribute to developing the city's uniqueness and enhancing its inclusive development. The study takes the history of Wuhan from 1861 to 1957 as the study period. The whole transformation process is divided into four typical periods based on key historical events, and the paper analyzes the changes in urban structure and construction activities in each period. Then, numerous examples are used to compare the features of Wuhan's modern city heritage across the four periods. In this way, three characteristics of Wuhan's modern city heritage are summarized. The paper finds that globalization and localization worked together to shape the urban physical space environment. For Wuhan, social transformation had a profound and comprehensive impact on urban construction, which can be analyzed in terms of main construction, architectural style, location, and actors. Moreover, the three towns of Wuhan have disparate cityscapes, reflected in the varied heritage and architectural features of the different transformation periods. Lastly, the protection regulations and conservation planning of heritage in Wuhan are discussed, and suggestions for the conservation of Wuhan's modern heritage are drawn. The study provides a new perspective on modern city heritage for cities like Wuhan, and future local planning systems and heritage conservation policies can take into consideration the 'Modern Cultural Transformation Route' proposed in this paper.
Keywords: modern city heritage, transformation, identity, Wuhan
Procedia PDF Downloads 132
6775 Reduction of Defects Using Seven Quality Control Tools for Productivity Improvement at Automobile Company
Authors: Abdul Sattar Jamali, Imdad Ali Memon, Maqsood Ahmed Memon
Abstract:
Production quality with near-zero defects is an objective of every manufacturing and service organization. In order to maintain and improve quality by reducing defects, statistical tools are used by organizations; many such tools are available to assess quality. In view of their importance, the traditional seven quality control (7QC) tools have long been used in manufacturing and the automobile industry, and in this work they were applied at an automobile company in Pakistan. A preliminary survey was carried out for the implementation of the 7QC tools in the assembly line, during which two inspection points were chosen for data collection: the chassis line and the trim line. Defect data at the chassis line and trim line were collected to reduce defects and ultimately improve productivity. Each of the 7QC tools showed its benefits in the results. Flow charts were developed for a better understanding of the inspection points for data collection, and check sheets were developed to help with defect data collection. Histograms represent the severity level of defects, and Pareto charts show the cumulative effect of defects. Cause-and-effect diagrams were developed to find the root causes of each defect, scatter diagrams were used to show whether defects are increasing or decreasing, and p-control charts were developed to show out-of-control points beyond the limits for corrective actions. The successful implementation of the 7QC tools at the inspection points resulted in a considerable reduction in defect levels: in the chassis line, defects were reduced from 132 to 13, a reduction of 90%, and in the trim line from 157 to 28, a reduction of 82%. Since the company had previously exercised only a few of the 7QC tools, it was not fully reaping the benefits of their application. Therefore, it is suggested that the company establish a mechanism for the application of the 7QC tools in every section.
Keywords: check sheet, cause and effect diagram, control chart, histogram
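As an illustration of the p-control chart mentioned above, the sketch below computes 3-sigma control limits for daily defect fractions; the inspected counts and defect counts are placeholders, not the company's records.

```python
# Sketch of p-control chart limits for defect fractions at an inspection point.
import numpy as np

inspected = np.array([50, 50, 50, 50, 50, 50, 50, 50])   # units checked per day (placeholder)
defective = np.array([6, 4, 7, 3, 5, 9, 2, 4])           # defective units found (placeholder)

p = defective / inspected
p_bar = defective.sum() / inspected.sum()                 # average defect fraction
sigma = np.sqrt(p_bar * (1 - p_bar) / inspected)          # per-sample standard error
ucl, lcl = p_bar + 3 * sigma, np.maximum(p_bar - 3 * sigma, 0)

for day, (pi, u, l) in enumerate(zip(p, ucl, lcl), start=1):
    flag = "OUT OF CONTROL" if (pi > u or pi < l) else "ok"
    print(f"day {day}: p = {pi:.3f} (LCL {l:.3f}, UCL {u:.3f}) {flag}")
```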
Procedia PDF Downloads 327
6774 Improving Security Features of Traditional Automated Teller Machines-Based Banking Services via Fingerprint Biometrics Scheme
Authors: Anthony I. Otuonye, Juliet N. Odii, Perpetual N. Ibe
Abstract:
The obvious challenges faced by most commercial bank customers while using the services of ATMs (Automated Teller Machines) across developing countries have triggered the need for an improved system with better security features. Current ATM systems are password-based, and research has demonstrated the vulnerability of these systems to attacks and manipulation. Our research has found that the security of current ATM-assisted banking services in most developing countries is easily broken and manipulated by fraudsters, mainly because it is quite difficult for these systems to distinguish an impostor with privileged access from the authentic bank account owner. Again, PIN (Personal Identification Number) code passwords are easily guessed, to mention just a few of the obvious limitations of traditional ATM operations. In this research work, we have developed a system of fingerprint biometrics with PIN-code authentication that seeks to improve the security features of traditional ATM installations as well as other banking services. The aim is to ensure better security at all ATM installations and raise the confidence of bank customers. It is hoped that our system will overcome most of the challenges of current password-based ATM operation if properly applied. The researchers made use of OOADM (Object-Oriented Analysis and Design Methodology), a software development methodology that assures proper system design using modern design diagrams. Implementation and coding were carried out using Visual Studio 2010 together with other software tools. The results show a working system that provides two levels of security on the client's side, using a fingerprint biometric scheme combined with the existing 4-digit PIN code to guarantee the confidence of bank customers across developing countries.
Keywords: fingerprint biometrics, banking operations, verification, ATMs, PIN code
Procedia PDF Downloads 46
6773 Statistical Analysis of the Impact of Maritime Transport Gross Domestic Product (GDP) on Nigeria’s Economy
Authors: Kehinde Peter Oyeduntan, Kayode Oshinubi
Abstract:
Nigeria is referred to as the 'Giant of Africa' due to its high population, land mass, and large economy. However, it still trails far behind many smaller economies on the continent in terms of maritime operations. Since the maritime industry is the spark plug for national growth, housing the most crucial infrastructure that generates wealth for a nation, it is worrisome that a nation with six seaports lags in maritime activities. In this research, we studied how the Gross Domestic Product (GDP) of maritime transport influences the Nigerian economy. To do this, we applied Simple Linear Regression (SLR), Support Vector Machine (SVM), Polynomial Regression Model (PRM), Generalized Additive Model (GAM), and Generalized Linear Mixed Model (GLMM) to model the relationship between the nation's Total GDP (TGDP) and the Maritime Transport GDP (MGDP) using 20 years of time series data. The result showed that the MGDP is statistically significant to the Nigerian economy. Among the statistical tools applied, the PRM of order 4 describes the relationship best. The recommendations presented in this study will guide policy makers and help improve the economy of Nigeria in terms of its GDP.
Keywords: maritime transport, economy, GDP, regression, port
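A minimal sketch of fitting and comparing polynomial regressions of TGDP on MGDP is given below; the 20-point series is synthetic, and the in-sample R-squared comparison only illustrates the model-selection idea, not the study's procedure.

```python
# Sketch of polynomial regression of total GDP on maritime transport GDP (synthetic series).
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score

rng = np.random.default_rng(11)
mgdp = np.linspace(1.0, 6.0, 20).reshape(-1, 1)                    # maritime GDP, placeholder units
tgdp = 40 + 12 * mgdp[:, 0] - 0.8 * mgdp[:, 0] ** 2 + rng.normal(0, 1.5, 20)

for degree in (1, 2, 4):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression()).fit(mgdp, tgdp)
    print(f"degree {degree}: R^2 = {r2_score(tgdp, model.predict(mgdp)):.3f}")
```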
Procedia PDF Downloads 155
6772 A Statistical Approach to Air Pollution in Mexico City and Its Impacts on Well-Being
Authors: Ana B. Carrera-Aguilar , Rodrigo T. Sepulveda-Hirose, Diego A. Bernal-Gurrusquieta, Francisco A. Ramirez Casas
Abstract:
In recent years, Mexico City has presented high levels of atmospheric pollution; the city is also an example of the inequality and poverty that affect metropolitan areas around the world. This combination of social and economic exclusion, coupled with high levels of pollution, evidences a loss of well-being among the population. The effect of air pollution on quality of life is an area of study that has been overlooked. The purpose of this study is to find relations between air quality and quality of life in Mexico City through statistical analysis of a regression model and principal component analysis of several atmospheric contaminants (CO, NO₂, ozone, particulate matter, SO₂) and well-being indexes (HDI, poverty, inequality, life expectancy, and health care index). The data correspond to official information (INEGI, SEDEMA, and CEPAL) for 2000-2018. Preliminary results show that the Human Development Index (HDI) is affected by the impacts of pollution, and its indicators are reduced in the presence of contaminants. It is necessary to promote a strong interest in this issue in Mexico City; otherwise, the problem will not only remain but worsen, affecting those who have the least and the population's well-being in a generalized way.
Keywords: air quality, Mexico City, quality of life, statistics
Procedia PDF Downloads 144
6771 Radiomics: Approach to Enable Early Diagnosis of Non-Specific Breast Nodules in Contrast-Enhanced Magnetic Resonance Imaging
Authors: N. D'Amico, E. Grossi, B. Colombo, F. Rigiroli, M. Buscema, D. Fazzini, G. Cornalba, S. Papa
Abstract:
Purpose: To characterize, through a radiomic approach, the nature of nodules considered non-specific by expert radiologists, recognized in magnetic resonance mammography (MRm) with T1-weighted (T1w) sequences with paramagnetic contrast. Material and Methods: 47 cases out of 1200 undergoing MRm, in which the MRm assessment gave an uncertain classification (non-specific nodules), were admitted to the study. The clinical outcome of the non-specific nodules was later established through follow-up or further exams (biopsy), yielding 35 benign and 12 malignant nodules. All MR images were acquired at 1.5 T: a first basal T1w sequence and then four T1w acquisitions after the paramagnetic contrast injection. After manual segmentation of the lesions by a radiologist and the extraction of 150 radiomic features (30 features at each of the 5 time points), a machine learning (ML) approach was used. An evolutionary algorithm (the TWIST system, based on the KNN algorithm) was used to subdivide the dataset into training and validation sets and to select the features yielding the maximal amount of information. After this pre-processing, different machine learning systems were applied to develop a predictive model based on a training-testing crossover procedure. 10 cases with a benign nodule (follow-up older than 5 years) and 18 with an evident malignant tumor (clear malignant histological exam) were added to the dataset in order to allow the ML system to better learn from the data. Results: The NaiveBayes algorithm, working on 79 features selected by the TWIST system, was the best-performing ML system, with a sensitivity of 96%, a specificity of 78%, and a global accuracy of 87% (average values of the two training-testing procedures, ab-ba). The results showed that, in the subset of 47 non-specific nodules, the algorithm predicted the outcome of 45 nodules that an expert radiologist could not identify. Conclusion: In this pilot study, we identified a radiomic approach allowing ML systems to perform well in the diagnosis of non-specific nodules at MR mammography. This algorithm could be a great support for the early diagnosis of malignant breast tumors when the radiologist is unable to identify the kind of lesion, and it reduces the need for long follow-up. Clinical Relevance: This machine learning algorithm could be essential to support the radiologist in the early diagnosis of non-specific nodules, in order to avoid strenuous follow-up and painful biopsy for the patient.
Keywords: breast, machine learning, MRI, radiomics
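The evaluation described above (a Naive Bayes classifier scored by sensitivity and specificity) can be sketched as follows; the random feature matrix stands in for the 150 radiomic features, so the printed numbers are meaningless and only the mechanics are shown.

```python
# Sketch of a Gaussian Naive Bayes classifier evaluated with sensitivity and specificity.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(42)
X = rng.normal(size=(75, 150))              # 75 lesions x 150 radiomic features (placeholder)
y = rng.integers(0, 2, size=75)             # 0 = benign, 1 = malignant (placeholder)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
pred = GaussianNB().fit(X_tr, y_tr).predict(X_te)

tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```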
Procedia PDF Downloads 269
6770 Neuroanatomical Specificity in Reporting & Diagnosing Neurolinguistic Disorders: A Functional & Ethical Primer
Authors: Ruairi J. McMillan
Abstract:
Introduction: This critical analysis aims to ascertain how well neuroanatomical aetiologies are communicated within 20 case reports of aphasia. Neuroanatomical visualisations based on dissected brain specimens were produced and combined with white matter tract and vascular taxonomies of function in order to address the most consistently underreported features found within the aphasic case study reports. Together, these approaches are intended to integrate aphasiological knowledge from the past 20 years with aphasiological diagnostics and to act as prototypal resources for both researchers and clinical professionals. The medico-legal precedent for aphasia diagnostics under Canadian, US, and UK case law, and the neuroimaging/neurological diagnostics relative to the functional capacity of aphasic patients, are discussed in relation to the major findings of the literature analysis, neuroimaging protocols in clinical use today, and the neuroanatomical aetiologies of different aphasias. Basic Methodology: Literature searches of relevant scientific databases (e.g., OVID Medline) were carried out using search terms such as aphasia case study (year) and stroke-induced aphasia case study. A series of 7 diagnostic reporting criteria was formulated, and the resulting case studies were scored out of 7 alongside clinical stroke criteria. Statistical testing established whether specific reporting criteria were associated with higher overall scores and potentially inferable increases in quality of reporting. Whether criteria scores were associated with an unclear/adjusted diagnosis was also tested, as well as the probability of a given criterion deviating from an expected estimate. Major Findings: The quantitative analysis of neuroanatomically driven diagnostics in case studies of aphasia revealed particularly low scores in the connection of neuroanatomical functions to aphasiological assessment (10%) and in the inclusion of white matter tracts within neuroimaging or assessment diagnostics (30%). Case studies which included clinical mention of white matter tracts within the report itself were distributed among the higher-scoring cases, as were case studies which (as clinically indicated) related the affected vascular region to the brain parenchyma of the language network. Concluding Statement: These findings indicate that certain neuroanatomical functions are integrated less often within the patient report than others, despite a precedent for well-integrated neuroanatomical aphasiology also being found among the case studies sampled, and despite these functions being clinically essential in diagnostic neuroimaging and aphasiological assessment. Therefore, the integration and specificity of aetiological neuroanatomy may ultimately contribute positively to the capacity and autonomy of aphasic patients as well as their clinicians. The integration of a full aetiological neuroanatomy within the reporting of aphasias may improve patient outcomes and sustain autonomy in the event of medico-ethical investigation.
Keywords: aphasia, language network, functional neuroanatomy, aphasiological diagnostics, medico-legal ethics
Procedia PDF Downloads 67
6769 Development of a Real-Time Brain-Computer Interface for Interactive Robot Therapy: An Exploration of EEG and EMG Features during Hypnosis
Authors: Maryam Alimardani, Kazuo Hiraki
Abstract:
This study presents a framework for the development of a new generation of therapy robots that can interact with users by monitoring their physiological and mental states. Here, we focus on one of the more controversial methods of therapy, hypnotherapy. Hypnosis has been shown to be useful in the treatment of many clinical conditions, and even for healthy people it can be used as an effective technique for relaxation or for enhancement of memory and concentration. Our aim is to develop a robot that collects information about the user's mental and physical states using electroencephalogram (EEG) and electromyography (EMG) signals and performs cost-effective hypnosis in the comfort of the user's home. The presented framework consists of three main steps: (1) find the EEG correlates of mind state before, during, and after hypnosis and establish a cognitive model for state changes; (2) develop a system that can track the changes in EEG and EMG activities in real time and determine whether the user is ready for suggestion; and (3) implement our system in a humanoid robot that will talk to and conduct hypnosis on users based on their mental states. This paper presents a pilot study regarding the first stage, the detection of EEG and EMG features during hypnosis.
Keywords: hypnosis, EEG, robotherapy, brain-computer interface (BCI)
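A common first step when searching for EEG correlates of mental state is band-power feature extraction; the sketch below computes theta/alpha/beta power from a single channel with Welch's method. The sampling rate, band edges, and synthetic signal are assumptions, not the study's recording setup.

```python
# Sketch of EEG band-power features for tracking mental state over a session.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed EEG sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(eeg_channel):
    freqs, psd = welch(eeg_channel, fs=FS, nperseg=FS * 2)
    df = freqs[1] - freqs[0]
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].sum() * df)
            for name, (lo, hi) in BANDS.items()}

rng = np.random.default_rng(8)
segment = rng.standard_normal(FS * 10)   # 10 s of synthetic single-channel EEG
print(band_powers(segment))              # e.g. alpha/theta changes can be tracked in real time
```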
Procedia PDF Downloads 258
6768 Application of Improved Semantic Communication Technology in Remote Sensing Data Transmission
Authors: Tingwei Shu, Dong Zhou, Chengjun Guo
Abstract:
Semantic communication is an emerging form of communication that realizes intelligent communication by extracting the semantic information of data at the source, transmitting it, and recovering the data at the receiving end. It can effectively solve the problem of data transmission under conditions of large data volume, low SNR, and restricted bandwidth. With the development of deep learning, semantic communication has matured further and is gradually being applied in fields such as the Internet of Things, Unmanned Aerial Vehicle cluster communication, and remote sensing scenarios. We propose an improved semantic communication system for situations where the data volume is huge and spectrum resources are limited during the transmission of remote sensing images. At the transmitting end, we need to extract the semantic information of remote sensing images, but there are some problems. Traditional semantic communication systems based on Convolutional Neural Networks cannot take into account both the global and local semantic information of the image, which results in less-than-ideal image recovery at the receiving end. Therefore, we adopt an improved Vision-Transformer-based structure as the semantic encoder, instead of the mainstream CNN-based one, to extract the image semantic features. In this paper, we first perform pre-processing operations on remote sensing images to improve their resolution in order to obtain images with more semantic information. We use the wavelet transform to decompose the image into high-frequency and low-frequency components, perform bilinear interpolation on the high-frequency components and bicubic interpolation on the low-frequency components, and finally perform the inverse wavelet transform to obtain the pre-processed image. We adopt the improved Vision-Transformer structure as the semantic coder to extract and transmit the semantic information of remote sensing images. The Vision-Transformer structure can better handle the huge data volume and extract better image semantic features, and it adopts a multi-layer self-attention mechanism to better capture the correlation between semantic features and reduce redundant features. Secondly, to improve the coding efficiency, we reduce the quadratic complexity of the self-attention mechanism to linear so as to improve the image data processing speed of the model. We conducted experimental simulations on the RSOD dataset and compared the designed system with a semantic communication system based on CNN and with image coding methods such as BPG and JPEG to verify that the method can effectively alleviate the problem of excessive data volume and improve the performance of image data communication.
Keywords: semantic communication, transformer, wavelet transform, data processing
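The wavelet-based resolution-enhancement step can be sketched as follows: decompose with a 2-D DWT, upsample the sub-bands (cubic spline as an approximation of bicubic for the low-frequency band, linear for the detail bands), then invert. The library choices (PyWavelets, SciPy), the Haar wavelet, and the 2x scale factor are assumptions, not the paper's exact configuration.

```python
# Sketch of wavelet-domain upscaling of a remote sensing band before semantic encoding.
import numpy as np
import pywt
from scipy.ndimage import zoom

def wavelet_upscale(image, wavelet="haar"):
    cA, (cH, cV, cD) = pywt.dwt2(image, wavelet)                    # low-frequency and detail sub-bands
    cA_up = zoom(cA, 2, order=3)                                    # cubic spline ~ bicubic, low frequencies
    details_up = tuple(zoom(c, 2, order=1) for c in (cH, cV, cD))   # bilinear for high frequencies
    return pywt.idwt2((cA_up, details_up), wavelet)                 # roughly 2x the original resolution

img = np.random.default_rng(2).random((128, 128))    # stand-in for a remote sensing band
print(wavelet_upscale(img).shape)                    # (256, 256)
```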
Procedia PDF Downloads 80
6767 Analysis of Organizational Factors Effect on Performing Electronic Commerce Strategy: A Case Study of the Namakin Food Industry
Authors: Seyed Hamidreza Hejazi Dehghani, Neda Khounsari
Abstract:
The quick growth of electronic commerce in developed countries means that developing nations must fundamentally change their commerce strategies. Most organizations are aware of the impact of the Internet and e-commerce on the future of their firm, and thus they have to focus on the organizational factors that affect the deployment of an e-commerce strategy. In this situation, it is essential to identify organizational factors such as organizational culture, human resources, size, structure, and product/service that impact an e-commerce strategy. Accordingly, this research specifies the effects of organizational factors on applying an e-commerce strategy in the Namakin food industry. The statistical population of this research is 95 managers and employees. Cochran's formula is used to determine the sample size, which is 77 out of the statistical population. SPSS and Smart PLS software were utilized for analyzing the collected data. The results of hypothesis testing show that organizational factors have positive and significant effects on applying an e-commerce strategy. On the other hand, the sub-hypotheses concerning the organizational culture and size criteria were rejected, while the other sub-hypotheses were accepted.
Keywords: electronic commerce, organizational factors, attitude of managers, organizational readiness
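Cochran's formula with the finite-population correction reproduces the stated sample size of 77 from a population of 95 under common assumptions (95% confidence, p = 0.5, 5% margin of error); these parameter values are assumed here, not stated in the abstract.

```python
# Cochran's sample-size formula with finite-population correction (assumed parameters).
import math

def cochran_sample_size(population, z=1.96, p=0.5, e=0.05):
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)       # infinite-population sample size (~384)
    n = n0 / (1 + (n0 - 1) / population)         # finite-population correction
    return math.ceil(n)

print(cochran_sample_size(95))   # -> 77
```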
Procedia PDF Downloads 282
6766 Juxtaposition of the Past and the Present: A Pragmatic Stylistic Analysis of the Short Story “Too Much Happiness” by Alice Munro
Authors: Inas Hussein
Abstract:
Alice Munro is a Canadian short-story writer who has been regarded as one of the greatest writers of fiction. Owing to her great contribution to fiction, she was the first Canadian woman and the only short-story writer ever to be awarded the Nobel Prize for Literature, in 2013. Her literary works include collections of short stories and one book published as a novel. Her stories concentrate on the human condition and human relationships as seen through the lens of daily life. The setting in most of her stories is her native Canada: small towns much like the one where she grew up. Her writing style is not only realistic but is also characterized by autobiographical, historical, and regional features. The aim of this research is to analyze one of the key stylistic devices often adopted by Munro in her fiction, the juxtaposition of the past and the present, with reference to the title story of Munro's short story collection Too Much Happiness. The story under exploration is a brief biography of the Russian mathematician and novelist Sophia Kovalevsky (1850–1891), the first woman to be appointed as a professor of mathematics at a European university, in Stockholm. Thus, the story has a historical protagonist and is set on the European continent. Munro dramatizes the severe historical and cultural constraints that hindered the career of the protagonist. A pragmatic stylistic framework is adopted, and the qualitative analysis is supported by textual reference. The stylistic analysis reveals that the juxtaposition of the past and the present is one of the distinctive features that characterize the author; in a typical Munrovian manner, the protagonist often moves between the units of time: the past, the present, and, sometimes, the future. Munro's style is simple and direct but cleverly constructed and densely complicated by the presence of deeper layers and stories within the story. The findings of the research reveal that the story under investigation merits reading and analyzing. It is recommended that this story and other stories by Munro be analyzed to further explore the features of her art and style.
Keywords: Alice Munro, Too Much Happiness, style, stylistic analysis
Procedia PDF Downloads 146
6765 Data and Spatial Analysis for Economy and Education of 28 E.U. Member-States for 2014
Authors: Alexiou Dimitra, Fragkaki Maria
Abstract:
The objective of the paper is the study of geographic, economic, and educational variables and their contribution to determining the position of each member state among the EU-28 countries, based on the values of seven variables as given by Eurostat. The data analysis methods of Multiple Factorial Correspondence Analysis (MFCA), Principal Component Analysis, and Factor Analysis have been used. The cross-tabulation tables of data consist of the values of seven variables for the 28 countries for 2014. The data are manipulated using the CHIC Analysis V 1.1 software package. The results of this program, using MFCA and Ascending Hierarchical Classification, are given in arithmetic and graphical form. For comparison, the Factor procedure of the statistical package IBM SPSS 20 has been applied to the same data. The numerical and graphical results, presented with tables and graphs, demonstrate the agreement between the two methods. The most important result is the study of the relation between the 28 countries and the position of each country in groups or clouds, which are formed according to the values of the corresponding variables.
Keywords: Multiple Factorial Correspondence Analysis, Principal Component Analysis, Factor Analysis, E.U.-28 countries, statistical package IBM SPSS 20, CHIC Analysis V 1.1 software, Eurostat.eu statistics
Procedia PDF Downloads 513
6764 The Factors Affecting the Operations of the Industrial Enterprises of Cassava in the Northeast of Thailand
Authors: Thanasuwit Thabhiranrak
Abstract:
This research aims to study the factors that affected the operations of cassava industrial enterprises in the northeast of Thailand. Hypotheses were tested by regression analysis, and Pearson correlation was used to determine the relationships between the variables across groups involved in the cassava process, including business owners, executives, and supervisors. The research sample was 400 people in the northeast region of Thailand. The results revealed that the success of entrepreneurs is positively related to transformational leadership and knowledge management at the 0.01 level of statistical significance, and respondents also emphasized the importance of transformational leadership factors. Individual factors and the use of intelligence affect the success of entrepreneurs in the cassava industry at the 0.05 level of statistical significance. Qualitative data were also collected by interviewing operational-level staff, supervisors, executives, and enterprise owners in the northeast of Thailand. It was found that knowledge management was important in their business operations. Personnel in the organizations should learn from working experience, develop their skills, and increase their knowledge through education.
Keywords: transformational leadership, knowledge management (KM), cassava, northeast of Thailand, industrial
Procedia PDF Downloads 305