Search results for: closest facility analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28374

27024 A Network Approach to Analyzing Financial Markets

Authors: Yusuf Seedat

Abstract:

The necessity to understand global financial markets has increased following the spread of the recent financial crisis around the world. Financial markets are considered to be complex systems consisting of highly volatile movements whose indexes fluctuate without any clear pattern. Analytic methods for stock prices have been proposed in which financial markets are modeled using common network analysis tools and methods. It has been found that two key components of social network analysis are relevant to modeling financial markets, allowing accurate forecasts of stock prices within the financial market. Financial markets have a number of interacting components, leading to complex behavioral patterns. This paper describes a social network approach to analyzing financial markets as a viable way of studying how complex stock markets function. We also look at how social network analysis techniques and metrics are used to gauge the evolution of financial markets, as well as how community detection can be used to qualify and quantify influence within a network.
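The correlation-network idea described above can be sketched as follows; the tickers, returns, and threshold are invented for illustration, not drawn from the study:

```python
import numpy as np

# Toy daily returns for four hypothetical stocks (rows = days, cols = stocks).
rng = np.random.default_rng(0)
returns = rng.normal(0, 0.01, size=(250, 4))
returns[:, 1] += 0.5 * returns[:, 0]   # make stock B co-move with stock A

tickers = ["A", "B", "C", "D"]
corr = np.corrcoef(returns, rowvar=False)

# Edge list: connect stocks whose absolute return correlation exceeds a threshold.
threshold = 0.3
edges = [(tickers[i], tickers[j])
         for i in range(len(tickers))
         for j in range(i + 1, len(tickers))
         if abs(corr[i, j]) > threshold]

# Degree (number of incident edges) is a simple centrality measure per stock.
degree = {t: sum(t in e for e in edges) for t in tickers}
```

Community detection and richer centrality metrics would then run on this graph (e.g. with a network library), but the thresholded correlation matrix is the common starting point.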

Keywords: network analysis, social networks, financial markets, stocks, nodes, edges, complex networks

Procedia PDF Downloads 191
27023 Comparison of Power Generation Status of Photovoltaic Systems under Different Weather Conditions

Authors: Zhaojun Wang, Zongdi Sun, Qinqin Cui, Xingwan Ren

Abstract:

Based on multivariate statistical analysis theory, this paper uses the principal component analysis method, the Mahalanobis distance analysis method, and a fitting method to establish a photovoltaic health model that evaluates the health of photovoltaic panels. First of all, according to weather conditions, the photovoltaic panel data are classified into five categories: sunny, cloudy, rainy, foggy, and overcast. The health of photovoltaic panels under these five types of weather is studied. Secondly, scatterplots of the relationship between the amount of electricity produced under each kind of weather and the other variables were plotted. It was found that the amount of electricity generated by photovoltaic panels has a significant nonlinear relationship with time. The fitting method was used to fit the relationship between the amount of electricity generated and time, and a nonlinear equation was obtained. Then, the principal component analysis method was used to analyze the independent variables under the five weather conditions. According to the Kaiser-Meyer-Olkin test, three types of weather, namely overcast, foggy, and sunny, meet the conditions for factor analysis, while cloudy and rainy weather do not. Through principal component analysis, the main components of overcast weather are temperature, AQI, and PM2.5; the main component of foggy weather is temperature; and the main components of sunny weather are temperature, AQI, and PM2.5. Cloudy and rainy weather require analysis of all of their variables, namely temperature, AQI, PM2.5, solar radiation intensity, and time. Finally, taking the variable values in sunny weather as observed values and the main components of cloudy, foggy, overcast, and rainy weather as sample data, the Mahalanobis distances between the observed values and these sample values are obtained.
A comparative analysis was carried out to compare the degree of deviation of the Mahalanobis distance and so determine the health of the photovoltaic panels under different weather conditions. It was found that, ordered from small to large Mahalanobis distance fluctuations, the weather conditions were: foggy, cloudy, overcast, and rainy.
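A minimal sketch of the two core techniques named in the abstract, PCA and the Mahalanobis distance, using synthetic "sunny weather" records (the variables and numbers are assumptions, not the study's data):

```python
import numpy as np

# Synthetic hourly records for sunny weather: temperature (C), AQI, PM2.5.
rng = np.random.default_rng(1)
sunny = rng.normal([25.0, 60.0, 35.0], [3.0, 10.0, 8.0], size=(200, 3))

# PCA via eigendecomposition of the sample covariance matrix.
mean = sunny.mean(axis=0)
cov = np.cov(sunny, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]              # components by explained variance
explained = eigvals[order] / eigvals.sum()

# Mahalanobis distance of a new observation from the sunny-weather cloud.
cov_inv = np.linalg.inv(cov)

def mahalanobis(x):
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

typical = mahalanobis(mean + np.array([1.0, 2.0, 1.0]))     # small deviation
outlier = mahalanobis(mean + np.array([15.0, 50.0, 40.0]))  # large deviation
```

A larger Mahalanobis distance means the observation deviates more from the reference distribution, which is the sense in which the study compares panel health across weather types.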

Keywords: fitting, principal component analysis, Mahalanobis distance, SPSS, MATLAB

Procedia PDF Downloads 144
27022 Research of the Three-Dimensional Visualization Geological Modeling of Mine Based on Surpac

Authors: Honggang Qu, Yong Xu, Rongmei Liu, Zhenji Gao, Bin Wang

Abstract:

Today's mining industry is advancing gradually toward digital and visual methods. Three-dimensional visualization geological modeling of a mine is the digital characterization of mineral deposits and one of the key technologies of digital mining. Three-dimensional geological modeling is a technology that combines geological spatial information management, geological interpretation, geological spatial analysis and prediction, geostatistical analysis, entity content analysis, and graphic visualization in a three-dimensional computer environment, and it is used in geological analysis. In this paper, a three-dimensional geological model of an iron mine is constructed using Surpac, the differences between two estimation methods, inverse distance weighting and ordinary kriging, are studied, and the ore body volume and reserves are simulated and calculated with both methods. Compared with the actual mine reserves, the results are relatively accurate, providing a scientific basis for mine resource assessment, reserve calculation, mining design, and so on.
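A hedged sketch of one of the two estimators compared above, inverse distance weighting; the drillhole coordinates and grades are invented, and ordinary kriging would additionally weight by a fitted variogram:

```python
import numpy as np

# Hypothetical drillhole samples: (x, y, z) coordinates and assayed Fe grade (%).
samples = np.array([
    [0.0,  0.0,  0.0, 30.0],
    [10.0, 0.0,  0.0, 34.0],
    [0.0,  10.0, 0.0, 28.0],
    [10.0, 10.0, 0.0, 32.0],
])

def idw_estimate(point, samples, power=2.0):
    """Inverse-distance-weighted grade estimate at `point`."""
    coords, grades = samples[:, :3], samples[:, 3]
    dists = np.linalg.norm(coords - point, axis=1)
    if np.any(dists == 0):                 # point coincides with a sample
        return float(grades[np.argmin(dists)])
    weights = 1.0 / dists ** power
    return float(np.sum(weights * grades) / np.sum(weights))

# Grade at a block centre equidistant from all four samples.
grade = idw_estimate(np.array([5.0, 5.0, 0.0]), samples)
```

Applied block by block over the model, such estimates are what the reserve volumes and tonnages are summed from.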

Keywords: three-dimensional geological modeling, geological database, geostatistics, block model

Procedia PDF Downloads 77
27021 Accessibility and Visibility through Space Syntax Analysis of the Linga Raj Temple in Odisha, India

Authors: S. Pramanik

Abstract:

Since the early ages, Hindu temples have been interpreted through various Vedic philosophies. These temples are visited by pilgrims and demonstrate the rituals and religious beliefs of communities, reflecting a variety of actions and behaviors. Darsana, a direct seeing, is a part of the pilgrimage activity. During the process of Darsana, a devotee is prepared for entry into the temple to realize the cognizing Truth, culminating in visualizing the idol of God placed at the Garbhagriha (sanctum sanctorum). For this, the pilgrim must pass through a sequential arrangement of spaces. As they progress, pilgrims visualize the spaces differently from various points of view. These viewpoints create a variety of spatial patterns in the minds of pilgrims, coherent with the Hindu philosophies. The space organization and its order are perceived through various techniques of spatial analysis. A temple has been chosen for the study as an example of the Kalinga stylistic variations. This paper intends to demonstrate some visual patterns generated during the process of Darsana (visibility) and its accessibility by Point Isovist Studies and Visibility Graph Analysis from the entrance (Simha Dwara) to the sanctum sanctorum (Garbhagriha).

Keywords: Hindu temple architecture, point isovist, space syntax analysis, visibility graph analysis

Procedia PDF Downloads 120
27020 Establishment of Diagnostic Reference Levels for Computed Tomography Examination at the University of Ghana Medical Centre

Authors: Shirazu Issahaku, Isaac Kwesi Acquah, Simon Mensah Amoh, George Nunoo

Abstract:

Introduction: Diagnostic Reference Levels are important indicators for monitoring and optimizing protocols and procedures in medical imaging across facilities and equipment. They help evaluate whether, under routine clinical conditions, the median value obtained for a representative group of patients from a specified procedure is unusually high or low for that procedure. This study aimed to propose Diagnostic Reference Levels for the most common routine Computed Tomography examinations of the head, chest, and abdominopelvic regions at the University of Ghana Medical Centre. Methods: The Diagnostic Reference Levels were determined from the most common routine examinations, including head Computed Tomography with and without contrast, abdominopelvic Computed Tomography with and without contrast, and chest Computed Tomography without contrast. The study was based on two dose indicators: the volumetric Computed Tomography Dose Index and the Dose-Length Product. Results: The estimated median values for head Computed Tomography with contrast for the volumetric Computed Tomography Dose Index and Dose-Length Product were 38.33 mGy and 829.35 mGy.cm, while without contrast they were 38.90 mGy and 860.90 mGy.cm, respectively. For an abdominopelvic Computed Tomography examination with contrast, the estimated values were 40.19 mGy and 2096.60 mGy.cm; without contrast, they were 14.65 mGy and 800.40 mGy.cm, respectively. Additionally, for chest Computed Tomography examinations, the estimated values were 12.75 mGy and 423.95 mGy.cm, respectively. These median values represent the proposed diagnostic reference levels for the head, chest, and abdominopelvic regions.
Conclusions: The proposed Diagnostic Reference Levels are comparable to the recommendations of the International Atomic Energy Agency and International Commission on Radiological Protection Publication 135, and to other regional published data from the European Commission and regional national Diagnostic Reference Levels in Africa. These reference levels will serve as benchmarks to guide clinicians in optimizing radiation dose levels while ensuring accurate diagnostic image quality at the facility.
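As a sketch of how such reference levels are derived, a local DRL is typically taken as the median (50th percentile) of the per-patient dose indicator for a given examination; the readings below are invented, not the Centre's data:

```python
import numpy as np

# Hypothetical per-patient readings for one examination type
# (head CT with contrast): CTDIvol in mGy, DLP in mGy.cm.
ctdi_vol = np.array([35.1, 40.2, 38.3, 37.9, 41.0, 36.5, 39.4])
dlp = np.array([790.0, 845.5, 829.4, 812.0, 870.2, 801.3, 855.1])

# The local DRL for each indicator is the median of its distribution.
drl_ctdi = float(np.median(ctdi_vol))   # mGy
drl_dlp = float(np.median(dlp))         # mGy.cm
```

Comparing these medians against national or international DRLs is then what flags a protocol whose doses are unusually high or low.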

Keywords: diagnostic reference levels, computed tomography dose index, computed tomography, radiation exposure, dose-length product, radiation protection

Procedia PDF Downloads 50
27019 Tracing the Evolution of English and Urdu Languages: A Linguistic and Cultural Analysis

Authors: Aamna Zafar

Abstract:

Through linguistic and cultural analysis, this study seeks to trace the development of the English and Urdu languages. Along with examining how the vocabulary and syntax of English and Urdu have evolved over time and the linguistic trends visible in these changes, the study looks at the historical and cultural influences that have shaped the languages. It also considers how the use of English and Urdu has changed over time, both within each other's cultures and globally. Finally, it examines how these changes affect social relations and cultural identity, and what they might mean for the future of these languages.

Keywords: linguistic and cultural analysis, historical factors, cultural factors, vocabulary, syntax, significance

Procedia PDF Downloads 75
27018 Predicting Daily Patient Hospital Visits Using Machine Learning

Authors: Shreya Goyal

Abstract:

The study aims to build user-friendly software to understand patient arrival patterns and compute the number of potential patients who will visit a particular health facility in a given period by using a machine learning algorithm. The underlying machine learning algorithm used in this study is the Support Vector Machine (SVM). Accurate prediction of patient arrivals allows hospitals to operate more effectively, providing timely and efficient care while optimizing resources and improving the patient experience. It allows for better allocation of staff, equipment, and other resources. If there is a projected surge in patients, additional staff or resources can be allocated to handle the influx, preventing bottlenecks or delays in care. Understanding patient arrival patterns can also help streamline processes to minimize waiting times and ensure timely access to care for patients in need. Another big advantage of this software is adherence to strict data protection regulations such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States, as the hospital will not have to share the data with any third party or upload it to the cloud: the software reads data locally from the machine. The data needs to be arranged in a particular format, and the software will be able to read it and provide meaningful output. Using software that operates locally can facilitate compliance with these regulations by minimizing data exposure. Keeping patient data within the hospital's local systems reduces the risk of unauthorized access or breaches associated with transmitting data over networks or storing it on external servers. This helps maintain the confidentiality and integrity of sensitive patient information. Historical patient data is used in this study. The input variables used to train the model include patient age, time of day, day of the week, seasonal variations, and local events.
The algorithm uses a supervised learning method to optimize the objective function and find the global minimum. It stores the values of the local minima after each iteration and, at the end, compares all the local minima to find the global minimum. A strength of this study is the transfer function used to calculate the number of patients. The model has an output accuracy of >95%. The method proposed in this study could be used for better planning of personnel and medical resources.
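A minimal sketch of the approach, assuming support vector regression on synthetic arrival data; the feature set, counts, and kernel settings are illustrative assumptions, and the study's actual data and transfer function are not reproduced here:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
n = 300
# Hypothetical historical features: hour of day, day of week (0-6), month (1-12).
hour = rng.integers(0, 24, n)
dow = rng.integers(0, 7, n)
month = rng.integers(1, 13, n)
# Synthetic visit counts with a daytime peak and a weekend dip.
visits = 50 + 10 * np.sin(hour / 24 * 2 * np.pi) - 5 * (dow >= 5) + rng.normal(0, 2, n)

X = np.column_stack([hour, dow, month])
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X, visits)

pred = model.predict(np.array([[9, 1, 6]]))   # 9 a.m. on a Tuesday in June
```

Because both training and prediction run on the local machine, nothing in this pipeline requires sending patient records to a third party, which is the HIPAA-related point the abstract makes.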

Keywords: machine learning, SVM, HIPAA, data

Procedia PDF Downloads 65
27017 Simplifying Seismic Vulnerability Analysis for Existing Reinforced Concrete Buildings

Authors: Maryam Solgi, Behzad Shahmohammadi, Morteza Raissi Dehkordi

Abstract:

One of the main steps in the seismic retrofitting of buildings is to determine the vulnerability of structures. Current procedures for evaluating existing buildings are complicated, and they draw no distinction between short, mid-rise, and tall buildings. This research utilizes a simplified method for assessing structures that is adequate for existing reinforced concrete buildings. To this end, the Simple Lateral Mechanisms Analysis (SLaMA) procedure proposed by the NZSEE (New Zealand Society for Earthquake Engineering) has been carried out. In this study, three RC moment-resisting frame buildings are examined. First, these buildings are evaluated by an inelastic static procedure (pushover) based on acceptance criteria. Then, the Park-Ang damage index is determined for all members of each building by inelastic time history analysis. Next, the Simple Lateral Mechanisms Analysis procedure, a hand method, is carried out to define the capacity of the structures. Ultimately, the procedures are compared using the peak ground acceleration that causes failure (PGAfail). The results of this comparison emphasize that the pushover procedure and the SLaMA method yield a greater value of PGAfail than the Park-Ang damage model.
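The Park-Ang damage index combines the peak displacement demand with the cumulative hysteretic energy of a member; a hedged numerical illustration with assumed member quantities (not values from the study) is:

```python
# Park-Ang damage index, DI = d_m/d_u + beta * E_h / (F_y * d_u), for one
# RC member.  All numbers below are assumed for illustration only.
delta_m = 0.06    # peak displacement demand under the ground motion (m)
delta_u = 0.10    # ultimate displacement capacity under monotonic load (m)
beta = 0.05       # non-negative strength-degradation parameter
E_h = 40.0        # cumulative hysteretic energy dissipated (kN.m)
F_y = 150.0       # yield strength (kN)

damage_index = delta_m / delta_u + beta * E_h / (F_y * delta_u)
# A common reading: DI < 0.4 repairable damage, DI >= 1.0 collapse.
```

Evaluating this index for every member after an inelastic time history analysis, as the study does, gives a damage map that can be compared against pushover- and SLaMA-based capacities.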

Keywords: peak ground acceleration caused to fail, reinforced concrete moment-frame buildings, seismic vulnerability analysis, simple lateral mechanisms analysis

Procedia PDF Downloads 93
27016 Effect of Aryl Imidazolium Ionic Liquids as Asphaltene Dispersants

Authors: Raghda Ahmed El-Nagar

Abstract:

Oil spills are one of the most serious environmental issues that occur during the production and transportation of petroleum crude oil. Chemical asphaltene dispersants are hazardous to the marine environment, so ionic liquids (ILs) as asphaltene dispersants are a critical area of study. In this work, different aryl imidazolium ionic liquids were synthesized in high yield and characterized via analytical tools (elemental analysis, FT-IR, and 1H-NMR). Thermogravimetric analysis confirmed that the prepared ILs possess high thermal stability. The critical micelle concentration (CMC), surface tension, and emulsification index were investigated. The synthesized ILs were evaluated as asphaltene dispersants at various concentrations, and the data reveal high dispersion efficiency.

Keywords: ionic liquids, oil spill, asphaltene dispersants, CMC, efficiency

Procedia PDF Downloads 194
27015 Optimization of Element Type for FE Model and Verification of Analyses with Physical Tests

Authors: Mustafa Tufekci, Caner Guven

Abstract:

In the automotive industry, sliding door systems, which are also used as body closures, are safety members. Extreme product tests are carried out to prevent failures during the design process, but performing these tests experimentally results in high costs. Finite element analysis is an effective tool in the design process. These analyses are used before production of a prototype to validate the design against customer requirements, so a substantial amount of time and cost is saved. A finite element model is created for geometries designed in 3D CAD programs. Different element types, such as bar, shell, and solid, can be used to create the mesh model. A cheaper model can be created through the selection of element type, but the combination of element types used in the model, the number and geometry of the elements, and the degrees of freedom all affect the analysis results. A sliding door system is a good example for applying these methods in this study. Structural analysis of the sliding door mechanism was carried out using FE models, and physical tests with the same boundary conditions as the FE models were also performed. The element types were compared with regard to the test and analysis results, and the optimum combination was identified.

Keywords: finite element analysis, sliding door mechanism, element type, structural analysis

Procedia PDF Downloads 329
27014 The Safety Related Functions of the Engineered Barriers of the IAEA Borehole Disposal System: The Ghana Pilot Project

Authors: Paul Essel, Eric T. Glover, Gustav Gbeddy, Yaw Adjei-Kyereme, Abdallah M. A. Dawood, Evans M. Ameho, Emmanuel A. Aberikae

Abstract:

Radioactive materials, mainly in the form of sealed radioactive sources, are being used in various sectors (medicine, agriculture, industry, research, and teaching) for the socio-economic development of Ghana. The use of these beneficial radioactive materials has resulted in an inventory of Disused Sealed Radioactive Sources (DSRS) in storage. Most of the DSRS are legacy/historic sources which cannot be returned to their manufacturer or country of origin. Though small in volume, DSRS can be intensely radioactive and create a significant safety and security liability. They need to be managed in a safe and secure manner in accordance with the fundamental safety objective. The Radioactive Waste Management Center (RWMC) of the Ghana Atomic Energy Commission (GAEC) is currently storing a significant volume of DSRS. The initial activities of the DSRS range from 7.4E+5 Bq to 6.85E+14 Bq. If not managed properly, such DSRS can represent a potential hazard to human health and the environment. Storage is an important interim step, especially for DSRS containing very short-lived radionuclides, which can decay to exemption levels within a few years. Long-term storage, however, is considered an unsustainable option for DSRS with long half-lives, hence the need for a disposal facility. The GAEC intends to use the International Atomic Energy Agency's (IAEA's) Borehole Disposal System (BDS) to provide a safe, secure, and cost-effective option for disposing of its DSRS in storage. The proposed site for implementation of the BDS is on the GAEC premises at Kwabenya. The site has been characterized to gain a general understanding of its regional setting, its past evolution, and its likely future natural evolution over the assessment time frame.
Because of the long half-lives of some of the radionuclides to be disposed of (e.g., Ra-226, with a half-life of 1600 years), the engineered barriers of the system must be robust enough to contain these radionuclides for this long period before they decay to harmless levels. There is therefore a need to assess the safety related functions of the engineered barriers of this disposal system.
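The containment period implied by radioactive decay can be estimated from the half-life; a sketch under assumed values (the Ra-226 half-life of 1600 years is from the abstract, while the initial activity and the 1E+4 Bq exemption level are illustrative assumptions):

```python
import math

# Time for a source's activity to decay from A0 to an exemption level A_ex:
#   t = t_half * log2(A0 / A_ex)
def decay_time_years(a0_bq, a_ex_bq, half_life_years):
    return half_life_years * math.log2(a0_bq / a_ex_bq)

# Ra-226 example: assumed initial activity 3.7E+9 Bq decaying to an
# assumed exemption level of 1E+4 Bq.
t_ra226 = decay_time_years(3.7e9, 1.0e4, 1600.0)
```

Timescales of this order (tens of thousands of years for Ra-226 under these assumptions) are why the engineered barriers, rather than institutional control, must provide the long-term containment.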

Keywords: radionuclides, disposal, radioactive waste, engineered barrier

Procedia PDF Downloads 82
27013 Flashover Detection Algorithm Based on Mother Function

Authors: John A. Morales, Guillermo Guidi, B. M. Keune

Abstract:

Electric power supply is a crucial topic for economic and social development. Outage statistics show that atmospheric discharges are a major cause of power outages. In this context, it is necessary to correctly detect when overhead line insulators are faulted. In this paper, an algorithm is proposed to detect whether or not a lightning stroke generates a permanent fault on insulator strings. Lightning stroke simulations developed using the Alternative Transients Program are used. Based on these insights, a novel approach is designed that depends on the analysis of mother functions corresponding to the given variance-covariance matrix. Signals registered at the insulator string are projected onto the corresponding axes by means of Principal Component Analysis. By exploiting these new axes, it is possible to determine a flashover characteristic zone useful for good insulation design. The proposed methodology for flashover detection extends the existing approaches for the analysis and study of lightning performance on transmission lines.

Keywords: mother function, outages, lightning, sensitivity analysis

Procedia PDF Downloads 587
27012 Application of Artificial Neural Network in Assessing Fill Slope Stability

Authors: An-Jui Li, Kelvin Lim, Chien-Kuo Chiu, Benson Hsiung

Abstract:

This paper details the utilization of artificial intelligence (AI) in the field of slope stability, whereby quick and convenient solutions can be obtained using the developed tool. The AI tool used in this study is the artificial neural network (ANN), while the slope stability analysis methods are the finite element limit analysis methods. The developed tool allows for the prompt prediction of the safety factors of fill slopes and their corresponding probability of failure (depending on the degree of variation of the soil parameters), which can give the practicing engineer a reasonable basis for decision making. In fact, the successful use of the Extreme Learning Machine (ELM) algorithm shows that slope stability analysis is no longer confined to the conventional methods of modeling, which at times may be tedious and repetitive during the preliminary design stage, where the focus is more on cost-saving options than on detailed design. Therefore, similar ANN-based tools can be further developed to assist engineers in this aspect.
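The Extreme Learning Machine mentioned above trains only the output layer, in closed form; a minimal sketch on a synthetic stand-in for the safety-factor mapping (the inputs, target function, and network size are all invented):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy regression target standing in for a safety-factor mapping; inputs
# could represent normalized slope angle, friction angle, and cohesion.
X = rng.uniform(-1, 1, size=(400, 3))
y = 1.5 + 0.8 * X[:, 0] - 0.5 * X[:, 1] * X[:, 2]

# ELM: a random, untrained hidden layer plus least-squares output weights.
n_hidden = 50
W = rng.normal(size=(3, n_hidden))            # fixed random input weights
b = rng.normal(size=n_hidden)                 # fixed random biases
H = np.tanh(X @ W + b)                        # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights, closed form

y_hat = np.tanh(X @ W + b) @ beta
rmse = float(np.sqrt(np.mean((y_hat - y) ** 2)))
```

Because no iterative backpropagation is needed, training is a single linear solve, which is what makes ELM attractive for the quick preliminary-design predictions the abstract describes.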

Keywords: landslide, limit analysis, artificial neural network, soil properties

Procedia PDF Downloads 207
27011 Advancement of Computer Science Research in Nigeria: A Bibliometric Analysis of the Past Three Decades

Authors: Temidayo O. Omotehinwa, David O. Oyewola, Friday J. Agbo

Abstract:

This study aims to provide a proper perspective on the development landscape of Computer Science research in Nigeria. To this end, a bibliometric analysis of 4,333 bibliographic records of Computer Science research in Nigeria over the last 31 years (1991-2021) was carried out. The bibliographic data were extracted from the Scopus database and analyzed using VOSviewer and the bibliometrix R package through the biblioshiny web interface. The findings of this study reveal that Computer Science research in Nigeria has an annual growth rate of 24.19%. The most developed and well-studied research areas in the Computer Science field in Nigeria are machine learning, data mining, and deep learning. The social structure analysis revealed a need for improved international collaboration; the sparsely established collaborations are largely influenced by geographic proximity. The funding analysis showed that Computer Science research in Nigeria is under-funded. The findings of this study will be useful for researchers conducting Computer Science-related research. Experts can gain insights into how to develop a strategic framework that will advance the field in a more impactful manner. Government agencies and policymakers can also utilize the outcome of this research to develop strategies for improved funding of Computer Science research.
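The annual growth rate reported by bibliometrix is a compound rate between the first and last years' publication counts; a sketch with invented counts (not the data behind the study's 24.19% figure):

```python
# bibliometrix-style annual growth rate:
#   ((N_last / N_first) ** (1 / (years - 1)) - 1) * 100
# The counts below are illustrative, not the study's data.
n_first, n_last = 2, 950      # papers published in the first and last year
years = 31                    # e.g., 1991-2021 inclusive

growth_rate = ((n_last / n_first) ** (1.0 / (years - 1)) - 1.0) * 100.0
```

A compound rate of this form is sensitive to the first year's count, so a small early corpus (common for a national bibliography) can yield a high percentage even with modest absolute output.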

Keywords: bibliometric analysis, biblioshiny, computer science, Nigeria, science mapping

Procedia PDF Downloads 112
27010 Experiment and Analytical Study on Fire Resistance Performance of Slot Type Concrete-Filled Tube

Authors: Bum Yean Cho, Heung-Youl Kim, Ki-Seok Kwon, Kang-Su Kim

Abstract:

In this study, a full-scale test and a numerical analysis of the fire resistance performance of a bare CFT column were conducted, in which a slot was used instead of the conventional welding method to connect the steel pipe of the concrete-filled tube. The welded CFT column is known to be vulnerable to high or low temperatures because of the brittleness of the welded part. In the fire resistance test of the slot CFT column, in which the welded part was removed and the pipe was fixed by a slot folded into the tube, the slot-type CFT column showed fire resistance performance improved by 28% or more over the welded CFT column. Finite element analysis of the slot-type column using ABAQUS confirmed the reliability of the test results in predicting the fire behavior and fire resistance time.

Keywords: CFT (concrete-filled tube) column, fire resistance performance, slot, weld

Procedia PDF Downloads 184
27009 Decision Support Tool for Green Roofs Selection: A Multicriteria Analysis

Authors: I. Teotónio, C.O. Cruz, C.M. Silva, M. Manso

Abstract:

Diverse stakeholders show different concerns when choosing green roof systems, and green roof solutions vary in their cost and performance. Therefore, decision-makers continually face the difficult task of balancing benefits against green roof costs. Decision analysis methods, such as multicriteria analysis, can be used when the decision-making process includes different perspectives, multiple objectives, and uncertainty. The present study adopts a multicriteria decision model to evaluate the installation of green roofs in buildings, determining the solution with the best trade-off between costs and benefits in agreement with the preferences of the users/investors. This methodology was applied to a real decision problem, assessing the preferences between different green roof systems for an existing building in Lisbon. This approach supports the decision-making process on green roofs and enables robust and informed decisions in urban planning while optimizing building retrofitting.
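One simple multicriteria scheme consistent with the trade-off described above is a weighted sum of criterion scores; the roof systems, scores, and weights below are illustrative assumptions, not the study's model:

```python
import numpy as np

# Hypothetical scores (0-1, higher is better) for three green roof systems
# against four criteria; weights reflect assumed stakeholder preferences.
criteria = ["cost", "thermal", "stormwater", "maintenance"]
weights = np.array([0.4, 0.25, 0.2, 0.15])   # must sum to 1
scores = np.array([
    [0.9, 0.3, 0.4, 0.8],   # extensive (shallow substrate) roof
    [0.5, 0.6, 0.7, 0.5],   # semi-intensive roof
    [0.2, 0.9, 0.9, 0.3],   # intensive (deep substrate) roof
])

totals = scores @ weights                     # aggregate score per system
best = ["extensive", "semi-intensive", "intensive"][int(np.argmax(totals))]
```

Changing the weight vector to match a different stakeholder (e.g., an investor weighting cost more heavily than an occupant) can change which system ranks first, which is exactly why the study elicits user/investor preferences.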

Keywords: decision making, green roofs, investors preferences, multicriteria analysis, sustainable development

Procedia PDF Downloads 184
27008 A Study of the Relationship between Time Management Behaviour and Job Satisfaction of Higher Education Institutes in India

Authors: Sania K. Rao, Feza T. Azmi

Abstract:

The purpose of the present study is to explore the relationship between the time management behaviour and job satisfaction of academicians at higher education institutes in India. The analyses were carried out with AMOS (version 20.0), and Confirmatory Factor Analysis (CFA) and Structural Equation Modelling (SEM) were conducted. The factor analysis and findings show that perceived control of time serves as a partial mediator with a significant and positive influence on job satisfaction. Finally, a number of suggestions for improving one’s time management behaviour are provided.

Keywords: time management behaviour, job satisfaction, higher education, India, mediation analysis

Procedia PDF Downloads 389
27007 Predicting Wealth Status of Households Using Ensemble Machine Learning Algorithms

Authors: Habtamu Ayenew Asegie

Abstract:

Wealth, as opposed to income or consumption, implies a more stable and permanent status. Due to natural and human-made difficulties, households' economies can be diminished and their well-being can fall into trouble. Hence, governments and humanitarian agencies devote considerable resources to poverty and malnutrition reduction efforts. One key factor in the effectiveness of such efforts is the accuracy with which low-income or poor populations can be identified. As a result, this study aims to predict a household’s wealth status using ensemble machine learning (ML) algorithms. Design science research methodology (DSRM) is employed, and four ML algorithms, Random Forest (RF), Adaptive Boosting (AdaBoost), Light Gradient Boosting Machine (LightGBM), and Extreme Gradient Boosting (XGBoost), have been used to train models. The Ethiopian Demographic and Health Survey (EDHS) dataset was accessed for this purpose from the Central Statistical Agency (CSA) database. Various data pre-processing techniques were employed, and model training was conducted using scikit-learn Python library functions. Model evaluation used metrics such as accuracy, precision, recall, F1-score, the area under the receiver operating characteristic curve (AUC-ROC), and subjective evaluations by domain experts. An optimal subset of hyper-parameters for each algorithm was selected through grid search for the best prediction. The RF model performed better than the rest of the algorithms, achieving an accuracy of 96.06%, and is better suited as a solution model for our purpose. Following RF, the LightGBM, XGBoost, and AdaBoost algorithms achieved accuracies of 91.53%, 88.44%, and 58.55%, respectively.
The findings suggest that features such as ‘Age of household head’, ‘Total children ever born’ in a family, ‘Main roof material’ of their house, the ‘Region’ they live in, whether a household uses ‘Electricity’ or not, and the ‘Type of toilet facility’ of a household are determinant factors that should be a focal point for economic policymakers. The determinant risk factors, extracted rules, and designed artifact achieved 82.28% in the domain experts’ evaluation. Overall, the study shows that ML techniques are effective in predicting the wealth status of households.
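A hedged sketch of one step described above, a Random Forest tuned by grid search, on synthetic stand-ins for the EDHS variables; the feature names, labels, and parameter grid are assumptions, not the study's configuration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(3)
n = 600
# Hypothetical survey features: head_age, children_ever_born,
# has_electricity (0/1), roof_quality (0-2).
X = np.column_stack([
    rng.integers(18, 80, n),
    rng.integers(0, 10, n),
    rng.integers(0, 2, n),
    rng.integers(0, 3, n),
])
# Synthetic label: "non-poor" when electricity, a better roof, and fewer children.
y = ((X[:, 2] == 1) & (X[:, 3] >= 1) & (X[:, 1] <= 5)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [None, 5]},
    cv=3,
)
grid.fit(X_tr, y_tr)
accuracy = grid.score(X_te, y_te)   # held-out accuracy of the best model
```

In practice, precision, recall, F1, and AUC-ROC would be computed alongside accuracy (e.g., via `sklearn.metrics`), since wealth classes are typically imbalanced.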

Keywords: ensemble machine learning, households wealth status, predictive model, wealth status prediction

Procedia PDF Downloads 38
27006 Boundary Conditions for 2D Site Response Analysis in OpenSees

Authors: M. Eskandarighadi, C. R. McGann

Abstract:

It has been observed from past earthquakes that local site conditions can significantly affect strong ground motion characteristics such as the frequency content, amplitude, and duration of seismic waves. The most common method for investigating site response is one-dimensional seismic site response analysis. An infinite horizontal extent of the model and homogeneous soil characteristics are crucial assumptions of this method. One boundary condition that can be used at the sides is tying the sides horizontally, which suits vertical 1D wave propagation. However, 1D analysis cannot account for the 2D nature of wave propagation where the soil profile is not fully horizontal or is heterogeneous within layers. Therefore, 2D seismic site response analysis can be used to overcome these limitations and gain a better understanding of local site conditions. Different types of boundary conditions can be applied in 2D site response models, such as the tied boundary condition, massive columns, and the free-field boundary condition. The tied boundary condition has been used in 1D analysis and is useful for 1D wave propagation. Employing two massive columns at the sides is another approach to capturing the 2D nature of wave propagation. The free-field boundary condition can simulate the free-field motion that would exist far from the domain of interest; its goal is to minimize unwanted reflections from the sides. This research focuses on the comparison between these methods, with examples, and discusses the details and limitations of each of these boundary conditions.

Keywords: boundary condition, free-field, massive columns, OpenSees, site response analysis, wave propagation

Procedia PDF Downloads 183
27005 A Lean Manufacturing Profile of Practices in the Metallurgical Industry: A Methodology for Multivariate Analysis

Authors: M. Jonathan D. Morales, R. Ramón Silva

Abstract:

The purpose of this project is to analyze and determine the profile of actual lean manufacturing practices in the metallurgical industry of the Metropolitan Area of Bucaramanga. Through the analysis of qualitative and quantitative variables, it was possible to establish how these manufacturers develop production practices that ensure their competitiveness and productivity in the market. The study used a random sample of metallurgic and wrought-iron companies, to which a quantitative analysis was applied in order to formulate a methodology for measuring the level of lean manufacturing adoption in the industry. A qualitative evaluation was also carried out through multivariate analysis using the Numerical Taxonomy System (NTSYS) program, which allows lean manufacturing profiles to be determined. The results show how companies in the sector perform with respect to lean manufacturing practices and identify the level of management attention these companies give to the topic. In addition, it was possible to ascertain that no single profile dominates the sector when it comes to lean manufacturing. The companies in the metallurgic and wrought-iron industry show low levels of lean manufacturing implementation: each carries out diverse but insufficient actions, without a consolidated sectoral strategy for developing the competitive advantage that a coherent production strategy would provide.
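The kind of grouping a numerical-taxonomy package performs can be sketched as single-linkage agglomerative clustering of companies by their lean-practice scores. This is a conceptual illustration under assumed data, not the NTSYS procedure itself: the company labels and scores below are invented, and NTSYS operates on full multivariate similarity matrices rather than a single score.

```python
# Single-linkage agglomerative clustering of companies by a (hypothetical)
# aggregate lean-practice score, merging the closest clusters until the
# requested number of profiles remains.
def single_linkage(points, n_clusters):
    clusters = [[name] for name in points]
    def dist(a, b):  # single linkage: distance between the closest members
        return min(abs(points[x] - points[y]) for x in a for y in b)
    while len(clusters) > n_clusters:
        # find and merge the two closest clusters
        pairs = [(dist(clusters[i], clusters[j]), i, j)
                 for i in range(len(clusters)) for j in range(i + 1, len(clusters))]
        _, i, j = min(pairs)
        clusters[i] += clusters.pop(j)
    return [sorted(c) for c in clusters]

# hypothetical aggregate lean-practice scores (0-100) per company
scores = {"A": 22, "B": 25, "C": 24, "D": 71, "E": 68}
print(single_linkage(scores, 2))  # a low-implementation and a higher-implementation group
```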

Keywords: production line management, metallurgic industry, lean manufacturing, productivity

Procedia PDF Downloads 459
27004 On the Estimation of Crime Rate in the Southwest of Nigeria: Principal Component Analysis Approach

Authors: Kayode Balogun, Femi Ayoola

Abstract:

Crime is at an alarming rate in this part of the world, and many factors contribute to this antisocial behaviour among both the young and the old. In this work, principal component analysis (PCA) was used as a tool to reduce the dimensionality of the data and to identify the variables most associated with crime in the study region while retaining as much of the information as possible. Data on twenty-eight crime variables covering a period of fifteen years were collected from the National Bureau of Statistics (NBS) databank. We use PCA to determine the number of principal variables and the main contributors to crime in Southwest Nigeria. The results of our analysis revealed that eight principal components were retained, based on the scree plot and loading plot, implying that an eight-component solution is appropriate for the data. The eight components explained 93.81% of the total variation in the data set. We also found that the most frequently committed crimes in Southwestern Nigeria were assault, grievous harm and wounding, theft/stealing, burglary, house breaking, false pretence, unlawful arms possession, and breach of public peace.
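The "explained variance" figure behind the retained components can be illustrated with the simplest possible case. The sketch below computes, for two variables, the share of total variance captured by the first principal component, using the closed-form eigenvalues of a 2×2 covariance matrix; the crime-count series are made up for illustration and are not the NBS data.

```python
# Share of total variance explained by PC1 for two variables, using the
# closed-form eigenvalues of the 2x2 sample covariance matrix.
import math

def explained_by_pc1(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # sample covariance matrix [[sxx, sxy], [sxy, syy]]
    sxx = sum((a - mx) ** 2 for a in x) / (n - 1)
    syy = sum((b - my) ** 2 for b in y) / (n - 1)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    # eigenvalues of a symmetric 2x2 matrix
    mean_diag = (sxx + syy) / 2
    radius = math.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
    lam1, lam2 = mean_diag + radius, mean_diag - radius
    return lam1 / (lam1 + lam2)  # proportion of total variance on PC1

# two perfectly correlated "crime count" series: PC1 explains everything
theft = [10, 20, 30, 40]
burglary = [21, 41, 61, 81]  # = 2 * theft + 1
print(round(explained_by_pc1(theft, burglary), 4))  # 1.0
```

With twenty-eight variables the same quantity is computed from the eigenvalues of a 28×28 covariance matrix; the 93.81% reported in the study is the cumulative version of this ratio over the first eight components.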

Keywords: crime rates, data, Southwest Nigeria, principal component analysis, variables

Procedia PDF Downloads 444
27003 Morphological Analysis of Manipuri Language: Wahei-Neinarol

Authors: Y. Bablu Singh, B. S. Purkayashtha, Chungkham Yashawanta Singh

Abstract:

Morphological analysis forms the basic foundation of NLP applications, including syntax parsing, machine translation (MT), information retrieval (IR), and automatic indexing, in all languages. It is a field of linguistics that can provide valuable information for computer-based linguistic tasks such as lemmatization and the study of the internal structure of words. Computational morphology, the application of morphological rules in computational linguistics, is an emerging area of AI that studies the structure of words formed by combining smaller units of linguistic information called morphemes: the building blocks of words. Morphological analysis provides information about a word's semantic and syntactic role in a sentence. The analyzer presented here processes Manipuri word forms and produces the grammatical information associated with them. The morphological analyzer for Manipuri has been tested on 3500 Manipuri words in Shakti Standard Format (SSF), using Meitei Mayek as the source script; an accuracy of 80% was obtained on a manual check.
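The core operation of such an analyzer, segmenting a word form into a root plus suffix morphemes, can be sketched as a greedy suffix-stripping loop. The lexicon, suffix inventory, and glosses below are invented placeholders, not real Manipuri or Meitei Mayek data; a full analyzer also needs morphophonemic rules and a much larger lexicon.

```python
# Toy suffix-stripping morphological analyzer: strip known suffixes until a
# known root remains. ROOTS and SUFFIXES are hypothetical, for illustration.
ROOTS = {"cha", "tum"}                                # hypothetical verb roots
SUFFIXES = {"ri": "PROG", "le": "PERF", "ge": "OPT"}  # hypothetical suffix glosses

def analyze(word):
    """Return (root, [glosses]) by greedily stripping known suffixes."""
    glosses = []
    while word not in ROOTS:
        for suf, gloss in SUFFIXES.items():
            if word.endswith(suf) and len(word) > len(suf):
                word, glosses = word[: -len(suf)], [gloss] + glosses
                break
        else:
            return None  # no analysis found
    return word, glosses

print(analyze("chari"))    # root plus one suffix gloss
print(analyze("tumlege"))  # stacked suffixes, innermost first
```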

Keywords: morphological analysis, machine translation, computational morphology, information retrieval, SSF

Procedia PDF Downloads 326
27002 Artificial Intelligence Assisted Sentiment Analysis of Hotel Reviews Using Topic Modeling

Authors: Sushma Ghogale

Abstract:

With the surge of user-generated content, feedback, and reviews on the internet, it has become both possible and important to learn consumers' opinions about products and services. These data matter both to potential customers and to the businesses providing the services. Data from social media are attracting significant attention, as social media has become the most prominent channel for expressing unregulated opinion. Prospective customers look for reviews from experienced customers before deciding to buy a product or service, and several websites provide a platform for users to post such feedback. However, the biggest challenge in analyzing these data lies in extracting latent features and providing term-level analysis. This paper proposes an approach that uses topic modeling to classify reviews into topics and sentiment analysis to mine the opinions they express. The approach identifies the latent topics mentioned by reviewers on business sites, review sites, or social media, ranks the importance of each topic, and then assesses the satisfaction level expressed about each topic through sentiment analysis. Hotel reviews are classified using multiple machine learning techniques, and different classifiers are compared. The experiment concludes that the Multinomial Naïve Bayes classifier produces higher accuracy than the other classifiers.
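The winning classifier can be written from scratch in a few lines, which helps clarify the final classification step. The sketch below is a minimal Multinomial Naïve Bayes with Laplace smoothing; the tiny hotel-review corpus is invented for illustration and is unrelated to the study's dataset.

```python
# Minimal Multinomial Naive Bayes sentiment classifier with Laplace smoothing,
# trained on a tiny invented hotel-review corpus.
import math
from collections import Counter

def train(docs):
    """docs: list of (text, label). Returns per-class word counts, doc counts, vocab."""
    counts = {}          # label -> Counter of word frequencies
    priors = Counter()   # label -> number of documents
    for text, label in docs:
        priors[label] += 1
        counts.setdefault(label, Counter()).update(text.split())
    vocab = {w for c in counts.values() for w in c}
    return counts, priors, vocab

def predict(text, counts, priors, vocab):
    n_docs = sum(priors.values())
    best, best_score = None, -math.inf
    for label, wc in counts.items():
        total = sum(wc.values())
        score = math.log(priors[label] / n_docs)  # log prior
        for w in text.split():
            if w in vocab:  # Laplace-smoothed log-likelihood of each word
                score += math.log((wc[w] + 1) / (total + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

docs = [("great room clean staff", "positive"), ("lovely clean pool", "positive"),
        ("dirty room rude staff", "negative"), ("noisy dirty lobby", "negative")]
model = train(docs)
print(predict("clean room", *model))        # positive
print(predict("dirty noisy room", *model))  # negative
```

In the paper's pipeline, the documents fed to such a classifier would first be grouped by latent topic, so satisfaction can be reported per topic rather than per review.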

Keywords: latent Dirichlet allocation, topic modeling, text classification, sentiment analysis

Procedia PDF Downloads 97
27001 Text Analysis to Support Structuring and Modelling a Public Policy Problem-Outline of an Algorithm to Extract Inferences from Textual Data

Authors: Claudia Ehrentraut, Osama Ibrahim, Hercules Dalianis

Abstract:

Policy-making situations are real-world problems that exhibit complexity in that they are composed of many interrelated problems and issues. To be effective, policies must holistically address the complexity of the situation rather than propose solutions to single problems. Formulating and understanding the situation and its complex dynamics is therefore key to finding holistic solutions. Analysis of text-based information on the policy problem, using natural language processing (NLP) and text analysis techniques, can support modelling of public policy problem situations in a more objective way, grounded in domain experts' knowledge and scientific evidence. The objective of this study is to support the modelling of public policy problem situations through text analysis of verbal descriptions of the problem. We propose a formal methodology for analyzing qualitative data from multiple information sources on a policy problem in order to construct a causal diagram of the problem. The analysis process aims at identifying key variables, linking them by cause-effect relationships, and mapping that structure into a graphical representation that is adequate for designing action alternatives, i.e., policy options. This study outlines an algorithm that automates the initial step of a larger methodological approach, which has so far been done manually. In this initial step, inferences about key variables and their interrelationships are extracted from textual data to support better problem structuring. A small prototype of this step is also presented.
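The extraction step can be sketched as pattern matching over causal cue phrases, turning sentences into directed cause-effect edges of a causal diagram. This is a simplified illustration of the general idea, not the authors' algorithm: the cue-phrase patterns and example sentences are assumptions made here, and a real pipeline would use NLP parsing rather than raw regular expressions.

```python
# Sketch: extract (cause, effect) edges from sentences via causal cue phrases.
import re

PATTERNS = [
    re.compile(r"(.+?) leads to (.+)"),
    re.compile(r"(.+?) causes (.+)"),
    re.compile(r"(.+?) results in (.+)"),
]

def extract_edges(sentences):
    edges = []
    for s in sentences:
        for pat in PATTERNS:
            m = pat.search(s.lower().rstrip("."))
            if m:  # first matching cue phrase wins
                edges.append((m.group(1).strip(), m.group(2).strip()))
                break
    return edges

text = ["Unemployment leads to poverty.",
        "Poverty causes poor health outcomes.",
        "Underfunding of schools results in unemployment."]
print(extract_edges(text))  # directed edges for a causal diagram
```

The resulting edge list is exactly the input a graph layout or system-dynamics tool needs to render the causal diagram the methodology calls for.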

Keywords: public policy, problem structuring, qualitative analysis, natural language processing, algorithm, inference extraction

Procedia PDF Downloads 589
27000 Error Analysis of English Inflection among Thai University Students

Authors: Suwaree Yordchim, Toby J. Gibbs

Abstract:

The linguistic competence of Thai university students majoring in Business English was examined with respect to their knowledge of English inflection and various other linguistic elements. Error analysis was applied to the test results. Error levels in inflection, tense, and other linguistic elements were shown to be significantly high for all noun, verb, and adjective inflections. The findings suggest that students fail to gain linguistic competence in their use of English inflection because of interlanguage interference. Implications for curriculum reform and the treatment of errors in the classroom are discussed.

Keywords: interlanguage, error analysis, inflection, second language acquisition, Thai students

Procedia PDF Downloads 466
26999 A Comparative Analysis of Islamic Bank Efficiency in the United Kingdom and Indonesia during the Eurozone Crisis Using Data Envelopment Analysis

Authors: Nisful Laila, Fatin Fadhilah Hasib, Puji Sucia Sukmaningrum, Achsania Hendratmi

Abstract:

The purpose of this study is to determine and compare the efficiency of Islamic banks in Indonesia and the United Kingdom during the Eurozone sovereign debt crisis. The study uses a quantitative non-parametric approach, Data Envelopment Analysis (DEA) under the variable-returns-to-scale (VRS) assumption, together with the Mann-Whitney U-test as a statistical tool. The sample comprises 11 Islamic banks in Indonesia and 4 Islamic banks in the United Kingdom. The research adopts the intermediation approach: the input variables are total deposits, assets, and labour costs, and the output variables are financing and profit/loss. The results show that the efficiency of Islamic banks in Indonesia and the United Kingdom varied and fluctuated during the observation period, with no significant difference in efficiency performance between the two countries.
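The Mann-Whitney U statistic used to compare the two groups of efficiency scores is simple enough to compute by hand. The sketch below implements it from scratch with the mid-rank method for ties; the efficiency scores are illustrative placeholders, not the study's data.

```python
# Mann-Whitney U statistic (mid-rank method for ties), computed from scratch.
def mann_whitney_u(sample1, sample2):
    combined = sorted(sample1 + sample2)
    def rank(v):  # average (mid) rank of value v in the pooled sample
        first = combined.index(v) + 1
        return first + (combined.count(v) - 1) / 2
    r1 = sum(rank(v) for v in sample1)
    n1, n2 = len(sample1), len(sample2)
    u1 = r1 - n1 * (n1 + 1) / 2
    return min(u1, n1 * n2 - u1)  # report the smaller U

indonesia = [0.81, 0.77, 0.90]  # hypothetical efficiency scores
uk = [0.95, 0.98, 0.93]
print(mann_whitney_u(indonesia, uk))  # 0.0: the two groups do not overlap at all
```

In the actual analysis the samples are DEA efficiency scores per bank-period; a U near n1*n2/2, as the study's non-significant result implies, indicates heavily overlapping distributions.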

Keywords: data envelopment analysis, efficiency, Eurozone crisis, Islamic bank

Procedia PDF Downloads 326
26998 Efficient Wind Fragility Analysis of Concrete Chimney under Stochastic Extreme Wind Incorporating Temperature Effects

Authors: Soumya Bhattacharjya, Avinandan Sahoo, Gaurav Datta

Abstract:

Wind fragility analysis of chimneys is often carried out disregarding temperature effects. However, the combination of wind and temperature is the most critical limit state for chimney design. Hence, the present paper explores an efficient fragility analysis for concrete chimneys under combined wind and temperature effects. Wind time histories are generated from Davenport's power spectral density function using the weighted amplitude wave superposition technique. Fragility analysis is often carried out in a full Monte Carlo simulation framework, which requires extensive computational time. Thus, in the present paper, an efficient adaptive metamodelling technique is adopted to judiciously approximate the limit state function, which is subsequently used in the simulation framework; this saves substantial computational time and makes the approach computationally efficient. Uncertainty in wind speed, wind-load-related parameters, and resistance-related parameters is considered. The results of the full simulation approach, the conventional metamodelling approach, and the proposed adaptive metamodelling approach are compared, and the effect of disregarding temperature in wind fragility analysis is highlighted.
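The baseline the metamodel is meant to accelerate, a brute-force Monte Carlo estimate of the failure probability, can be sketched in a few lines. The normal distributions for resistance and demand below are illustrative placeholders (they are not the paper's wind/temperature model, where evaluating the demand requires a full structural analysis per sample, hence the computational cost).

```python
# Bare-bones Monte Carlo fragility estimate: sample resistance R and demand S,
# count violations of the limit state g = R - S < 0. Distributions are
# placeholders for illustration.
import random

def fragility_mc(n_samples=100_000, seed=42):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        resistance = rng.gauss(5.0, 1.0)  # placeholder capacity
        demand = rng.gauss(3.0, 1.0)      # placeholder wind + temperature demand
        if resistance - demand < 0:       # limit state exceeded
            failures += 1
    return failures / n_samples

print(fragility_mc())  # close to the exact value Phi(-sqrt(2)), about 0.079
```

An adaptive metamodel replaces the expensive limit-state evaluation inside the loop with a cheap approximation that is refined only where it matters, near g = 0, which is where the speed-up comes from.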

Keywords: adaptive metamodelling technique, concrete chimney, fragility analysis, stochastic extreme wind load, temperature effect

Procedia PDF Downloads 214
26997 Modeling the Downstream Impacts of River Regulation on the Grand Lake Meadows Complex using Delft3D FM Suite

Authors: Jaime Leavitt, Katy Haralampides

Abstract:

Numerical modelling has been used to investigate the long-term impact of a large dam on downstream wetland areas, specifically in terms of changing sediment dynamics in the system. The Mactaquac Generating Station (MQGS) is a 672 MW run-of-the-river hydroelectric facility, commissioned in 1968 on the mainstem of the Wolastoq|Saint John River in New Brunswick, Canada. New Brunswick Power owns and operates the dam and has been working closely with the Canadian Rivers Institute at UNB Fredericton on a multi-year, multi-disciplinary project investigating the dam's impact on its surrounding environment. Focusing on the downstream river, this research discusses the initialization, set-up, calibration, and preliminary results of a 2D hydrodynamic model built with the Delft3D Flexible Mesh Suite (the successor of the Delft3D 4 Suite). The flexible mesh allows the model grid to be structured in the main channel and unstructured in the floodplains and other downstream regions with complex geometry; this combination of grid types improves computational efficiency and output quality. As the movement of water governs the movement of sediment, the calibrated and validated hydrodynamic model was applied to sediment transport simulations, particularly of the fine suspended sediments. Several provincially significant Protected Natural Areas and federally significant National Wildlife Areas are located 60 km downstream of the MQGS. These broad, low-lying floodplains and wetlands are known as the Grand Lake Meadows Complex (GLM Complex). There is added pressure to investigate the impacts of river regulation on these protected regions, which rely heavily on natural river processes such as sediment transport and flooding. It is hypothesized that the fine suspended sediment would naturally travel to the floodplains for nutrient deposition and replenishment, particularly during the freshet and large storms.
The purpose of this research is to investigate the impacts of river regulation on downstream environments and use the model as a tool for informed decision making to protect and maintain biologically productive wetlands and floodplains.
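The core physics of the suspended-sediment simulations, concentration carried downstream by flow and spread by mixing, can be illustrated with a highly simplified 1D sketch. This is a conceptual toy, not Delft3D: the velocity, diffusion coefficient, and grid values below are arbitrary illustrative choices, and the real model solves the 2D equations on the flexible mesh with settling and exchange with the bed.

```python
# Toy 1D advection-diffusion of a suspended-sediment concentration pulse:
# explicit upwind advection plus central-difference diffusion.
def advect_sediment(c, velocity=1.0, diffusion=0.05, dx=1.0, dt=0.5, steps=40):
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            adv = -velocity * (c[i] - c[i - 1]) / dx            # upwind advection
            dif = diffusion * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
            new[i] = c[i] + dt * (adv + dif)
        c = new
    return c

# initial sediment pulse near the upstream (left) end of a 100-cell reach
conc = [1.0 if 10 <= i <= 14 else 0.0 for i in range(100)]
out = advect_sediment(conc)
peak = max(range(len(out)), key=lambda i: out[i])
print(peak)  # the pulse peak has moved roughly 20 cells downstream and spread out
```

Questions like "how much fine sediment still reaches the GLM Complex floodplains under regulated flows" amount to running this kind of transport computation, at full 2D fidelity, under pre- and post-dam flow scenarios.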

Keywords: hydrodynamic modelling, national wildlife area, protected natural area, sediment transport

Procedia PDF Downloads 6
26996 Traditional Chinese Medicine Treatment for Coronary Heart Disease: a Meta-Analysis

Authors: Yuxi Wang, Xuan Gao

Abstract:

Traditional Chinese medicine has been used to treat coronary heart disease (CHD) for centuries, and in recent years clinical-trial data on its efficacy have gradually accumulated, enabling exploration of its real efficacy and underlying pharmacology. However, because traditional Chinese medicine prescriptions are complex mixtures, the efficacy of each component is difficult to clarify, and pharmacological research is challenging. This study aims to systematically review and clarify the clinical efficacy of traditional Chinese medicine in the treatment of coronary heart disease through a meta-analysis. Searching PubMed, the CNKI database, Wanfang Data, and other databases, eleven randomized controlled trials with 1091 CHD subjects were included. Two researchers conducted a systematic review of the papers and performed a meta-analysis, which supports a positive therapeutic effect of traditional Chinese medicine in the treatment of CHD.
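The pooling computation at the heart of a meta-analysis can be sketched as fixed-effect inverse-variance weighting. The two effect sizes and standard errors below are invented for illustration; they are not the trial data, and the study may well have used a random-effects model instead, which adds a between-study variance term to the weights.

```python
# Fixed-effect (inverse-variance) pooling of study effect sizes.
import math

def pooled_effect(effects, std_errors):
    weights = [1 / se ** 2 for se in std_errors]  # inverse-variance weights
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1 / sum(weights))              # SE of the pooled estimate
    return est, se

# hypothetical effect sizes and standard errors from two trials
est, se = pooled_effect([0.5, 0.7], [0.1, 0.2])
print(round(est, 3), round(se, 3))  # 0.54 0.089
```

The precise study (se = 0.1) dominates the pooled estimate, which is the defining behaviour of inverse-variance weighting.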

Keywords: coronary heart disease, Chinese medicine, treatment, meta-analysis

Procedia PDF Downloads 123
26995 An Analysis of Telugu Proverbs in the Light of Endangerment

Authors: Esther, Queeny

Abstract:

The main goal of this paper is to reflect on the rich folklore of the Telugu people through their proverbs, which are assumed to be in a state of endangerment. In order to support the claim that Telugu proverbs are endangered, we had to delve deeper: we found barely two or three papers related to Telugu proverbs. So, although the process of sorting out the different proverbs in Telugu and translating them was wearying, we found it necessary to conduct a survey in the form of a questionnaire and draw conclusions so that we could bring this issue to readers' attention. We began with the basic assumption that the older generation has a wider knowledge of its folklore than the younger generation. The results obtained are quite remarkable and strengthened our assumption. Statistical analysis was adopted for the quantitative analysis. Through this paper, we hope to kindle cultural awareness among youngsters regarding the use of their mother tongue.

Keywords: sociolinguistics, Telugu proverbs, folklore, endangerment

Procedia PDF Downloads 208