Search results for: spectroscopy data analysis
41195 Damage Identification Using Experimental Modal Analysis
Authors: Niladri Sekhar Barma, Satish Dhandole
Abstract:
Damage identification in the context of safety has become a fundamental research area in mechanical, civil, and aerospace engineering structures. This research aims to identify damage in a mechanical beam structure, quantify the severity or extent of damage in terms of loss of stiffness, and obtain an updated analytical Finite Element (FE) model. An FE model is used for analysis, and the location of damage for single and multiple damage cases is identified numerically using the modal strain energy method and the mode shape curvature method. Experimental data were acquired with an accelerometer. A Fast Fourier Transform (FFT) algorithm is applied to the measured signal, and post-processing is then done in MEscopeVES software. The two sets of data, the numerical FE model and the experimental results, are compared to locate the damage accurately. The extent of the damage is identified via modal frequencies using a mixed numerical-experimental technique. Mode shapes are compared using the Modal Assurance Criterion (MAC). The analytical FE model is adjusted by the direct method of model updating. The same study has been extended to real-life structures such as plate and GARTEUR structures.
Keywords: damage identification, damage quantification, damage detection using modal analysis, structural damage identification
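The mode shape comparison step described above rests on the Modal Assurance Criterion, a standard scalar measure of correlation between an analytical and an experimental mode shape (a value near 1 indicates well-correlated shapes). As a rough illustration, a minimal NumPy sketch might look like the following; the beam mode shape vectors are invented for the example, not the paper's data:

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two real mode shape vectors.

    MAC = |phi_a^T phi_b|^2 / ((phi_a^T phi_a) * (phi_b^T phi_b))
    """
    num = np.abs(phi_a @ phi_b) ** 2
    den = (phi_a @ phi_a) * (phi_b @ phi_b)
    return num / den

# Hypothetical analytical (FE) and experimental first-mode shapes of a beam
phi_fe = np.array([0.00, 0.31, 0.59, 0.81, 0.95, 1.00])
phi_exp = np.array([0.00, 0.29, 0.61, 0.79, 0.97, 1.02])

print(round(mac(phi_fe, phi_exp), 4))
```

A MAC close to 1 between the FE and measured shapes supports pairing those modes before model updating; a low MAC flags a poorly correlated (possibly damage-affected) mode.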
Procedia PDF Downloads 116
41194 Building Energy Modeling for Networks of Data Centers
Authors: Eric Kumar, Erica Cochran, Zhiang Zhang, Wei Liang, Ronak Mody
Abstract:
The objective of this article was to create a modelling framework that exposes the marginal costs of shifting workloads across geographically distributed data centers. Geographical distribution of internet services helps optimize their performance for localized end users through lower communication times and increased availability. However, due to geographical and temporal effects, the physical embodiments of a service's data center infrastructure can vary greatly. In this work, we first identify that the variance in physical infrastructure primarily stems from local weather conditions, specific user traffic profiles, energy sources, and the types of IT hardware available at the time of deployment. Second, we create a traffic simulator that indicates the IT load at each data center in the set as an approximation of user traffic profiles. Third, we implement a framework that quantifies global energy demands using building energy models and the traffic profiles. The model produces a time series of energy demands that can be used for further life cycle analysis of internet services.
Keywords: data centers, energy, life cycle, network simulation
Procedia PDF Downloads 147
41193 Time Series Modelling and Prediction of River Runoff: Case Study of Karkheh River, Iran
Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh
Abstract:
The rainfall-runoff phenomenon is a chaotic and complex outcome of nature that requires sophisticated modelling and simulation methods for explanation and use. Time series modelling allows runoff data analysis and can be used as a forecasting tool. In this paper, an attempt is made to model river runoff data and predict the future behavioural pattern of the river based on past annual observations of river runoff. The analysis and prediction of river runoff are carried out using an ARIMA model. Applicable statistical formulae are used to evaluate the efficiency of prediction for hydrological events such as rainfall and runoff. The good agreement between predicted and observed river runoff, measured by the coefficient of determination (R²), shows that ARIMA (4,1,1) is a suitable model for predicting Karkheh River runoff in Iran.
Keywords: time series modelling, ARIMA model, river runoff, Karkheh River, CLS method
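A library such as statsmodels would normally fit the full ARIMA (4,1,1) model; the core idea of the differencing-plus-autoregression approach described above can nonetheless be illustrated with a NumPy-only sketch of the simpler ARIMA(1,1,0) case. The runoff values below are synthetic, not Karkheh observations:

```python
import numpy as np

# Hypothetical annual runoff series (synthetic data, not Karkheh observations)
runoff = np.array([5.2, 6.1, 4.8, 7.3, 6.5, 5.9, 7.8, 6.2, 5.5, 6.9,
                   7.1, 6.4, 5.8, 7.5, 6.7], dtype=float)

# ARIMA(1,1,0) sketch: difference once (d=1), then fit AR(1) on the differences
diff = np.diff(runoff)                 # w_t = y_t - y_{t-1}
x, y = diff[:-1], diff[1:]             # regress w_t on w_{t-1}
phi = (x @ y) / (x @ x)                # least-squares AR(1) coefficient

# One-step-ahead forecast: predicted difference added back to the last level
next_diff = phi * diff[-1]
forecast = runoff[-1] + next_diff

# Coefficient of determination on the fitted differences
fitted = phi * x
r2 = 1 - np.sum((y - fitted) ** 2) / np.sum((y - np.mean(y)) ** 2)
print(forecast, r2)
```

The same difference/fit/integrate cycle generalizes to higher AR and MA orders; in practice one would let a dedicated library select and fit ARIMA (4,1,1) and report R² on the original runoff scale.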
Procedia PDF Downloads 341
41192 Effect of Demineralized Water Purity on the Corrosion Behavior of Steel Alloys
Authors: A. M. El-Aziz, M. Elsehamy, H. Hussein
Abstract:
Although steel and stainless steel have reasonable corrosion behavior in water, their corrosion resistance depends significantly on water purity. It was not expected that demineralized water would have an aggressive effect on steel alloys; in this study, the effect of water of different purities on steel X52 and stainless steel 316L was investigated. Weight loss and electrochemical measurements were employed to measure the corrosion behavior, and samples were examined microscopically after testing. It was observed that the higher the water purity, the more reactive it is. Comparative analysis of the potentiodynamic curves for different water purities showed that demineralized water (conductivity of 0.05 microsiemens per cm) is more aggressive than distilled water: the corrosion rates of stainless steel were 858 and 623 nm/y for demineralized and distilled water, respectively, while the corrosion rates of carbon steel X52 were estimated at about 4.8 and 3.6 µm/y. Open circuit potential (OCP) measurements recorded more positive potentials for stainless steel than for carbon steel in the different water purities. In general, stainless steel showed higher pitting resistance than the carbon steel alloy. The surface film was investigated by scanning electron microscopy (SEM) and analyzed by energy dispersive X-ray spectroscopy (EDX). This behavior was explained on the basis that demineralized and distilled water might be considered 'hungry water', which seeks equilibrium and will pull ions out of the surrounding metals to satisfy its 'hunger'.
Keywords: corrosion, demineralized water, distilled water, steel alloys
Procedia PDF Downloads 814
41191 A Narrative of Nationalism in Mainstream Media: The US, China, and COVID-19
Authors: Rachel Williams, Shiqi Yang
Abstract:
Our research explores the influence nationalism has had on media coverage of the COVID-19 pandemic as it relates to China in the United States, through a qualitative analysis of two US news networks, Fox News and CNN. In total, the transcripts of sixteen videos uploaded on YouTube, each with more than 100,000 views, were gathered for data processing. Co-occurrence networks generated by KH Coder illuminate the themes and narratives underpinning the reports from Fox News and CNN. The results of an in-depth keyword content analysis suggest that the pandemic has been framed in an ethnopopulist nationalist manner, although to varying degrees between networks. Specifically, the authors found that Fox News is more likely to report hypotheses or statements as fact, whereas CNN is more likely to quote data and statements from official institutions. Future research with more systematic and quantitative methods could examine how nationalist narratives have developed in China and in other US news coverage to expand on these findings.
Keywords: nationalism, media studies, US and China, COVID-19, social media, communication studies
Procedia PDF Downloads 58
41190 Automated Multisensory Data Collection System for Continuous Monitoring of Refrigerating Appliances Recycling Plants
Authors: Georgii Emelianov, Mikhail Polikarpov, Fabian Hübner, Jochen Deuse, Jochen Schiemann
Abstract:
Recycling refrigerating appliances plays a major role in protecting the Earth's atmosphere from ozone depletion and emissions of greenhouse gases. The performance of refrigerator recycling plants in terms of material retention is the subject of strict environmental certifications and is reviewed periodically through specialized audits. The continuous collection of refrigerator data required for input-output analysis is still mostly manual, error-prone, and not digitalized. In this paper, we propose an automated data collection system for recycling plants in order to deduce the expected material content of individual end-of-life refrigerating appliances. The system utilizes laser scanner measurements and optical data to extract attributes of individual refrigerators by applying transfer learning with pre-trained vision models and optical character recognition. Based on the recognized features, the system automatically provides material categories and target values of contained material masses, especially foaming and cooling agents. The presented data collection system paves the way for continuous performance monitoring and efficient control of refrigerator recycling plants.
Keywords: automation, data collection, performance monitoring, recycling, refrigerators
Procedia PDF Downloads 164
41189 STD-NMR Based Protein Engineering of the Unique Arylpropionate-Racemase AMDase G74C
Authors: Sarah Gaßmeyer, Nadine Hülsemann, Raphael Stoll, Kenji Miyamoto, Robert Kourist
Abstract:
Enzymatic racemization allows the smooth interconversion of stereocenters under very mild reaction conditions, and racemases find frequent application in deracemization and dynamic kinetic resolutions. Arylmalonate decarboxylase (AMDase) from Bordetella bronchiseptica has high structural similarity to amino acid racemases. These cofactor-free racemases are able to break chemically strong C-H bonds under mild conditions. The racemase-like catalytic machinery of mutant G74C gives it unique activity in the racemization of pharmacologically relevant derivatives of 2-phenylpropionic acid (profens), which makes AMDase G74C an interesting object for the mechanistic investigation of cofactor-independent racemases. Structure-guided protein engineering achieved a variant of this unique racemase with 40-fold increased activity in the racemization of several arylaliphatic carboxylic acids. Substrate binding during catalysis was investigated by saturation-transfer-difference NMR spectroscopy (STD-NMR). All atoms of the substrate showed interactions with the enzyme, and STD-NMR measurements revealed distinct nuclear Overhauser effects in experiments with and without molecular conversion. The spectroscopic analysis led to the identification of several amino acid residues whose variation increased the activity of G74C. While single amino acid exchanges increased the activity moderately, structure-guided saturation mutagenesis yielded a quadruple mutant with a 40 times higher reaction rate. This study presents STD-NMR as a versatile tool for analyzing enzyme-substrate interactions in catalytically competent systems and for guiding protein engineering.
Keywords: racemase, rational protein design, STD-NMR, structure-guided saturation mutagenesis
Procedia PDF Downloads 305
41188 Examination of Public Hospital Unions Technical Efficiencies Using Data Envelopment Analysis and Machine Learning Techniques
Authors: Songul Cinaroglu
Abstract:
Regional planning in health has gained speed in developing countries in recent years. In Turkey, 89 Public Hospital Unions (PHUs) were established at the provincial level. In this study, the technical efficiencies of the 89 PHUs were examined using Data Envelopment Analysis (DEA) and machine learning techniques, dividing them into two clusters according to the similarity of their input and output indicators. The numbers of beds, physicians, and nurses were taken as input variables, and the numbers of outpatients, inpatients, and surgical operations as output indicators. Before performing DEA, the PHUs were grouped into two clusters. The first cluster represents PHUs with higher population, demand, and service density than the others, and the difference between clusters was statistically significant for all study variables (p < 0.001). After clustering, DEA was performed both overall and for the two clusters separately. Overall, 11% of PHUs were efficient, while 21% and 17% were efficient in the first and second clusters, respectively. PHUs representing the urban parts of the country, with higher population and service density, are thus more efficient than the others. A random forest decision tree graph shows that the number of inpatients, a measure of service density, is a determinative factor of PHU efficiency. It is advisable for public health policy makers to use statistical learning methods in resource planning decisions to improve efficiency in health care.
Keywords: public hospital unions, efficiency, data envelopment analysis, random forest
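A full DEA efficiency score requires solving a linear program per decision-making unit, which is beyond a short sketch. As a simplified stand-in for the cluster-then-score workflow above, the toy example below uses synthetic PHU figures (all invented), a tiny one-dimensional 2-means clustering, and a normalized output/input ratio in place of the DEA frontier:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical PHU data: inputs (beds, physicians, nurses) and one aggregate
# output (patient volume). Real DEA would solve an LP per unit; here a simple
# output/input ratio stands in for the efficiency score.
inputs = rng.uniform(50, 500, size=(20, 3))
output = inputs.sum(axis=1) * rng.uniform(0.5, 1.5, size=20)

# Two-cluster k-means on total input size (a proxy for service density)
totals = inputs.sum(axis=1)
centers = np.array([totals.min(), totals.max()])
for _ in range(10):
    labels = np.abs(totals[:, None] - centers[None, :]).argmin(axis=1)
    centers = np.array([totals[labels == k].mean() for k in (0, 1)])

# Ratio efficiency, normalized so the best unit in each cluster scores 1.0
ratio = output / totals
for k in (0, 1):
    eff = ratio[labels == k] / ratio[labels == k].max()
    print(f"cluster {k}: {np.sum(eff > 0.99)} efficient of {eff.size}")
```

The structure mirrors the study's design, clustering first so that each unit is scored against comparable peers, even though the scoring rule here is deliberately simpler than DEA.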
Procedia PDF Downloads 126
41187 Behavioral Analysis of Stock Using Selective Indicators from Fundamental and Technical Analysis
Authors: Vish Putcha, Chandrasekhar Putcha, Siva Hari
Abstract:
In the current digital era of free trading and a pandemic-driven remote work culture, markets worldwide gained momentum for retail investors to trade easily from anywhere. Retail traders rose from 15% of the market at the pre-pandemic level to 24%. Most of them are young retail traders with a higher risk tolerance than the previous generation. This trend boosted the growth of subscription-based market predictors and market data vendors. Young traders bet on these predictors, assuming one of them is correct; however, 90% of retail traders end up on the losing end. This paper presents multiple indicators and attempts to derive behavioral patterns from the underlying stocks. The two major families of indicators that traders and investors follow are technical and fundamental. The famous investor Warren Buffett adheres to the 'value investing' method, which is based on a stock's fundamental analysis. In this paper, we present multiple indicators from various methods to understand the behavior patterns of stocks. For this research, we picked five stocks with a market capitalization of more than $200M, listed on the exchange for more than 20 years, and from different industry sectors. To study their behavioral pattern over time, a total of 8 indicators were chosen from fundamental, technical, and financial indicators, such as Price to Earnings (P/E), Price to Book Value (P/B), Debt to Equity (D/E), Beta, Volatility, Relative Strength Index (RSI), Moving Averages, and Dividend Yield, followed by detailed mathematical analysis. This is an interdisciplinary paper spanning Engineering, Accounting, and Finance, and it takes a new approach to identifying clear indicators affecting stocks. Statistical analysis of the data will be performed in terms of the probabilistic distribution, which will then be used to determine the probability of the stock price exceeding a specific target value. The chi-square test will be used to assess the validity of the assumed distribution. Preliminary results indicate that this approach is working well; the complete results, presented in the final paper, will be beneficial to the community.
Keywords: stock pattern, stock market analysis, stock predictions, trading, investing, fundamental analysis, technical analysis, quantitative trading, financial analysis, behavioral analysis
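A chi-square goodness-of-fit check of the kind planned above can be sketched as follows, here testing whether log-returns are consistent with a fitted normal distribution. The prices are simulated, not the five stocks studied; the critical value 3.841 is the standard table value for df = 1 at the 0.05 level:

```python
import numpy as np

# Simulated daily closing prices (geometric random walk, not real stock data)
rng = np.random.default_rng(42)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0005, 0.02, 500)))
returns = np.diff(np.log(prices))

mu, sigma = returns.mean(), returns.std(ddof=1)
# Four equiprobable bins under the fitted normal (quartile boundaries)
z = np.array([-0.6745, 0.0, 0.6745])       # standard normal quartiles
edges = mu + sigma * z
observed = np.array([
    np.sum(returns <= edges[0]),
    np.sum((returns > edges[0]) & (returns <= edges[1])),
    np.sum((returns > edges[1]) & (returns <= edges[2])),
    np.sum(returns > edges[2]),
])
expected = len(returns) / 4.0

chi2 = np.sum((observed - expected) ** 2 / expected)
# df = bins - 1 - fitted params = 4 - 1 - 2 = 1; chi-square_0.05(1) = 3.841
print(chi2, chi2 < 3.841)
```

If the statistic stays below the critical value, the assumed distribution is not rejected and can be used to estimate the probability of the price exceeding a target.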
Procedia PDF Downloads 85
41186 Sentiment Analysis on the East Timor Accession Process to the ASEAN
Authors: Marcelino Caetano Noronha, Vosco Pereira, Jose Soares Pinto, Ferdinando Da C. Saores
Abstract:
One particularly popular social media platform is YouTube, a video-sharing platform where users can submit videos and other users can like, dislike, or comment on them. In this study, we conduct a binary classification task on YouTube video comments and reviews from users regarding the accession process of Timor-Leste to become the eleventh member of the Association of Southeast Asian Nations (ASEAN). We scrape the data directly from public YouTube videos and apply several pre-processing and weighting techniques. Before conducting the classification, we categorized the data into two classes, positive and negative. For classification, we apply the Support Vector Machine (SVM) algorithm. Compared with the Naïve Bayes algorithm, SVM achieved 84.1% accuracy, 94.5% precision, and 73.8% recall.
Keywords: classification, YouTube, sentiment analysis, support vector machine
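The weight-then-classify pipeline described above is usually built with a library such as scikit-learn; as a self-contained sketch, the toy version below uses invented comments, TF-IDF weighting, and a linear SVM trained by subgradient descent on the hinge loss:

```python
import numpy as np

# Toy YouTube-style comments (invented) labeled positive (1) / negative (-1)
docs = ["welcome timor leste to asean", "great news for the region",
        "proud moment congratulations", "this is a bad decision",
        "terrible idea not ready", "bad for asean unity"]
labels = np.array([1, 1, 1, -1, -1, -1])

# TF-IDF weighting
vocab = sorted({w for d in docs for w in d.split()})
idx = {w: i for i, w in enumerate(vocab)}
tf = np.zeros((len(docs), len(vocab)))
for r, d in enumerate(docs):
    for w in d.split():
        tf[r, idx[w]] += 1
df = (tf > 0).sum(axis=0)
X = tf * np.log(len(docs) / df)

# Linear SVM via subgradient descent on the regularized hinge loss
w, b, lam = np.zeros(X.shape[1]), 0.0, 0.01
for epoch in range(200):
    for i in range(len(docs)):
        margin = labels[i] * (X[i] @ w + b)
        if margin < 1:            # point inside the margin: push it out
            w += 0.1 * (labels[i] * X[i] - lam * w)
            b += 0.1 * labels[i]
        else:                     # correctly classified: only shrink weights
            w -= 0.1 * lam * w

pred = np.sign(X @ w + b)
print((pred == labels).mean())
```

On real comment data one would also tokenize, remove stop words, and hold out a test set before reporting accuracy, precision, and recall as the study does.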
Procedia PDF Downloads 109
41185 Investigating Breakdowns in Human Robot Interaction: A Conversation Analysis Guided Single Case Study of a Human-Robot Communication in a Museum Environment
Authors: B. Arend, P. Sunnen, P. Caire
Abstract:
In a single case study, we show how a conversation analysis (CA) approach can shed light on the sequential unfolding of human-robot interaction. Relying on video data, we show that CA allows us to investigate the respective turn-taking systems of humans and a NAO robot in their dialogical dynamics, thus pointing out relevant differences. Our fine-grained video analysis identifies breakdowns and their overcoming when humans and a NAO robot engage in multimodally uttered multi-party communication during a sports guessing game. Our findings suggest that interdisciplinary work opens up the opportunity to gain new insights into the challenging issues of human-robot communication and provides resources for developing mechanisms that enable complex human-robot interaction (HRI).
Keywords: human robot interaction, conversation analysis, dialogism, breakdown, museum
Procedia PDF Downloads 305
41184 Association Rules Mining and NOSQL Oriented Document in Big Data
Authors: Sarra Senhadji, Imene Benzeguimi, Zohra Yagoub
Abstract:
Big Data refers to recent technologies for manipulating voluminous, unstructured data sets from multiple sources, and NoSQL databases have emerged to handle unstructured data. Association rule mining is one of the popular data mining techniques for extracting hidden relationships from transactional databases, and the algorithms for finding association dependencies map well onto MapReduce. The goal of our work is to reduce the time needed to generate frequent itemsets by using MapReduce and a document-oriented NoSQL database. A comparative study is given to evaluate the performance of our algorithm against the classical Apriori algorithm.
Keywords: Apriori, association rule mining, Big Data, data mining, Hadoop, MapReduce, MongoDB, NoSQL
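As a single-machine illustration of the frequent-itemset generation discussed above (the distributed MapReduce/MongoDB setup is out of scope here), a level-wise Apriori over a toy transaction list might look like this; the transactions are invented:

```python
from itertools import combinations

# Toy transactional data (invented). Apriori finds frequent itemsets by
# level-wise candidate generation and support counting.
transactions = [
    {"bread", "milk"}, {"bread", "diapers", "beer", "eggs"},
    {"milk", "diapers", "beer", "cola"}, {"bread", "milk", "diapers", "beer"},
    {"bread", "milk", "diapers", "cola"},
]
min_support = 3  # absolute count

def apriori(transactions, min_support):
    items = {i for t in transactions for i in t}
    freq, k = {}, 1
    current = [frozenset([i]) for i in sorted(items)]
    while current:
        # support counting (the step a MapReduce job would distribute)
        counts = {c: sum(c <= t for t in transactions) for c in current}
        level = {c: n for c, n in counts.items() if n >= min_support}
        freq.update(level)
        # candidate generation: unions of frequent k-itemsets of size k + 1
        keys = list(level)
        current = list({a | b for a, b in combinations(keys, 2)
                        if len(a | b) == k + 1})
        k += 1
    return freq

for itemset, count in sorted(apriori(transactions, min_support).items(),
                             key=lambda kv: (len(kv[0]), sorted(kv[0]))):
    print(sorted(itemset), count)
```

The support-counting step is exactly the part that parallelizes naturally as a map (emit candidate/1 per transaction) followed by a reduce (sum counts), which is why Apriori is a common MapReduce benchmark.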
Procedia PDF Downloads 162
41183 Destination Management Organization in the Digital Era: A Data Framework to Leverage Collective Intelligence
Authors: Alfredo Fortunato, Carmelofrancesco Origlia, Sara Laurita, Rossella Nicoletti
Abstract:
In the post-pandemic recovery phase of tourism, the role of a Destination Management Organization (DMO), as a coordinated management system of all the elements that make up a destination (attractions, access, marketing, human resources, brand, pricing, etc.), is becoming relevant for local territories as well. The objective of a DMO is to maximize the visitor's perception of value and quality while ensuring the competitiveness and sustainability of the destination, as well as the long-term preservation of its natural and cultural assets, and to catalyze benefits for the local economy and residents. In carrying out its multiple functions, the DMO can leverage a collective intelligence that comes from the ability to pool the information, explicit and tacit knowledge, and relationships of the various stakeholders: policymakers, public managers and officials, entrepreneurs in the tourism supply chain, researchers, data journalists, schools, associations and committees, citizens, etc. The DMO potentially has at its disposal large volumes of data, much of it at low cost, which needs to be properly processed to produce value. Based on these assumptions, the paper presents a conceptual framework for building an information system to support the DMO in the intelligent management of a tourist destination, tested in an area of southern Italy.
The approach adopted is data-informed and consists of four phases: (1) formulation of the knowledge problem (analysis of policy documents and industry reports; focus groups and co-design with stakeholders; definition of information needs and key questions); (2) research and metadata annotation of relevant sources (reconnaissance of official sources, administrative archives and internal DMO sources); (3) gap analysis and identification of unconventional information sources (evaluation of traditional sources with respect to the level of consistency with information needs, the freshness of information and granularity of data; enrichment of the information base by identifying and studying web sources such as Wikipedia, Google Trends, Booking.com, Tripadvisor, websites of accommodation facilities and online newspapers); (4) definition of the set of indicators and construction of the information base (specific definition of indicators and procedures for data acquisition, transformation, and analysis). The framework derived consists of 6 thematic areas (accommodation supply, cultural heritage, flows, value, sustainability, and enabling factors), each of which is divided into three domains that gather a specific information need to be represented by a scheme of questions to be answered through the analysis of available indicators. The framework is characterized by a high degree of flexibility in the European context, given that it can be customized for each destination by adapting the part related to internal sources. 
Application to the case study led to the creation of a decision support system that allows:
• integration of data from heterogeneous sources, including through the execution of automated web crawling procedures for data ingestion of social and web information;
• reading and interpretation of data and metadata through guided navigation paths in the key of digital story-telling;
• implementation of complex analysis capabilities through the use of data mining algorithms, such as for the prediction of tourist flows.
Keywords: collective intelligence, data framework, destination management, smart tourism
Procedia PDF Downloads 121
41182 Influence of Sodium Acetate on Electroless Ni-P Deposits and Effect of Heat Treatment on Corrosion Behavior
Authors: Y. El Kaissi, M. Allam, A. Koulou, M. Galai, M. Ebn Touhami
Abstract:
The aim of our work is to develop an industrial bath for depositing a nickel alloy on mild steel. Optimization of the operating parameters made it possible to obtain a stable Ni-P alloy deposition formulation. To understand the reaction mechanism of the deposition process, a kinetic study was performed by cyclic voltammetry and electrochemical impedance spectroscopy (EIS). The coatings obtained have very high corrosion resistance in a very aggressive acidic medium, which increases further with heat treatment.
Keywords: cyclic voltammetry, EIS, electroless Ni-P coating, heat treatment, potentiodynamic polarization
Procedia PDF Downloads 302
41181 Immunization Data Quality in Public Health Facilities in the Pastoralist Communities: A Comparative Study Evidence from Afar and Somali Regional States, Ethiopia
Authors: Melaku Tsehay
Abstract:
The Consortium of Christian Relief and Development Associations (CCRDA) and the CORE Group Polio Partners (CGPP) Secretariat have been working with the Global Alliance for Vaccines and Immunization (GAVI) to improve immunization data quality in the Afar and Somali Regional States. The main aim of this study was to compare the quality of immunization data before and after these interventions in health facilities in the pastoralist communities of Ethiopia. To this end, a comparative cross-sectional study was conducted on 51 health facilities. The baseline data were collected in May 2019 and the endline data in August 2021, using the WHO data quality self-assessment tool (DQS). A significant improvement was seen in the accuracy of pentavalent vaccine (PT)1 data (p = 0.012) at the health posts (HP), and of PT3 (p = 0.010) and measles (p = 0.020) data at the health centers (HC). In addition, a highly significant improvement was observed in the accuracy of tetanus toxoid (TT)2 data at the HP (p < 0.001). The level of over- or under-reporting for PT3 was found to be < 8% at the HP and < 10% at the HC. Data completeness also increased from 72.09% to 88.89% at the HC. Nearly 74% of the health facilities reported their immunization data on time, much better than at baseline (7.1%) (p < 0.001). These findings may provide hints for policies and programs targeting improved immunization data quality in pastoralist communities.
Keywords: data quality, immunization, verification factor, pastoralist region
Procedia PDF Downloads 124
41180 Tourism Satellite Account: Approach and Information System Development
Authors: Pappas Theodoros, Mihail Diakomihalis
Abstract:
Measuring the economic impact of tourism in a benchmark economy is a global concern, and previous measurements have been partial and not fully integrated. Tourism is a phenomenon driven by the individual consumption of visitors, which should be observed and measured to reveal the overall contribution of tourism to an economy. The Tourism Satellite Account (TSA) is a critical tool for assessing the annual growth of tourism, providing reliable measurements. This article introduces a TSA information system that encompasses all the work of the TSA, including input, storage, management, and analysis of data, as well as additional future functions, and enhances the efficiency of tourism data management and the utility of TSA collection. The methodology and results presented offer insights into the development and implementation of the TSA.
Keywords: tourism satellite account, information system, data-based tourist account, relational database
Procedia PDF Downloads 85
41179 Conductometric Methanol Microsensor Based on Electrospun PVC-Nickel Phthalocyanine Composite Nanofiber Technology
Authors: Ibrahim Musa, Guy Raffin, Marie Hangouet, Nadia Zine, Nicole Jaffrezic-Renault, Abdelhamid Errachid
Abstract:
Due to its applications in different domains, such as fuel cell configuration and detecting the adulteration of alcoholic beverages, a miniaturized sensor for methanol detection is urgently required. A conductometric microsensor for measuring volatile organic compounds (VOCs) was conceived, based on electrospun composite nanofibers of polyvinyl chloride (PVC) doped with nickel phthalocyanine (NiPc) deposited on interdigitated electrodes (IDEs) used as transducers. The shape, structure, atomic composition, and thermal properties of the nanofibers were studied using analytical techniques including scanning electron microscopy (SEM), Fourier transform infrared spectroscopy (FTIR), and thermogravimetric analysis (TGA). The methanol sensor showed good sensitivity (505 µS/cm (v/v)⁻¹), a low LOD (15 ppm), a short response time (13 s), and a short recovery time (15 s). The sensor was 4 times more sensitive to methanol than to ethanol and 19 times more sensitive to methanol than to acetone. Furthermore, the sensor response was unaffected by interfering water vapor, making it more suitable for VOC sensing in the presence of humidity. The sensor was applied to the conductometric detection of methanol in rubbing alcohol.
Keywords: composite, methanol, conductometric sensor, electrospun, nanofiber, nickel phthalocyanine, PVC
Procedia PDF Downloads 23
41178 Percentile Norms of Heart Rate Variability (HRV) of Indian Sportspersons Withdrawn from Competitive Games and Sports
Authors: Pawan Kumar, Dhananjoy Shaw
Abstract:
Heart rate variability (HRV) is the physiological phenomenon of variation in the time interval between heartbeats; it changes with fitness, age, and various medical conditions, including withdrawal/retirement from games and sports. The objectives of the study were to develop percentile norms of HRV variables pertaining to sympathetic and parasympathetic activity for Indian sportspersons withdrawn from competitive games/sports, derived from (a) time domain analysis and (b) frequency domain analysis. The study was conducted on 430 males aged 30 to 35 years of the same socio-economic status. Data were collected using ECG polygraphs, processed and extracted using frequency domain and time domain analysis, and percentiles from 1 to 100 were computed. For the time domain variables, the percentile ranges were: NN50 count, 1 to 189; pNN50, 0.24 to 60.80; SDNN, 17.34 to 167.29; SDSD, 11.14 to 120.46; RMSSD, 11.19 to 120.24; and SDANN, 4.02 to 88.75. For the frequency domain variables, the percentile ranges were: low frequency (normalized power), 20.68 to 90.49; high frequency (normalized power), 14.37 to 81.60; LF/HF ratio, 0.26 to 9.52; LF (absolute power), 146.79 to 5669.33; HF (absolute power), 102.85 to 10735.71; and total power (absolute power), 471.45 to 25879.23. Conclusion: the analysis documented percentile norms for time domain and frequency domain analysis for versatile use and evaluation.
Keywords: RMSSD, percentile, SDANN, HF, LF
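The time domain measures reported above (SDNN, SDSD, RMSSD, NN50, pNN50) are simple statistics of the RR-interval series; a short NumPy sketch with an invented RR series shows how they are computed:

```python
import numpy as np

# Hypothetical RR-interval series in milliseconds (not measured data)
rr = np.array([812, 790, 845, 830, 801, 875, 860, 842, 795, 788,
               910, 870, 855, 820, 798], dtype=float)

diffs = np.diff(rr)
sdnn = rr.std(ddof=1)                 # SDNN: SD of all NN intervals
sdsd = diffs.std(ddof=1)              # SDSD: SD of successive differences
rmssd = np.sqrt(np.mean(diffs ** 2))  # RMSSD: root mean square of successive differences
nn50 = np.sum(np.abs(diffs) > 50)     # NN50: successive differences > 50 ms
pnn50 = 100 * nn50 / len(diffs)       # pNN50: NN50 as a percentage

print(f"SDNN={sdnn:.2f} ms, SDSD={sdsd:.2f} ms, RMSSD={rmssd:.2f} ms, "
      f"NN50={nn50}, pNN50={pnn50:.1f}%")

# Percentile norms would then be computed across the whole sample, e.g.:
# np.percentile(all_rmssd_values, range(1, 101))
```

Collecting such metrics for all 430 participants and passing them through `np.percentile` (or the equivalent) yields the 1-to-100 percentile norms the study reports.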
Procedia PDF Downloads 420
41177 An Epidemiological Analysis of the Occurrence of Bovine Brucellosis and Adopted Control Measures in South Africa during the Period 2014 to 2019
Authors: Emily Simango, T. Chitura
Abstract:
Background: Bovine brucellosis is among the most neglected zoonotic diseases in developing countries, where it is endemic and a growing challenge to public health. Cost-effective control measures for the disease can only be developed with knowledge of the disease epidemiology and the ability to define its risk profiles. The aim of the study was to document the trend of bovine brucellosis and the control measures adopted following reported cases in South Africa during the period 2014 to 2019. Methods: Data on confirmed cases of bovine brucellosis were retrieved from the website of the World Organisation for Animal Health (WOAH) and analysed using the Statistical Package for Social Sciences (IBM SPSS, 2022), version 29.0. Descriptive analysis (frequencies and percentages) and analysis of variance (ANOVA) were utilized, with statistical significance set at p < 0.05. Results: The data revealed an overall average bovine brucellosis prevalence of 8.48. There were statistically significant differences in prevalence across the provinces for the years 2016 and 2019, with the Eastern Cape Province having the highest prevalence in both instances. Documented control measures for the disease were limited to the killing and disposal of disease cases and the vaccination of susceptible animals. Conclusion: Bovine brucellosis is real in South Africa, with risk profiles differing across the provinces. Information on brucellosis control measures in South Africa, as reported to the WOAH, is not comprehensive.
Keywords: zoonotic, endemic, Eastern Cape province, vaccination
Procedia PDF Downloads 66
41176 An Analysis of the Need of Training for Indian Textile Manufacturing Sector
Authors: Shipra Sharma, Jagat Jerath
Abstract:
Human resource training is an essential element of talent management in the current era of global competitiveness and dynamic trade in the manufacturing industry. Globally, India is second only to China as the largest textile manufacturer. The major challenges faced by the Indian textile manufacturing industry are low technology levels, growing skill gaps, an unorganized structure, and low efficiency, indicating the need for constant talent upgrading. Assessing training needs from a strategic perspective is an essential step in formulating effective training. This paper establishes the significance of training in the Indian textile industry and determines training needs on the various parameters presented. Forty HR personnel working in textile and apparel companies based in the industrial region of Punjab, India, were the respondents for the study. The research tool was a structured questionnaire on a five-point Likert scale. Statistical analysis through descriptive statistics and the chi-square test indicated an increased need for training whenever there were technical changes in the organizations. As per the data presented in this study, most of the HR personnel agreed that the variables associated with organizational analysis, task analysis, and individual analysis play a statistically significant role in determining the need for training in an organization. Keywords: Indian textile manufacturing industry, significance of training, training needs analysis, parameters for training needs assessment
Procedia PDF Downloads 165
41175 CoP-Networks: Virtual Spaces for New Faculty’s Professional Development in the 21st Higher Education
Authors: Eman AbuKhousa, Marwan Z. Bataineh
Abstract:
The 21st-century higher education landscape and globalization challenge new faculty members to build effective professional networks and partnerships with industry in order to accelerate their growth and success. This creates the need for community of practice (CoP)-oriented development approaches that focus on cognitive apprenticeship while considering individual predispositions and future career needs. This work adopts data mining, clustering analysis, and social networking technologies to present the CoP-Network as a virtual space that connects individuals with similar career aspirations, who are socially influenced to join and engage in a process of acquiring domain-related knowledge and practice. The CoP-Network model can be integrated into higher education to extend traditional graduate and professional development programs. Keywords: clustering analysis, community of practice, data mining, higher education, new faculty challenges, social network, social influence, professional development
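As a rough illustration of the clustering step, the sketch below groups individuals by numeric career-aspiration feature vectors with a plain k-means, so that each resulting cluster could seed one CoP-Network. The feature encoding and the `kmeans` helper are assumptions for illustration, not the paper's implementation.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means over tuples of floats: each cluster gathers
    similar career-aspiration vectors (one candidate CoP-Network)."""
    rnd = random.Random(seed)
    centers = rnd.sample(points, k)
    for _ in range(iters):
        # assign every point to its nearest center (squared distance)
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # recompute each center as its cluster mean (keep old center if empty)
        new_centers = []
        for i, c in enumerate(clusters):
            if c:
                new_centers.append(tuple(sum(dim) / len(c) for dim in zip(*c)))
            else:
                new_centers.append(centers[i])
        centers = new_centers
    return centers, clusters

# two visibly separated aspiration profiles (toy 2-D encodings)
pts = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (5.0, 5.0), (5.0, 6.0), (6.0, 5.0)]
centers, clusters = kmeans(pts, 2)
```

In the paper's setting the vectors would come from mined profile data rather than hand-written tuples.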
Procedia PDF Downloads 183
41174 Application of Stochastic Models to Annual Extreme Streamflow Data
Authors: Karim Hamidi Machekposhti, Hossein Sedghi
Abstract:
This study was designed to find the best stochastic model (using time series analysis) for the annual extreme streamflow (peak and maximum streamflow) of the Karkheh River in Iran. An Auto-Regressive Integrated Moving Average (ARIMA) model was used to simulate these series and to forecast future values. For the analysis, annual extreme streamflow data from the Jelogir Majin station (upstream of the Karkheh dam reservoir) for the years 1958–2005 were used. A visual inspection of the time plot shows a slight increasing trend; therefore, the series is not stationary. The non-stationarity observed in the Auto-Correlation Function (ACF) and Partial Auto-Correlation Function (PACF) plots of annual extreme streamflow was removed using first-order differencing (d=1) prior to developing the ARIMA model. The ARIMA(4,1,1) model was found to be the most suitable for simulating annual extreme streamflow for the Karkheh River. The model was found to be appropriate for forecasting ten years of annual extreme streamflow and for assisting decision makers in establishing priorities for water demand. The Statistical Analysis System (SAS) and Statistical Package for the Social Sciences (SPSS) packages were used to determine the best model for this series. Keywords: stochastic models, ARIMA, extreme streamflow, Karkheh river
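The study fits ARIMA(4,1,1) in SAS/SPSS; as a hedged illustration of the same idea (difference once at d=1, then fit an autoregression on the differenced series, then forecast and undo the differencing), the sketch below fits a simplified ARIMA(p,1,0) by least squares in NumPy. The MA(1) term requires iterative estimation and is omitted, and the trending series here is synthetic, not the Karkheh record.

```python
import numpy as np

def fit_ari(series, p=4):
    """Least-squares AR(p) fit on the first-differenced series --
    a simplified ARIMA(p,1,0) in the spirit of the paper's ARIMA(4,1,1)."""
    d = np.diff(series)                      # d = 1 removes the trend
    # row t holds [1, d[t-1], ..., d[t-p]]; target is d[t]
    X = np.column_stack([d[p - 1 - i:len(d) - 1 - i] for i in range(p)])
    X = np.column_stack([np.ones(len(X)), X])
    y = d[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast(series, coef, p=4, steps=10):
    """Iterate the fitted model forward, then undo the differencing."""
    d = list(np.diff(series))
    level, preds = series[-1], []
    for _ in range(steps):
        lags = d[-1:-p - 1:-1]               # d[t-1], ..., d[t-p]
        nxt = coef[0] + float(np.dot(coef[1:], lags))
        d.append(nxt)
        level += nxt
        preds.append(level)
    return preds

# illustrative trending series standing in for annual extreme streamflow
series = np.arange(60, dtype=float) * 2.0
coef = fit_ari(series)
preds = forecast(series, coef)               # ten-year-style forecast
```

With real data, the d, p, and q orders would be chosen from the ACF/PACF plots as in the study.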
Procedia PDF Downloads 148
41173 A Data-Driven Monitoring Technique Using Combined Anomaly Detectors
Authors: Fouzi Harrou, Ying Sun, Sofiane Khadraoui
Abstract:
Anomaly detection based on Principal Component Analysis (PCA) has been studied intensively and widely applied to multivariate processes with highly cross-correlated process variables. Monitoring metrics such as Hotelling's T2 and the Q statistic are usually used in PCA-based monitoring to elucidate pattern variations in the principal and residual subspaces, respectively. However, these metrics are ill-suited to detecting small faults. In this paper, Exponentially Weighted Moving Average (EWMA) charts based on the Q and T2 statistics, T2-EWMA and Q-EWMA, were developed for detecting faults in the process mean. The performance of the proposed methods was compared with that of the conventional PCA-based fault detection method using synthetic data. The results clearly show the benefit and effectiveness of the proposed methods over the conventional PCA method, especially for detecting small faults in highly correlated multivariate data. Keywords: data-driven method, process control, anomaly detection, dimensionality reduction
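A minimal sketch of this monitoring scheme, on synthetic data as in the paper's comparison: PCA via SVD, T2 in the principal subspace, Q (squared prediction error) in the residual subspace, and an EWMA filter over either statistic. The function names, number of components, and smoothing constant are illustrative choices, not the paper's exact settings.

```python
import numpy as np

def pca_t2_q(X, n_comp=2):
    """Hotelling's T2 and Q (SPE) statistics from a PCA model."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_comp].T                        # loadings
    lam = (s[:n_comp] ** 2) / (len(X) - 1)   # variances of retained PCs
    T = Xc @ P                               # scores
    t2 = np.sum(T ** 2 / lam, axis=1)        # principal-subspace statistic
    resid = Xc - T @ P.T
    q = np.sum(resid ** 2, axis=1)           # residual-subspace statistic
    return t2, q

def ewma(stat, lam=0.2):
    """EWMA smoothing of a monitoring statistic (T2-EWMA / Q-EWMA);
    accumulating past values amplifies small, persistent mean shifts."""
    z = np.empty_like(stat, dtype=float)
    acc = stat[0]
    for i, v in enumerate(stat):
        acc = lam * v + (1 - lam) * acc
        z[i] = acc
    return z

# synthetic demo: two latent factors drive five correlated variables
rng = np.random.default_rng(0)
F = rng.normal(size=(200, 2))
X = F @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(200, 5))
t2, q = pca_t2_q(X)
t2_ewma, q_ewma = ewma(t2), ewma(q)
```

In practice each smoothed statistic is compared against a control limit estimated from fault-free data.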
Procedia PDF Downloads 299
41172 Statistical Models and Time Series Forecasting on Crime Data in Nepal
Authors: Dila Ram Bhandari
Abstract:
Throughout the 20th century, new governments were created in which identities such as ethnicity, religion, language, caste, community, and tribe played a part in the development of constitutions and of the legal systems of victim and criminal justice. South Asian nations have recently been plagued by acute problems of extremism, poverty, environmental degradation, cybercrime, human rights violations, and crimes against, and victimization of, both individuals and groups. A massive number of crimes occur every day, and their frequency has made the lives of ordinary citizens restless. Crime is one of the major threats to society and to civilization, and a bone of contention that can create societal disturbance. Traditional crime-solving practices cannot live up to the requirements of the existing crime situation. Crime analysis is one of the most important activities of most intelligence and law enforcement organizations all over the world. Unlike the Central Asian or Asia-Pacific regions, South Asia lacks a regional coordination mechanism to facilitate criminal intelligence sharing and operational coordination on organized crime, including illicit drug trafficking and money laundering. There have been numerous conversations in recent years about using data mining technology to combat crime and terrorism; the Data Detective program from the software company Sentient, for example, uses data mining techniques to support the police (Sentient, 2017). The goals of this internship were to test several predictive model solutions and choose the most effective and promising one. First, extensive literature reviews on data mining, crime analysis, and crime data mining were conducted. Sentient offered a seven-year archive of crime statistics, which was aggregated daily to produce a univariate dataset; a daily aggregation by incident type was also performed to produce a multivariate dataset. Each solution's forecast period was seven days.
The experiments were split into two main groups: statistical models and neural network models. For the crime data, neural networks fared better than statistical models. This study gives a general review of the applied statistical and neural network models; a comparative analysis of all the models on a comparable dataset provides a detailed picture of each model's performance and generalizability. The studies demonstrated that, in comparison to the other models, Gated Recurrent Units (GRU) produced the best predictions. Crime records for 2005–2019 were collected from the Nepal Police headquarters and analysed in R. In conclusion, a gated recurrent unit implementation could help police predict crime; hence, time series analysis using GRU could be a prospective additional feature in Data Detective. Keywords: time series analysis, forecasting, ARIMA, machine learning
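To make the GRU concrete, here is the forward pass of a single GRU cell in NumPy (biases omitted for brevity), run over a toy scaled daily-count sequence. It shows the gating mechanism that lets the unit retain long-range structure in a time series; it is a sketch of the recurrent unit only, not the study's trained forecasting model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Forward pass of one GRU cell (gates only, no training loop)."""
    def __init__(self, n_in, n_hid, seed=0):
        rng = np.random.default_rng(seed)
        shape = (n_hid, n_in + n_hid)
        self.Wz = rng.normal(0, 0.1, shape)   # update gate weights
        self.Wr = rng.normal(0, 0.1, shape)   # reset gate weights
        self.Wh = rng.normal(0, 0.1, shape)   # candidate-state weights

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)             # how much new state to admit
        r = sigmoid(self.Wr @ xh)             # how much old state feeds the candidate
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]))
        return (1 - z) * h + z * h_tilde      # blend old state and candidate

# run over a toy sequence of scaled daily crime counts
cell = GRUCell(n_in=1, n_hid=4)
h = np.zeros(4)
for count in [0.3, 0.5, 0.2, 0.8, 0.6, 0.4]:
    h = cell.step(np.array([count]), h)
```

In a real forecaster the final hidden state would feed a linear read-out, and the weights would be learned by backpropagation through time.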
Procedia PDF Downloads 164
41171 Social Norms around Adolescent Girls’ Marriage Practices in Ethiopia: A Qualitative Exploration
Authors: Dagmawit Tewahido
Abstract:
Purpose: This qualitative study was conducted to explore social norms around adolescent girls’ marriage practices in West Hararghe, Ethiopia, where early marriage is prohibited by law. Methods: Twenty focus group discussions were conducted with married and unmarried adolescent girls, adolescent boys, and parents of girls, using locally developed vignettes. A total of 32 in-depth interviews were conducted with married and unmarried adolescent girls, husbands of adolescent girls, and mothers-in-law, and key informant interviews were conducted with 36 district officials. Data analysis was assisted by the Open Code software. The Social Norms Analysis Plot (SNAP) framework developed by CARE guided the development and analysis of the vignettes, and a thematic data analysis approach was used to summarize the data. Results: Early marriage is seen as a positive phenomenon in the study context, and girls who are not married by the perceived ideal age of 15 are socially sanctioned; they are particularly influenced by their peers to marry. Marrying early is considered a chance given by God and a symbol of good luck. The two common types of marriage are decided 1) by the adolescent girl and boy themselves without seeking parental permission (’Jalaa-deemaa’, meaning ‘to go along’), and 2) by merely informing the girl’s parents (‘Cabsaa’, meaning ‘to break the culture’). Relatives and marriage brokers also arrange early marriages. Girls usually accept the first marriage proposal regardless of their age, and parents generally do not oppose marriage arrangements chosen by their daughters. Conclusions: In the study context, social norms encourage early marriage despite the existence of a law prohibiting marriage before the age of eighteen. Early marriage commonly happens through consensual arrangements between adolescent girls and boys.
Interventions to reduce early marriage need to consider the influence of reference groups on the decision makers for marriages, especially girls’ own peers. Keywords: adolescent girls, social norms, early marriage, Ethiopia
Procedia PDF Downloads 140
41170 A Psycholinguistic Analysis of John Nash’s Hallucinations as Represented in the Film “A Beautiful Mind”
Authors: Rizkia Shafarini
Abstract:
This study explores hallucination in the film A Beautiful Mind, which depicts the story of John Nash, a university student who dislikes studying in class and prefers to study alone. Throughout his life, John Nash has experienced hallucinations, a symptom of schizophrenia, as depicted in the film. The goal of this study was to determine what his hallucinations were, what caused them, and how John Nash managed them. More generally, this study examines the link between language and mind, as portrayed in John Nash's speech and evidenced by his conduct. The study takes a psycholinguistic approach to data analysis, employing qualitative methodologies; the data sources are conversations and scenes from the film. The findings are, first, that John Nash's hallucinations in the film involve hearing, seeing, and feeling. Second, the sources of his hallucinations are dreams, aspirations, and sickness. Third, John Nash manages his hallucinations by seeing a doctor, without medical or distracting assistance. Keywords: A Beautiful Mind, hallucination, psycholinguistic, John Nash
Procedia PDF Downloads 172
41169 Housing Price Dynamics: Comparative Study of 1980-1999 and the New Millenium
Authors: Janne Engblom, Elias Oikarinen
Abstract:
The understanding of housing price dynamics is of importance to a great number of agents: portfolio investors, banks, real estate brokers, and construction companies, as well as policy makers and households. A panel dataset follows a given sample of individuals over time and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models, which form a wide range of linear models; a special case of panel data models is dynamic in nature. A complication in a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of the estimates, and several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Common Correlated Effects (CCE) estimator for dynamic panel data, which also accounts for the cross-sectional dependence caused by common structures of the economy; in the presence of cross-sectional dependence, standard OLS gives biased estimates. U.S. housing price dynamics were examined empirically using the dynamic CCE estimator, with the first difference of housing prices as the dependent variable and the first differences of per capita income, the interest rate, the housing stock, and the lagged price, together with the deviation of housing prices from their long-run equilibrium level, as independent variables. These deviations were also estimated from the data. The aim of the analysis was to compare estimates between 1980-1999 and 2000-2012. Based on data for 50 U.S. cities over 1980-2012, most estimates of short-run housing price dynamics differed significantly between the two periods. Significance tests for these differences were provided by a model containing interaction terms between the independent variables and a time dummy. Residual analysis showed very low cross-sectional correlation of the model residuals compared with the standard OLS approach,
indicating a good fit of the CCE model. The estimates of the dynamic panel data model were in line with the theory of housing price dynamics, and the results also suggest that the dynamics of the housing market evolve over time. Keywords: dynamic model, panel data, cross-sectional dependence, interaction model
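The CCE idea — augmenting each unit's regression with cross-sectional averages of the dependent and independent variables so that common factors (the source of cross-sectional dependence) are absorbed — can be sketched in a few lines. The pooled variant below, run on synthetic panel data with a known slope, is an illustration under simplifying assumptions (homogeneous slopes, one regressor, no lagged dependent variable), not the paper's estimator as implemented.

```python
import numpy as np

def cce_pooled(y, X):
    """Pooled CCE: regress each unit's y on its own regressors PLUS the
    cross-sectional averages of y and X (which proxy the common factors),
    then estimate by pooled OLS. y: (T, N) panel; X: (T, N, k)."""
    T, N, k = X.shape
    ybar = y.mean(axis=1, keepdims=True)          # (T, 1) common component proxy
    Xbar = X.mean(axis=1)                         # (T, k)
    Z_rows, t_rows = [], []
    for i in range(N):
        Zi = np.column_stack([X[:, i, :], ybar, Xbar, np.ones(T)])
        Z_rows.append(Zi)
        t_rows.append(y[:, i])
    beta, *_ = np.linalg.lstsq(np.vstack(Z_rows),
                               np.concatenate(t_rows), rcond=None)
    return beta[:k]                               # slopes on the unit regressors

# synthetic panel: 20 cities, 200 periods, true slope 2, shared factor f_t
rng = np.random.default_rng(0)
T, N = 200, 20
f = 3.0 * rng.normal(size=(T, 1))                # common factor (e.g. macro shock)
X = rng.normal(size=(T, N, 1))
y = 2.0 * X[:, :, 0] + f + 0.1 * rng.normal(size=(T, N))
beta = cce_pooled(y, X)
```

Plain pooled OLS without the average terms would leave the common factor in the residuals, which is exactly the cross-sectional correlation the abstract reports the CCE estimator removes.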
Procedia PDF Downloads 251
41168 Identifying Critical Success Factors for Data Quality Management through a Delphi Study
Authors: Maria Paula Santos, Ana Lucas
Abstract:
Organizations support their operations and decision making with the data at their disposal, so the quality of these data is remarkably important. Data Quality (DQ) is currently a relevant issue, and the literature is unanimous in pointing out that poor DQ can result in large costs for organizations. The literature review identified and described 24 Critical Success Factors (CSFs) for Data Quality Management (DQM), which were presented to a panel of experts who ordered them according to their degree of importance, using the Delphi method with the Q-sort technique, based on an online questionnaire. The study shows that the five most important CSFs for DQM are: definition of appropriate policies and standards, control of inputs, definition of a strategic plan for DQ, an organizational culture focused on data quality, and obtaining top management commitment and support. Keywords: critical success factors, data quality, data quality management, Delphi, Q-sort
Procedia PDF Downloads 217
41167 TiO2 Formation after Nanotubes Growth on Ti-15Mo Alloy Surface for Different Annealing Temperatures
Authors: A. L. R. Rangel, J. A. M. Chaves, A. P. R. Alves Claro
Abstract:
Surface modification of titanium and its alloys via TiO2 nanotube growth has been widely studied in the biomedical field due to the excellent interaction between implant and biological environment. The success of this treatment is directly related to the formation of the anatase phase of TiO2, which affects cell growth. The aim of this study was to evaluate the phases formed during nanotube growth on the Ti-15Mo surface. Nanotubes were grown by electrochemical anodization of the alloy in an ammonium fluoride-based glycerol electrolyte for 24 hours at 20 V. The samples were then annealed at 200°, 400°, 450°, 500°, 600°, and 800°C for 1 hour. Contact angle measurements, scanning electron microscopy imaging, and X-ray diffraction (XRD) analysis were carried out for all samples, and Raman spectroscopy was used to evaluate the TiO2 phase transformation in the nanotube samples. The XRD results showed anatase formation at the lower temperatures, while at 800°C the rutile phase was observed over the entire surface. The Raman spectra indicate that this phase transition occurs between 500 and 600°C. The phases formed influenced the nanotube morphologies, since higher annealing temperatures induced agglutination of the TiO2 layer, disrupting the tubular structure. On the other hand, the nanotubes drastically reduced the contact angle regardless of the annealing temperature. Keywords: nanotubes, TiO2, titanium alloys, Ti-15Mo
Procedia PDF Downloads 384
41166 A Proposal to Tackle Security Challenges of Distributed Systems in the Healthcare Sector
Authors: Ang Chia Hong, Julian Khoo Xubin, Burra Venkata Durga Kumar
Abstract:
Distributed systems offer many benefits to the healthcare industry. From big data analysis to business intelligence, the increased computational power and efficiency of distributed systems serve as an invaluable resource for the healthcare sector. However, as the usage of these distributed systems increases, many issues arise. The main focus of this paper is on security issues, many of which stem from distributed systems in the healthcare industry, particularly information security. Personal data are especially sensitive in the healthcare industry: if important information is leaked (e.g., IC number, credit card number, address), a person’s identity, financial status, and safety may be compromised, with the responsible organization losing a great deal of money in compensating these people and even more resources in trying to fix the fault. Therefore, a framework for a blockchain-based healthcare data management system is proposed. In this framework, a blockchain network is used to store the encryption key of the patient’s data, while the data themselves are encrypted and the resulting ciphertext is stored on a cloud storage platform. Furthermore, some issues remain to be tackled in future work, such as proposing a multi-user scheme, addressing authentication, or migrating the backend processes into the blockchain network. Due to the nature of blockchain technology, the data will be tamper-proof, and the read-only function can only be accessed by authorized users such as doctors and nurses. This guarantees the confidentiality and immutability of the patient’s data. Keywords: distributed, healthcare, efficiency, security, blockchain, confidentiality and immutability
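A minimal sketch of the tamper-evidence property the framework relies on, using only the standard library: a hash-linked chain that stores wrapped (encrypted) record keys, while the record ciphertext itself would live in cloud storage. The class and field names are illustrative assumptions, and the 'wrapped key' strings are placeholders for real key material produced by a proper cipher.

```python
import hashlib
import json

def block_hash(index, payload, prev_hash):
    """Deterministic SHA-256 over the block's contents."""
    body = json.dumps([index, payload, prev_hash], sort_keys=True).encode()
    return hashlib.sha256(body).hexdigest()

class KeyChain:
    """Append-only, hash-linked ledger of wrapped patient-record keys."""
    def __init__(self):
        genesis = {"index": 0, "payload": {"genesis": True}, "prev": "0" * 64}
        genesis["hash"] = block_hash(0, genesis["payload"], genesis["prev"])
        self.blocks = [genesis]

    def add_key(self, patient_id, wrapped_key_hex):
        prev = self.blocks[-1]
        payload = {"patient": patient_id, "key": wrapped_key_hex}
        blk = {"index": prev["index"] + 1, "payload": payload,
               "prev": prev["hash"]}
        blk["hash"] = block_hash(blk["index"], payload, blk["prev"])
        self.blocks.append(blk)

    def verify(self):
        # tamper-evidence: recompute every hash and check every link
        for i, b in enumerate(self.blocks):
            if b["hash"] != block_hash(b["index"], b["payload"], b["prev"]):
                return False
            if i > 0 and b["prev"] != self.blocks[i - 1]["hash"]:
                return False
        return True

chain = KeyChain()
chain.add_key("patient-001", "a1b2c3")   # placeholder for a wrapped AES key
chain.add_key("patient-002", "d4e5f6")
```

Altering any stored payload invalidates that block's hash and every link after it, which is the immutability guarantee the abstract describes; access control and the actual encryption would sit on top of this.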
Procedia PDF Downloads 184