Search results for: panel data analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 42459

36849 An Investigation of E-Government by Using GIS and Establishing E-Government in Developing Countries Case Study: Iraq

Authors: Ahmed M. Jamel

Abstract:

Electronic government initiatives and public participation in them are among today's development indicators for countries. After two consecutive wars, Iraq's current position in, for example, the UN's e-government ranking is quite concerning and has not improved in recent years, either. In preparing this work, we were motivated by the fact that most e-government projects need to handle geographic data on public facilities and resources. Geographical information systems (GIS) provide the most common tools not only to manage spatial data but also to integrate such data with the nonspatial attributes of features. Against this background, this paper proposes that establishing a working GIS in the health sector of Iraq would improve e-government applications. As a case study, hospital locations in Erbil are investigated.

Keywords: e-government, GIS, Iraq, Erbil

Procedia PDF Downloads 389
36848 Expanding Trading Strategies By Studying Sentiment Correlation With Data Mining Techniques

Authors: Ved Kulkarni, Karthik Kini

Abstract:

This experiment aims to understand how the media affects the power markets in the mainland United States and to measure the reaction time between news updates and actual price movements. We took into account electric utility companies trading on the NYSE and excluded companies that are more politically involved and move with higher sensitivity to politics. The scraper checks for any news related to predefined keywords stored for each specific company. Based on this, the classifier allocates the effect into five categories: positive, negative, highly optimistic, highly negative, or neutral. The effect on the respective price movement is then studied to understand the response time. Based on the observed response time, neural networks would be trained to understand and react to changing market conditions, achieving the best strategy in every market. The stock trader would be day trading in the first phase and making option strategy predictions based on the Black-Scholes model. The expected result is an AI-based system that adjusts trading strategies within the market's response time to each price movement.
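The five-category allocation described above can be sketched as a minimal keyword-scoring classifier. This is an illustrative sketch, not the authors' system: the keyword lists, thresholds, and headline strings are hypothetical placeholders, and a production classifier would use a trained sentiment model rather than set intersection.

```python
# Minimal sketch of a five-category news-effect classifier.
# Keyword sets and score thresholds are hypothetical placeholders.

POSITIVE = {"beat", "growth", "upgrade", "profit"}
NEGATIVE = {"miss", "lawsuit", "downgrade", "loss"}

def classify_headline(headline: str) -> str:
    """Map a headline to one of the five effect categories."""
    words = set(headline.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score >= 2:
        return "highly optimistic"
    if score == 1:
        return "positive"
    if score == -1:
        return "negative"
    if score <= -2:
        return "highly negative"
    return "neutral"

print(classify_headline("Utility beats estimates on strong profit growth"))
```

In the described pipeline, each classified headline would then be time-stamped and matched against subsequent price moves to estimate the response window.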

Keywords: data mining, language processing, artificial neural networks, sentiment analysis

Procedia PDF Downloads 19
36847 A webGIS Methodology to Support Sediments Management in Wallonia

Authors: Nathalie Stephenne, Mathieu Veschkens, Stéphane Palm, Christophe Charlemagne, Jacques Defoux

Abstract:

According to Europe’s first River Basin Management Plans (RBMPs), 56% of European rivers failed to achieve the good-status targets of the Water Framework Directive (WFD). In Central European countries such as Belgium, even more than 80% of rivers failed to achieve the WFD quality targets. Although the RBMPs should reduce the stressors and improve water body status, their potential to address multiple stress situations is limited due to insufficient knowledge on combined effects, multi-stress, prioritization of measures, impact on ecology and implementation effects. This paper describes a webGIS prototype developed for the Walloon administration to improve the communication and management of sediment dredging actions carried out in rivers and lakes within the framework of the RBMPs. A large number of stakeholders are involved in the management of rivers and lakes in Wallonia. They are in charge of technical aspects (clients and dredging operators, organizations involved in the treatment of waste…), management (managers involved in WFD implementation at the communal, provincial or regional level) or policy making (people responsible for policy compliance or legislation revision). These different kinds of stakeholders need different information and data to cover their duties but have to interact closely at different levels. Moreover, information has to be shared between them to improve the management quality of dredging operations within the ecological system. Under Walloon legislation, leveling dredged sediments on banks requires an official authorization from the administration. This request refers to spatial information such as the official land use map, the cadastral map, and the distance to potential pollution sources. The production of a collective geodatabase can facilitate the management of these authorizations on both sides.
The proposed internet system integrates documents, data input, integration of data from disparate sources, map representation, database queries, analysis of monitoring data, presentation of results and cartographic visualization. A prototype of web application using the API geoviewer chosen by the Geomatic department of the SPW has been developed and discussed with some potential users to facilitate the communication, the management and the quality of the data. The structure of the paper states the why, what, who and how of this communication tool.

Keywords: sediments, web application, GIS, rivers management

Procedia PDF Downloads 405
36846 Evaluation of Classification Algorithms for Diagnosis of Asthma in Iranian Patients

Authors: Taha SamadSoltani, Peyman Rezaei Hachesu, Marjan GhaziSaeedi, Maryam Zolnoori

Abstract:

Introduction: Data mining is defined as a process of finding patterns and relationships in the data in a database to build predictive models. The application of data mining has extended into vast sectors such as healthcare services. Medical data mining aims to solve real-world problems in the diagnosis and treatment of diseases. This method applies various techniques and algorithms which differ in accuracy and precision. The purpose of this study was to apply knowledge discovery and data mining techniques to the diagnosis of asthma based on patient symptoms and history. Method: Data mining includes several steps and decisions to be made by the user. It starts by creating an understanding of the scope and application of previous knowledge in the area and identifying the knowledge discovery process from the point of view of the stakeholders, and it finishes by acting on the discovered knowledge: applying it, integrating it with other systems, and documenting and reporting it. In this study, a stepwise methodology was followed to achieve a logical outcome. Results: The sensitivity, specificity and accuracy of the KNN, SVM, Naïve Bayes, NN, classification tree and CN2 algorithms, and of related similar studies, were evaluated, and ROC curves were plotted to show the performance of the system. Conclusion: The results show that we can accurately diagnose asthma, approximately ninety percent of the time, based on demographic and clinical data. The study also showed that methods based on pattern discovery and data mining have a higher sensitivity compared to expert and knowledge-based systems. On the other hand, medical guidelines and evidence-based medicine should remain the basis of diagnostic methods; it is therefore recommended that machine learning algorithms be used in combination with knowledge-based algorithms.
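The evaluation metrics named above can be computed directly from a confusion matrix. The sketch below shows the standard definitions of sensitivity, specificity, and accuracy on synthetic binary labels; the label vectors are invented, not the study's asthma data.

```python
# Sketch: sensitivity, specificity, and accuracy from binary predictions.
# The y_true / y_pred vectors below are synthetic illustration data.

def evaluate(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "sensitivity": tp / (tp + fn),        # true positive rate
        "specificity": tn / (tn + fp),        # true negative rate
        "accuracy": (tp + tn) / len(y_true),
    }

y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]   # 1 = asthma, 0 = no asthma
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
metrics = evaluate(y_true, y_pred)
print(metrics)
```

The same counts feed an ROC curve when the classifier outputs scores instead of hard labels: sweeping a threshold over the scores traces out (1 - specificity, sensitivity) pairs.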

Keywords: asthma, data mining, classification, machine learning

Procedia PDF Downloads 447
36845 Comparative Settlement Analysis under Embankments Using Empirical Formulas and Settlement Plate Measurements for Reducing Building Cracks around Embankments

Authors: Safitri Nur Wulandari, M. Ivan Adi Perdana, Prathisto L. Panuntun Unggul, R. Dary Wira Mahadika

Abstract:

In road construction on soft soil, a soil improvement method is needed to improve the bearing capacity of the land base so that the soil can withstand the traffic loads. Most of the land in Indonesia has soft soil, where soft soil is a type of clay with a consistency of very soft to medium stiff, an undrained shear strength Cu < 0.25 kg/cm², or an estimated NSPT value < 5 blows/ft. This study focuses on analyzing the effect of the preloading load (embankment) on the settlement ratio under the embankment, which in turn affects building cracks around the embankment. The method used in this research is a superposition method for embankment distribution at 27 locations, with undisturbed soil samples from several borehole points in Java and Kalimantan, Indonesia. The results of settlement plate monitoring in the field were then correlated using the Asaoka method. The settlement plate monitoring results were taken from an embankment at Ahmad Yani airport in Semarang at 32 points. The value of Cc (compression index) was based on laboratory test results where available; untested Cc values were obtained from the empirical formula of Ardhana and Mochtar (1999). The field monitoring showed almost the same results as the empirical formulation, with a standard deviation of 4%, and the empirical result of this analysis takes the form of a linear formula. The empirical linear formula for the settlement effect of an embankment 4.25 m high with a slope of 1:8 is y = 3.1209x + 0.0026, for the same analysis with the initial embankment height in the field. At the edge of the embankment the settlement is not equal to 0: at a quarter of the embankment width, the average settlement ratio is 0.951, while at the edge the settlement ratio is 0.049. The influence area around the embankment is approximately 1 meter for a 1:8 slope and 7 meters for a 1:2 slope. Settlements within this influence area can therefore cause cracks in surrounding buildings, which should be accounted for in sustainable development.

Keywords: building cracks, influence area, settlement plate, soft soil, empirical formula, embankment

Procedia PDF Downloads 344
36844 Decision Support System in Air Pollution Using Data Mining

Authors: E. Fathallahi Aghdam, V. Hosseini

Abstract:

Environmental pollution is not limited to a specific region or country; that is why sustainable development, as a necessary process for improvement, pays attention to issues such as the destruction of natural resources, degradation of biological systems, global pollution, and climate change in the world, especially in developing countries. According to the World Health Organization, as a developing city, Tehran (the capital of Iran) is one of the most polluted cities in the world in terms of air pollution. In this study, three pollutants, namely particulate matter less than 10 microns (PM10), nitrogen oxides, and sulfur dioxide, were evaluated in Tehran using data mining techniques following the CRISP-DM approach. Data from 21 air pollution measuring stations in different areas of Tehran were collected from 1999 to 2013. The commercial software Clementine was selected for this study. Using the software, Tehran was divided into distinct clusters in terms of the mentioned pollutants. As a data mining technique, clustering is usually a prelude to other analyses; therefore, the similarity of clusters was evaluated in this study by analyzing local conditions, traffic behavior, and industrial activities. In fact, the results of this research can support decision-making systems, help managers improve performance and decision making, and assist in urban studies.
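The clustering step described above can be sketched with a minimal k-means implementation grouping stations by their mean pollutant levels. This is an illustrative sketch only: the station values are invented, not Tehran measurements, and the study used Clementine rather than hand-written code.

```python
import numpy as np

# Sketch: k-means grouping of monitoring stations by mean pollutant
# levels (PM10, NOx, SO2). All data values are synthetic placeholders.

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each station to its nearest cluster center
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        # move each center to the mean of its assigned stations
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

# rows: stations; columns: mean PM10, NOx, SO2 (arbitrary units)
X = np.array([[90.0, 60, 20], [95, 65, 22], [30, 20, 5], [28, 18, 6]])
labels, centers = kmeans(X, k=2)
print(labels)
```

The resulting cluster labels are then the starting point for comparing local conditions, traffic behavior, and industrial activity across station groups.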

Keywords: data mining, clustering, air pollution, crisp approach

Procedia PDF Downloads 428
36843 High Resolution Satellite Imagery and Lidar Data for Object-Based Tree Species Classification in Quebec, Canada

Authors: Bilel Chalghaf, Mathieu Varin

Abstract:

Forest characterization in Quebec, Canada, is usually assessed based on photo-interpretation at the stand level. For species identification, this often results in a lack of precision. Very high spatial resolution imagery, such as DigitalGlobe, and Light Detection and Ranging (LiDAR), have the potential to overcome the limitations of aerial imagery. To date, few studies have used such data to map a large number of species at the tree level using machine learning techniques. The main objective of this study is to map 11 individual high tree species (>17 m) at the tree level using an object-based approach in the broadleaf forest of Kenauk Nature, Quebec. For the individual tree crown segmentation, three canopy-height models (CHMs) from LiDAR data were assessed: 1) the original, 2) a filtered, and 3) a corrected model. The corrected CHM gave the best accuracy and was then coupled with imagery to refine tree species crown identification. When compared with photo-interpretation, 90% of the objects represented a single species. For modeling, 313 variables were derived from 16-band WorldView-3 imagery and LiDAR data, using radiance, reflectance, pixel, and object-based calculation techniques. Variable selection procedures were employed to reduce their number from 313 to 16, using only 11 bands to aid reproducibility. For classification, a global approach using all 11 species was compared to a semi-hierarchical hybrid classification approach at two levels: (1) tree type (broadleaf/conifer) and (2) individual broadleaf (five) and conifer (six) species. Five different model techniques were used: (1) support vector machine (SVM), (2) classification and regression tree (CART), (3) random forest (RF), (4) k-nearest neighbors (k-NN), and (5) linear discriminant analysis (LDA). Each model was tuned separately for all approaches and levels. For the global approach, the best model was the SVM using eight variables (overall accuracy (OA): 80%, Kappa: 0.77).
With the semi-hierarchical hybrid approach, at the tree type level, the best model was the k-NN using six variables (OA: 100% and Kappa: 1.00). At the level of identifying broadleaf and conifer species, the best model was the SVM, with OA of 80% and 97% and Kappa values of 0.74 and 0.97, respectively, using seven variables for both models. This paper demonstrates that a hybrid classification approach gives better results and that using 16-band WorldView-3 with LiDAR data leads to more precise predictions for tree segmentation and classification, especially when the number of tree species is large.
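The semi-hierarchical idea — first decide broadleaf vs. conifer, then pick a species within the predicted type — can be sketched as two chained classifiers. In this illustrative sketch a nearest-centroid rule stands in for the study's SVM/k-NN models, and the two-dimensional features, centroid values, and species names are invented placeholders.

```python
import numpy as np

# Sketch of a two-level (type -> species) classifier. A nearest-centroid
# rule replaces the tuned SVM/k-NN models; all values are hypothetical.

def nearest(x, centroids):
    return min(centroids, key=lambda k: np.linalg.norm(x - centroids[k]))

type_centroids = {"broadleaf": np.array([0.8, 0.2]),
                  "conifer":   np.array([0.2, 0.8])}
species_centroids = {
    "broadleaf": {"maple": np.array([0.9, 0.1]), "oak": np.array([0.7, 0.3])},
    "conifer":   {"pine":  np.array([0.1, 0.9]), "fir": np.array([0.3, 0.7])},
}

def classify_tree(x):
    tree_type = nearest(x, type_centroids)               # level 1: type
    species = nearest(x, species_centroids[tree_type])   # level 2: species
    return tree_type, species

print(classify_tree(np.array([0.85, 0.15])))
```

Splitting the decision this way lets each level use its own feature subset and tuning, which is the design choice the paper credits for the hybrid approach's better accuracy.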

Keywords: tree species, object-based, classification, multispectral, machine learning, WorldView-3, LiDAR

Procedia PDF Downloads 134
36842 A Modular Reactor for Thermochemical Energy Storage: Examination of Ettringite-Based Materials

Authors: B. Chen, F. Kuznik, M. Horgnies, K. Johannes, V. Morin, E. Gengembre

Abstract:

Renewable energy has received more attention since the Paris Agreement against climate change was reached. Solar-based technology is considered one of the most promising green energy technologies for residential buildings, given its wide thermal use for hot water and heating. However, the seasonal mismatch between production and consumption means that buildings need an energy storage system to improve the efficiency of renewable energy use. Different kinds of energy storage systems using sensible or latent heat already exist. Considering the energy dissipation during storage and the low energy density of these two methods, thermochemical energy storage is recommended instead. Recently, ettringite (3CaO∙Al₂O₃∙3CaSO₄∙32H₂O) based materials have been reported as potential thermochemical storage materials because of their high energy density (~500 kWh/m³), low material cost (700 €/m³) and low storage temperature (~60-70°C), compared to reported salt hydrates like SrBr₂·6H₂O (42 k€/m³, ~80°C), LaCl₃·7H₂O (38 k€/m³, ~100°C) and MgSO₄·7H₂O (5 k€/m³, ~150°C). They could therefore be widely used in the building sector, coupled with conventional solar panel systems. On the other hand, the lack of extensive examination leads to poor knowledge of their thermal properties and limits the maturity of this technology. The aim of this work is to develop a modular reactor suited to thermal characterization of ettringite-based material particles of different sizes. The filled materials in the reactor can be self-compacted vertically to ensure that hot or humid air passes through homogeneously. Additionally, quick assembly and modification of the reactor, like LEGO™ plastic blocks, make it suitable for distinct thermochemical energy storage material samples of different weights (from a few grams to several kilograms). In our case, the quantities of stored and released energy, the best working conditions and even the chemical durability of ettringite-based materials have been investigated.

Keywords: dehydration, ettringite, hydration, modular reactor, thermochemical energy storage

Procedia PDF Downloads 138
36841 Test Suite Optimization Using an Effective Meta-Heuristic BAT Algorithm

Authors: Anuradha Chug, Sunali Gandhi

Abstract:

Regression testing is a very expensive and time-consuming process carried out to ensure the validity of modified software. Because there are insufficient resources to re-execute all the test cases in a time-constrained environment, efforts are being made to generate test data automatically, without human effort. Many search-based techniques have been proposed to generate efficient, effective and optimized test data so that the overall cost of software testing can be minimized. The generated test data should be able to uncover all potential lapses that exist in the software or product. Inspired by the natural foraging behavior of bats, the current study employed a meta-heuristic, search-based bat algorithm to optimize the test data on the basis of certain parameters without compromising their effectiveness. Mathematical functions are also applied that can effectively filter out redundant test data. As many as 50 Java programs were used to check the effectiveness of the proposed test data generation, and it was found that an 86% saving in testing effort can be achieved using the bat algorithm while covering 100% of the software code for testing. The bat algorithm was found to be more efficient in terms of simplicity and flexibility when the results were compared with other nature-inspired algorithms such as the Firefly Algorithm (FA), Hill Climbing (HC) and Ant Colony Optimization (ACO). The output of this study would be useful to testers, as they can achieve 100% path coverage for testing with a minimum number of test cases.
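The core loop of the bat algorithm can be sketched on a toy one-dimensional objective. This is a simplified sketch under stated assumptions: the real study's fitness is path coverage of generated test data, whereas here a simple function minimum stands in for it, and every parameter value (bat count, frequency range, loudness/pulse behavior collapsed into a fixed 0.5 walk probability) is hypothetical.

```python
import random

# Simplified bat-algorithm sketch minimizing a toy objective.
# All parameter values are hypothetical; the test-coverage fitness of
# the study is replaced by a simple function minimum.

def bat_algorithm(objective, lo, hi, n_bats=15, n_iter=200, seed=1):
    random.seed(seed)
    pos = [random.uniform(lo, hi) for _ in range(n_bats)]
    vel = [0.0] * n_bats
    best = min(pos, key=objective)
    for _ in range(n_iter):
        for i in range(n_bats):
            freq = random.uniform(0, 2)           # pulse frequency
            vel[i] += (pos[i] - best) * freq      # pull toward the best bat
            cand = max(lo, min(hi, pos[i] - vel[i]))
            # local random walk around the current best solution
            if random.random() < 0.5:
                cand = best + 0.01 * random.gauss(0, 1)
            # greedy acceptance: keep the candidate only if it is no worse
            if objective(cand) <= objective(pos[i]):
                pos[i] = cand
                if objective(cand) < objective(best):
                    best = cand
    return best

best = bat_algorithm(lambda x: (x - 3) ** 2, lo=-10, hi=10)
print(round(best, 2))
```

In the test-generation setting, a position would encode a candidate input vector and the objective would reward newly covered paths, but the move/accept structure stays the same.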

Keywords: regression testing, test case selection, test case prioritization, genetic algorithm, bat algorithm

Procedia PDF Downloads 381
36840 Relationship Between Pain Intensity at the Time of the Hamstring Muscle Injury and Hamstring Muscle Lesion Volume Measured by Magnetic Resonance Imaging

Authors: Grange Sylvain, Plancher Ronan, Reurink Guustav, Croisille Pierre, Edouard Pascal

Abstract:

The primary objective of this study was to analyze the potential correlation between the pain experienced at the time of a hamstring muscle injury and the volume of the lesion measured on MRI. The secondary objectives were to analyze a correlation between this pain and the lesion grade as well as the affected hamstring muscle. We performed a retrospective analysis of the data collected in a prospective, multicenter, non-interventional cohort study (HAMMER). Patients with suspected hamstring muscle injury had an MRI after the injury and at the same time were evaluated for their pain intensity experienced at the time of the injury with a Numerical Pain Rating Scale (NPRS) from 0 to 10. A total of 61 patients were included in the present analysis. MRIs were performed in an average of less than 8 days. There was a significant correlation between pain and the injury volume (r=0.287; p=0.025). There was no significant correlation between the pain and the lesion grade (p>0.05), nor between the pain and affected hamstring muscle (p>0.05). Pain at the time of injury appeared to be correlated with the volume of muscle affected. These results confirm the value of a clinical approach in the initial evaluation of hamstring injuries to better select patients eligible for further imaging.
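The correlation reported above is a standard Pearson coefficient between paired observations. The sketch below shows the computation on synthetic pain/volume pairs; the values are invented to illustrate the calculation, not the HAMMER cohort data (where r=0.287).

```python
import numpy as np

# Sketch: Pearson correlation between NPRS pain scores (0-10) and
# lesion volumes. The paired values below are synthetic illustration data.

pain   = np.array([2, 3, 4, 5, 6, 7, 8, 9])
volume = np.array([1.0, 3.0, 2.5, 4.0, 3.5, 6.0, 5.5, 8.0])  # cm^3

r = np.corrcoef(pain, volume)[0, 1]
print(round(r, 3))
```

A significance test on r (as in the study's p=0.025) would additionally require the sample size, e.g. via `scipy.stats.pearsonr`.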

Keywords: hamstring muscle injury, MRI, volume lesion, pain

Procedia PDF Downloads 98
36839 Analysis and Modeling of Vibratory Signals Based on LMD for Rolling Bearing Fault Diagnosis

Authors: Toufik Bensana, Slimane Mekhilef, Kamel Tadjine

Abstract:

The use of vibration analysis has been established as the most common and reliable method of analysis in the field of condition monitoring and diagnostics of rotating machinery. Rolling bearings are used in a broad range of rotary machines and play a crucial role in the modern manufacturing industry. Unfortunately, the vibration signals collected from a faulty bearing are generally non-stationary and nonlinear, with strong noise interference, so it is essential to extract the fault features correctly. In this paper, a novel numerical analysis method based on local mean decomposition (LMD) is proposed. LMD decomposes the signal into a series of product functions (PFs), each of which is the product of an envelope signal and a purely frequency modulated (FM) signal. The envelope of a PF is the instantaneous amplitude (IA), and the derivative of the unwrapped phase of the purely frequency modulated signal is the instantaneous frequency (IF). After that, the fault characteristic frequency of the roller bearing can be extracted by performing spectrum analysis on the instantaneous amplitude of the PF component containing the dominant fault information. The results show the effectiveness of the proposed technique in fault detection and diagnosis of rolling element bearings.
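The final step — spectrum analysis of an instantaneous-amplitude signal to recover the fault frequency — can be sketched as follows. This is an illustrative sketch, not the paper's LMD code: a simulated amplitude-modulated signal stands in for a PF component, the IA is obtained via an FFT-based Hilbert transform, and the fault frequency value is hypothetical.

```python
import numpy as np

# Sketch: envelope (IA) spectrum analysis on a simulated AM signal that
# stands in for an LMD product function. All signal values are synthetic.

fs = 1000.0                      # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)
fault_freq = 37.0                # hypothetical fault characteristic frequency
carrier = np.sin(2 * np.pi * 180 * t)                 # structural resonance
envelope = 1 + 0.8 * np.sin(2 * np.pi * fault_freq * t)
signal = envelope * carrier

# analytic signal via FFT-based Hilbert transform; its magnitude is the IA
spec = np.fft.fft(signal)
h = np.zeros(len(signal))
h[0] = 1
h[1:len(signal) // 2] = 2        # keep positive frequencies, doubled
ia = np.abs(np.fft.ifft(spec * h))

# spectrum of the IA: the dominant line sits at the fault frequency
amp = np.abs(np.fft.rfft(ia - ia.mean()))
freqs = np.fft.rfftfreq(len(ia), 1 / fs)
print(freqs[np.argmax(amp)])
```

In the proposed method the input to this step would be the PF with dominant fault information rather than a raw simulated signal, but the IA-spectrum readout is the same.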

Keywords: fault diagnosis, local mean decomposition, rolling element bearing, vibration analysis

Procedia PDF Downloads 408
36838 Utilizing Temporal and Frequency Features in Fault Detection of Electric Motor Bearings with Advanced Methods

Authors: Mohammad Arabi

Abstract:

The development of advanced technologies in the field of signal processing and vibration analysis has enabled more accurate analysis and fault detection in electrical systems. This research investigates the application of temporal and frequency features in detecting faults in electric motor bearings, aiming to enhance fault detection accuracy and prevent unexpected failures. The use of methods such as deep learning algorithms and neural networks in this process can yield better results. The main objective of this research is to evaluate the efficiency and accuracy of methods based on temporal and frequency features in identifying faults in electric motor bearings to prevent sudden breakdowns and operational issues. Additionally, the feasibility of using techniques such as machine learning and optimization algorithms to improve the fault detection process is also considered. This research employed an experimental method and random sampling. Vibration signals were collected from electric motors under normal and faulty conditions. After standardizing the data, temporal and frequency features were extracted. These features were then analyzed using statistical methods such as analysis of variance (ANOVA) and t-tests, as well as machine learning algorithms like artificial neural networks and support vector machines (SVM). The results showed that using temporal and frequency features significantly improves the accuracy of fault detection in electric motor bearings. ANOVA indicated significant differences between normal and faulty signals. Additionally, t-tests confirmed statistically significant differences between the features extracted from normal and faulty signals. Machine learning algorithms such as neural networks and SVM also significantly increased detection accuracy, demonstrating high effectiveness in timely and accurate fault detection. 
This study demonstrates that using temporal and frequency features combined with machine learning algorithms can serve as an effective tool for detecting faults in electric motor bearings. This approach not only enhances fault detection accuracy but also simplifies and streamlines the detection process. However, challenges such as data standardization and the cost of implementing advanced monitoring systems must also be considered. Utilizing temporal and frequency features in fault detection of electric motor bearings, along with advanced machine learning methods, offers an effective solution for preventing failures and ensuring the operational health of electric motors. Given the promising results of this research, it is recommended that this technology be more widely adopted in industrial maintenance processes.
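The temporal and frequency features discussed above can be sketched concretely. This minimal example extracts RMS and kurtosis (temporal) and a dominant spectral line (frequency) from a synthetic signal; the signal and the particular feature choices are illustrative stand-ins for the study's full feature set.

```python
import numpy as np

# Sketch: temporal (RMS, kurtosis) and frequency (dominant line) features
# from a vibration signal. The signal below is synthetic, not motor data.

def extract_features(x, fs):
    x = np.asarray(x, dtype=float)
    rms = np.sqrt(np.mean(x ** 2))                       # temporal: energy
    centered = x - x.mean()
    kurtosis = np.mean(centered ** 4) / np.var(x) ** 2   # temporal: impulsiveness
    amp = np.abs(np.fft.rfft(centered))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    peak_freq = freqs[np.argmax(amp)]                    # frequency: dominant line
    return {"rms": rms, "kurtosis": kurtosis, "peak_freq": peak_freq}

fs = 2000.0
t = np.arange(0, 1, 1 / fs)
healthy = np.sin(2 * np.pi * 50 * t)   # stand-in for a healthy-bearing signal
feats = extract_features(healthy, fs)
print(feats)
```

Feature vectors of this kind, computed for normal and faulty signals, are what the ANOVA, t-tests, and SVM/neural-network classifiers in the study operate on.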

Keywords: electric motor, fault detection, frequency features, temporal features

Procedia PDF Downloads 49
36837 Teaching during the Pandemic Using a Feminist Pedagogy: Classroom Conversations and Practices

Authors: T. Northcut, A. Rai, N. Perkins

Abstract:

Background: The COVID-19 pandemic has had a serious impact on academia in general and social work education in particular, permanently changing the way in which we approach educating students. The new reality of the pandemic, coupled with the much-needed focus on racism across the country, inspired and required educators to get creative with their teaching styles in order to disrupt the power imbalance in the classroom and attend to the multiple layers of needs of diverse students in precarious sociological and economic circumstances. This paper highlights research examining educators with distinctive positionalities and approaches to classroom instruction who used feminist and antiracist pedagogies while adapting to online teaching during the pandemic. Although our ideologies as feminist scholars developed during different waves of feminism, our commitment to student-led classrooms, to liberation and equity for all, and to striving for social change unified our feminist teaching pedagogies and provided interpersonal support. Methodology: Following a narrative qualitative inquiry methodology, the five authors of this paper came together to discuss our pedagogical styles and underlying values over Zoom in a series of six conversations. Narrative inquiry is an appropriate method to use when researchers are bound by common stories or personal experiences. The use of feminist pedagogy in the classroom before and during the pandemic guided the discussions. After six sessions, we reached the point of data saturation. All data from the dialogic process were recorded and transcribed. We used in vivo, narrative, and descriptive coding for the data analytic process.
Results: Analysis of the data revealed several themes, which included (1) the influence of our positionalities as an intersection of race, sexual orientation, gender, and years of teaching experience in the classroom, (2) the meaning and variations between different liberatory pedagogical approaches, (3) the tensions between these approaches and institutional policies and practices, (4) the role of self-reflection in everyday teaching, (5) the distinctions between theory and practice and its utility for students, and (6) the challenges of applying a feminist-centered pedagogical approach during the pandemic while utilizing an online platform. As a collective, we discussed several challenges that limited the use of our feminist pedagogical approaches due to instruction through Zoom.

Keywords: feminist, pedagogy, COVID, zoom

Procedia PDF Downloads 44
36836 Hybrid Algorithm for Non-Negative Matrix Factorization Based on Symmetric Kullback-Leibler Divergence for Signal Dependent Noise: A Case Study

Authors: Ana Serafimovic, Karthik Devarajan

Abstract:

Non-negative matrix factorization approximates a high-dimensional non-negative matrix V as the product of two non-negative matrices, W and H, and allows only additive linear combinations of data, enabling it to learn parts-based representations of the data. It has been successfully applied in the analysis and interpretation of high-dimensional data arising in neuroscience, computational biology, and natural language processing, to name a few. The objective of this paper is to assess a hybrid algorithm for non-negative matrix factorization with multiplicative updates. The method aims to minimize the symmetric version of the Kullback-Leibler divergence known as intrinsic information and assumes that the noise is signal-dependent and originates from an arbitrary distribution in the exponential family. It is a generalization of currently available algorithms for Gaussian, Poisson, gamma and inverse Gaussian noise. We demonstrate the potential usefulness of the new generalized algorithm by comparing its performance to baseline methods which also aim to minimize symmetric divergence measures.
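For context, the classical multiplicative updates for NMF under the (generalized) Kullback-Leibler divergence — the baseline family this paper generalizes — look as follows. This is a sketch of the standard Lee-Seung scheme, not the authors' symmetric-divergence hybrid algorithm; the matrix, rank, and iteration count are illustrative.

```python
import numpy as np

# Sketch: Lee-Seung multiplicative updates for NMF under the generalized
# Kullback-Leibler divergence (a baseline, not the paper's algorithm).

def nmf_kl(V, rank, n_iter=500, seed=0):
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + 0.1
    H = rng.random((rank, m)) + 0.1
    for _ in range(n_iter):
        WH = W @ H + 1e-12                     # avoid division by zero
        # H update: H *= (W^T (V / WH)) / (W^T 1)
        H *= (W.T @ (V / WH)) / W.sum(axis=0)[:, None]
        WH = W @ H + 1e-12
        # W update: W *= ((V / WH) H^T) / (1 H^T)
        W *= ((V / WH) @ H.T) / H.sum(axis=1)[None, :]
    return W, H

# synthetic non-negative matrix of exact rank 2
V = np.array([[1.0, 2.0, 3.0], [2.0, 4.0, 6.0], [1.0, 1.0, 1.0]])
W, H = nmf_kl(V, rank=2)
print(np.round(W @ H, 2))
```

The multiplicative form guarantees W and H stay non-negative given positive initialization; the paper's hybrid algorithm replaces the KL objective here with the symmetric (intrinsic-information) divergence for exponential-family, signal-dependent noise.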

Keywords: non-negative matrix factorization, dimension reduction, clustering, intrinsic information, symmetric information divergence, signal-dependent noise, exponential family, generalized Kullback-Leibler divergence, dual divergence

Procedia PDF Downloads 246
36835 Comparative Spatial Analysis of a Re-Arranged Hospital Building

Authors: Burak Köken, Hatice D. Arslan, Bilgehan Y. Çakmak

Abstract:

Analyzing the relation networks within hospital buildings, which have a complex structure and distinctive spatial relationships, is quite difficult. Hospital buildings, which require specialized spatial relationship solutions during design and continual self-renewal as technology develops, should survive and keep providing service even after disasters such as earthquakes. This study discusses a hospital building in which the load-bearing system was strengthened because of insufficient earthquake performance and an additional building was constructed to meet the increasing need for space, and makes a comparative spatial evaluation of the hospital building before and after the change. To this end, the spatial organizations of the building before and after the change were analyzed by means of the Space Syntax method, and the effects of the change on space organization parameters were examined through an analytical procedure. Using Depthmap UCL software, connectivity, visual mean depth, beta and visual integration analyses were conducted. The data obtained from the analyses showed that the relationships between spaces increased after the change and that the building has become more explicit and understandable for the occupants. Furthermore, the findings indicate that increased depth makes spaces harder to perceive, and that changes addressing this problem generally ease spatial use.

Keywords: architecture, hospital building, space syntax, strengthening

Procedia PDF Downloads 521
36834 Health Information Needs and Utilization of Information and Communication Technologies by Medical Professionals in a Northern City of India

Authors: Sonika Raj, Amarjeet Singh, Vijay Lakshmi Sharma

Abstract:

Introduction: In 21st century, due to revolution in Information and Communication Technologies (ICTs), there has been phenomenal development in quality and quantity of knowledge in the field of medical science. So, the access to relevant information to physicians is critical to the delivery of effective healthcare services to patients. The study was conducted to assess the information needs and attitudes of the medical professionals; to determine the sources and channels of information used by them; to ascertain the current usage of ICTs and the barriers faced by them in utilization of ICTs in health information access. Methodology: This descriptive cross-sectional study was carried in 2015 on hundred medical professionals working in public and private sectors of Chandigarh. The study used both quantitative and qualitative method for data collection. A semi structured questionnaire and interview schedule was used to collect data on information seeking needs, access to ICTs and barriers to healthcare information access. Five Data analysis was done using SPSS-16 and qualitative data was analyzed using thematic approach. Results: The most preferred sources to access healthcare information were internet (85%), trainings (61%) and communication with colleagues (57%). They wanted information on new drug therapy and latest developments in respective fields. All had access to computer with but almost half assessed their computer knowledge as average and only 3% had received training regarding usage. Educational status (p=0.004), place of work (p=0.004), number of years in job (p=0.004) and sector of job (p=0.04) of doctors were found to be significantly associated with their active search for information. 
The major themes that emerged from the interviews were: the need for, types of, and sources of healthcare information; exchange of information among different levels of healthcare providers; usage of ICTs to obtain and share information; barriers to accessing healthcare information; and the quality of health information materials and involvement in their development process. Conclusion and Recommendations: Medical professionals need information in the course of their work, but their information needs were not being adequately met. There should be training of professionals in internet skills, and a course on bioinformatics should be incorporated into the curricula of medical students. A policy framework must be formulated to encourage and promote the use of ICTs as tools for health information access and dissemination.
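The associations reported above (e.g., educational status, p = 0.004) are of the kind typically assessed with a chi-square test of independence on a contingency table. The sketch below uses invented counts, not the study's data, purely to illustrate the test:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows = place of work (public / private),
# columns = actively searches for health information (yes / no).
table = [[30, 10],
         [20, 40]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")  # small p -> association
```

A p-value below the chosen threshold (here, 0.05) indicates that search behavior is not independent of the row variable, which is how the significant associations in the Results would be read.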

Keywords: health information, ICTs, medical professionals, qualitative

Procedia PDF Downloads 349
36833 Gait Analysis in Total Knee Arthroplasty

Authors: Neeraj Vij, Christian Leber, Kenneth Schmidt

Abstract:

Introduction: Total knee arthroplasty is a common procedure. It is well known that the biomechanics of the knee do not fully return to their normal state. Motion analysis has been used to study the biomechanics of the knee after total knee arthroplasty. The purpose of this scoping review is to summarize the current use of gait analysis in total knee arthroplasty and to identify the preoperative motion analysis parameters for which a systematic review aimed at determining reliability and validity may be warranted. Materials and Methods: This IRB-exempt scoping review strictly followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) checklist. Five search engines were searched for a total of 279 articles, which underwent title and abstract screening followed by full-text screening. Included articles were placed in the following sections: the role of gait analysis as a research tool for operative decisions, other research applications for motion analysis in total knee arthroplasty, gait analysis as a tool in predicting radiologic outcomes, and gait analysis as a tool in predicting clinical outcomes. Results: Eleven articles studied gait analysis as a research tool in studying operative decisions. Motion analysis is currently used to study surgical approaches, surgical techniques, and implant choice. Five articles studied other research applications for motion analysis in total knee arthroplasty; these currently include studying the role of unicompartmental knee arthroplasty and novel physical therapy protocols aimed at optimizing post-operative care. Two articles studied motion analysis as a tool for predicting radiographic outcomes; preoperative gait analysis has identified parameters that can predict postoperative tibial component migration. Fifteen articles studied motion analysis in conjunction with clinical scores. 
Conclusions: There is a broad range of applications within the research domain of total knee arthroplasty, and the range of potential applications is likely broader still. However, the current literature is limited by vague definitions of ‘gait analysis’ or ‘motion analysis’ and by the limited number of articles reporting both preoperative and postoperative functional and clinical measures. Knee adduction moment, knee adduction impulse, total knee range of motion, varus angle, cadence, stride length, and velocity have the potential for integration into composite clinical scores. A systematic review aimed at determining the validity, reliability, sensitivities, and specificities of these variables is warranted.

Keywords: motion analysis, joint replacement, patient-reported outcomes, knee surgery

Procedia PDF Downloads 94
36832 The Role of Gender in Influencing Public Speaking Anxiety

Authors: Fadil Elmenfi, Ahmed Gaibani

Abstract:

This study investigates the role of gender in influencing public speaking anxiety. A questionnaire survey was administered to the study sample, and correlation and descriptive analysis techniques were applied to the collected data to determine the relationship between gender and public speaking anxiety. This study could serve as a guide to identifying the effects of gender differences on public speaking anxiety and provide advice on how to cope with or overcome it.

Keywords: across culture, communication, English language competence, gender, postgraduate students, speaking anxiety

Procedia PDF Downloads 561
36831 An Unbiased Profiling of Immune Repertoire via Sequencing and Analyzing T-Cell Receptor Genes

Authors: Yi-Lin Chen, Sheng-Jou Hung, Tsunglin Liu

Abstract:

The adaptive immune system recognizes a wide range of antigens by expressing a large number of structurally distinct T cell and B cell receptor genes. These distinct receptor genes arise from complex rearrangements called V(D)J recombination and constitute the immune repertoire. A common method of profiling the immune repertoire is to amplify recombined receptor genes using multiple primers followed by high-throughput sequencing. This multiplex-PCR approach is efficient; however, the resulting repertoire can be distorted by primer bias. To eliminate primer bias, 5’ RACE is an alternative amplification approach. However, the application of the RACE approach is limited by its low efficiency (i.e., the majority of the data are non-regular receptor sequences, e.g., containing intronic segments) and by the lack of a convenient analysis tool. We propose a computational tool that can correctly identify non-regular receptor sequences in RACE data by aligning receptor sequences against the whole gene instead of only the exon regions, as done in all other tools. Using our tool, the remaining regular data allow for accurate profiling of the immune repertoire. In addition, a RACE approach is improved to yield a higher fraction of regular T-cell receptor sequences. Finally, we quantify the degree of primer bias of a multiplex-PCR approach by comparing it to the RACE approach. The results reveal significant differences in the frequency of VJ combinations between the two approaches. Together, we provide a new experimental and computational pipeline for unbiased profiling of the immune repertoire. As immune repertoire profiling has many applications, e.g., tracing bacterial and viral infection, detecting T cell lymphoma and minimal residual disease, and monitoring cancer immunotherapy, our work should benefit scientists interested in these applications.

Keywords: immune repertoire, T-cell receptor, 5' RACE, high-throughput sequencing, sequence alignment

Procedia PDF Downloads 194
36830 Crime Prevention with Artificial Intelligence

Authors: Mehrnoosh Abouzari, Shahrokh Sahraei

Abstract:

Today, with the increase in the quantity, severity, and variety of crimes, crime prevention faces a serious challenge: human resources alone, working with traditional methods, are no longer effective. One of the developments of the modern world is the presence of artificial intelligence in various fields, including criminal law; indeed, the use of artificial intelligence in criminal investigations and the fight against crime is a necessity in today's world. Its role goes beyond, and is even separate from, that of other technologies in the struggle against crime, and its application in criminal science extends past prevention into the prediction of crime. Crime prevention is framed in terms of three factors (the offender, the victim, and the setting of the crime): changing the conditions of these factors, on the assumption that the offender acts rationally, increases the cost and risk of crime so that the offender desists from delinquency, makes the victim aware of self-care and of the dangers of exposure, or makes crimes more difficult to commit. Artificial intelligence in the field of combating crime and social harm acts like an all-seeing eye: regardless of time and place, it looks to the future and predicts the occurrence of a possible crime, and thus prevents crimes from occurring. The purpose of this article is to collect and analyze the studies conducted on the use of artificial intelligence in predicting and preventing crime: how capable is this technology of predicting and preventing crime? The results show that the artificial intelligence technologies in use are capable of predicting and preventing crime and can find patterns in large data sets far more efficiently than humans. 
In crime prediction and prevention, the term artificial intelligence refers to the increasing use of technologies that apply algorithms to large sets of data to assist or replace police. In our discussion, artificial intelligence is used for predicting and preventing crime, including predicting the time and place of future criminal activities, effectively identifying patterns, and accurately forecasting future behavior through data mining, machine learning, deep learning, data analysis, and neural networks. Because the knowledge of criminologists can provide insight into risk factors for criminal behavior, among other issues, computer scientists can match this knowledge with the datasets that artificial intelligence draws on.

Keywords: artificial intelligence, criminology, crime, prevention, prediction

Procedia PDF Downloads 76
36829 The Effectiveness of Sleep Behavioral Interventions during the Third Trimester of Pregnancy on Sleep Quality and Postpartum Depression in a Randomized Clinical Controlled Trial

Authors: Somaye Ghafarpour, Kamran Yazdanbakhsh, Mohamad Reza Zarbakhsh, Simin Hosseinian, Samira Ghafarpour

Abstract:

Unsatisfactory sleep quality is one of the most common complications of pregnancy and can predispose mothers to postpartum depression, requiring effective psychological interventions to prevent and modify behaviors that accentuate sleep problems. This study was a randomized clinical controlled trial with a pre-test/post-test design aiming to investigate the effectiveness of sleep behavioral interventions during the third trimester of pregnancy on sleep quality and postpartum depression. A total of 50 pregnant mothers at 26-30 weeks of pregnancy suffering from sleep problems (based on Pittsburgh Sleep Quality Index scores) were randomized into two groups (control and intervention, n=25 per group). The data were collected using interviews, the Pittsburgh Sleep Quality Index (PSQI), and the Edinburgh Postnatal Depression Scale (EPDS). The participants in the intervention group received eight 60-minute sessions of combined training in behavioral therapy techniques. At the end of the intervention and four weeks after delivery, sleep quality and postpartum depression were evaluated. As the Kolmogorov-Smirnov test confirmed the normal distribution of the data, the independent t-test and analysis of covariance were used to analyze the data, showing that the behavioral interventions were effective on overall sleep quality after delivery (p=0.001); however, no statistically significant effects were observed on postpartum depression or on the sub-scales of sleep disorders and daily functioning (p>0.05). Considering the potential effectiveness of behavioral interventions in improving sleep quality and alleviating insomnia symptoms, it is recommended to implement such measures as an effective intervention to prevent or treat these problems during the prenatal and postnatal periods.
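The between-group comparison described here (an independent t-test on post-test sleep-quality scores) can be outlined as follows; the PSQI-like scores below are simulated for illustration, not the trial's data:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
# Simulated post-test global sleep-quality scores (lower = better sleep),
# 25 participants per arm, matching the trial's group sizes.
intervention = rng.normal(6.0, 2.0, 25)
control = rng.normal(9.0, 2.0, 25)

t, p = ttest_ind(intervention, control)
print(f"t = {t:.2f}, p = {p:.4f}")  # small p -> group means differ post-test
```

In the actual trial, analysis of covariance would additionally adjust the post-test comparison for the pre-test scores; the t-test shown is only the simpler of the two reported analyses.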

Keywords: behavioral interventions, sleep quality, postpartum depression, pregnancy, delivery

Procedia PDF Downloads 70
36828 A Study of Student Satisfaction of the Suan Sunandha Rajabhat University Radio Station

Authors: Prapoj Na Bangchang

Abstract:

The research aimed to study the satisfaction of Suan Sunandha Rajabhat University students with the university radio station, which broadcasts both in analog on FM 97.25 MHz and online via the university website. The sample consisted of 200 undergraduate students, years 1 to 4, from six faculties, i.e., the Faculty of Education, the Faculty of Humanities and Social Sciences, the Faculty of Management Science, the Faculty of Industrial Technology, and the Faculty of Fine and Applied Arts. The data collection tool was a survey, and data analysis applied descriptive statistics: percentage, mean, and standard deviation. The results showed that students were most satisfied with the listening venue, followed by the broadcast channels, which cover both the analog signal on FM 97.25 MHz and online streaming via the Internet. However, satisfaction with the content offered was very low; most students want the station to improve its content, with entertainment content requested most, followed by sports content. Satisfaction was lowest with the broadcasting quality of the analog signal, which most students also asked the station to improve. Overall, however, Suan Sunandha Rajabhat University students were satisfied with the university radio station broadcast online via the university website.

Keywords: satisfaction, students, radio station, Suan Sunandha Rajabhat University

Procedia PDF Downloads 272
36827 Extreme Value Theory Applied in Reliability Analysis: Case Study of Diesel Generator Fans

Authors: Jelena Vucicevic

Abstract:

Reliability analysis represents a very important task in many areas of work. In any industry it is crucial for maintenance, efficiency, safety, and monetary costs. There are established ways to calculate reliability, unreliability, failure density, and failure rate. In this paper, the reliability of diesel generator fans was calculated through Extreme Value Theory. Extreme Value Theory is not widely used in the engineering field; its usage is better known in other areas such as hydrology, meteorology, and finance. The significance of this theory lies in the fact that, unlike other statistical methods, it focuses on rare and extreme values rather than averages. It should be noted that the theory is not designed exclusively for extreme events, but for extreme values in any event, so this is a good opportunity to test whether it can be applied in this situation. The contribution of the work is the calculation of time to failure, or reliability, in a new way, using statistics. Another advantage of this calculation is that no technical details are needed: it can be applied to any component whose time to failure we need to know in order to schedule appropriate maintenance, maximize usage, and minimize costs. In this case, calculations have been made on diesel generator fans, but the same principle can be applied to any other part. The data for this paper came from a field engineering study of the time to failure of diesel generator fans, whose ultimate goal was to decide whether or not to replace the working fans with higher quality fans to prevent future failures. The results achieved with this method show the approximate time for which the fans will work as they should, and the probability that the fans will operate beyond a given estimated time.
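One common way to operationalize this kind of analysis is to fit an extreme-value-family distribution, such as the Weibull, to the time-to-failure data and read reliability off its survival function. The sketch below runs on synthetic lifetimes; the distribution choice and parameter values are assumptions for illustration, not the paper's actual fit:

```python
from scipy.stats import weibull_min

# Synthetic times to failure (hours); the real study used field data.
failures = weibull_min.rvs(c=1.5, scale=8000, size=70, random_state=42)

# Fit a two-parameter Weibull (location fixed at zero).
shape, loc, scale = weibull_min.fit(failures, floc=0)

# Reliability R(t) = P(T > t), the survival function of the fitted model.
t = 5000.0
reliability = weibull_min.sf(t, shape, loc, scale)
print(f"shape = {shape:.2f}, scale = {scale:.0f} h, R({t:.0f} h) = {reliability:.2f}")
```

The survival function at a given time is the probability that a fan outlasts that time, which is exactly the quantity the abstract describes estimating.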

Keywords: extreme value theory, lifetime, reliability analysis, statistic, time to failure

Procedia PDF Downloads 328
36826 A Critical Discourse Analysis of Citizenship Education Textbook for Primary School Students in Singapore

Authors: Ren Boyuan

Abstract:

This study focuses on how the Character and Citizenship Education (CCE) textbook in Singapore primary schools delivers preferred and desired qualities to students, and thereby reveals how textbook discourse can facilitate and perpetuate certain social practices. In this way, the study also encourages critical thinking among textbook writers and school educators by unveiling the nuanced messages, conveyed through language use, that perpetuate social practices in a society. In Singapore, Character and Citizenship Education is a compulsory subject for primary school students. Under the framework of 21st Century Competencies, CCE in Singapore aims to help students thrive in a fast-changing world. The Singapore government is involved in developing the CCE curriculum in schools from primary school to pre-university, so inevitably the curriculum is not free from ideological influence. This qualitative study utilizes Fairclough’s three-dimensional theory and his framework of three assumptions to analyze the CCE textbook for Primary 1 and to reveal the ideologies in it. The data for analysis are the textual parts of the whole Primary 1 textbook, chosen because the book is used at the beginning of citizenship education in primary school and thus promotes CCE messages in the foundation years of a child's education. The findings show that the four revealed ideologies, namely pragmatism, communitarianism, nationalism, and multiculturalism, are not only rooted in national history but are also updated and explained by current demands for Singapore's sustained prosperity. The study ends with a discussion of its implications. 
By pointing out the ideologies in this textbook and how they are embedded in its discourse, this study may help teachers and textbook writers recognize the political involvement in the book and thereby develop awareness of the implicit influence of lexical choice on their teaching and writing. In addition, by exploring the ideologies in this book and comparing them with those in past textbooks, this study informs researchers in this area about how language influences readers and reflects certain social demands.

Keywords: citizenship education, critical discourse analysis, sociolinguistics, textbook analysis

Procedia PDF Downloads 64
36825 Machine Learning Approaches Based on Recency, Frequency, Monetary (RFM) and K-Means for Predicting Electrical Failures and Voltage Reliability in Smart Cities

Authors: Panaya Sudta, Wanchalerm Patanacharoenwong, Prachya Bumrungkun

Abstract:

With the evolution of smart grids, ensuring the reliability and efficiency of electrical systems in smart cities has become crucial. This paper proposes a distinct approach that combines advanced machine learning techniques to accurately predict electrical failures and address voltage reliability issues, aiming to improve the accuracy and efficiency of reliability evaluations in smart cities. The aim of this research is to develop a comprehensive predictive model that accurately predicts electrical failures and voltage reliability in smart cities. This model integrates RFM analysis, K-means clustering, and LSTM networks to achieve this objective. The research adapts RFM analysis, traditionally used in customer value assessment, to categorize and analyze electrical components based on their failure recency, frequency, and monetary impact. K-means clustering is employed to segment electrical components into distinct groups with similar characteristics and failure patterns. LSTM networks are used to capture the temporal dependencies and patterns in historical failure data. This integration of RFM, K-means, and LSTM results in a robust predictive tool for electrical failures and voltage reliability. The proposed model has been tested and validated on diverse electrical utility datasets. The results show a significant improvement in prediction accuracy and reliability compared to traditional methods, achieving an accuracy of 92.78% and an F1-score of 0.83. This research contributes to the proactive maintenance and optimization of electrical infrastructures in smart cities. It also enhances overall energy management and sustainability. The integration of advanced machine learning techniques in the predictive model demonstrates the potential for transforming the landscape of electrical system management within smart cities. The research utilizes diverse electrical utility datasets to develop and validate the predictive model. 
RFM analysis, K-means clustering, and LSTM networks are applied to these datasets to analyze and predict electrical failures and voltage reliability. The research addresses the question of how accurately electrical failures and voltage reliability can be predicted in smart cities. It also investigates the effectiveness of integrating RFM analysis, K-means clustering, and LSTM networks in achieving this goal. The proposed approach presents a distinct, efficient, and effective solution for predicting and mitigating electrical failures and voltage issues in smart cities. It significantly improves prediction accuracy and reliability compared to traditional methods. This advancement contributes to the proactive maintenance and optimization of electrical infrastructures, overall energy management, and sustainability in smart cities.
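The RFM-plus-K-means segmentation step can be sketched as follows (the LSTM stage is omitted). The three RFM features and the two-group structure below are invented for illustration, and the k-means loop is hand-rolled to keep the sketch self-contained:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical RFM features per electrical component:
# [recency of last failure (days), failure count, monetary impact ($k)]
low_risk = rng.normal([300.0, 2.0, 5.0], [30.0, 1.0, 2.0], (40, 3))
high_risk = rng.normal([20.0, 15.0, 60.0], [10.0, 3.0, 10.0], (40, 3))
X = np.vstack([low_risk, high_risk])

# Standardize the features, then run a tiny k-means with k = 2.
X = (X - X.mean(axis=0)) / X.std(axis=0)
centroids = X[[0, -1]]  # seed with one point from each end of the data
for _ in range(20):
    labels = ((X[:, None, :] - centroids) ** 2).sum(axis=2).argmin(axis=1)
    centroids = np.array([X[labels == k].mean(axis=0) for k in range(2)])

print(np.bincount(labels))  # component counts per risk segment
```

Each resulting segment groups components with similar failure recency, frequency, and monetary impact; in the full pipeline, the temporal modeling of each segment would then be handled by the LSTM networks.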

Keywords: electrical state prediction, smart grids, data-driven method, long short-term memory, RFM, k-means, machine learning

Procedia PDF Downloads 56
36824 Automatic Fluid-Structure Interaction Modeling and Analysis of Butterfly Valve Using Python Script

Authors: N. Guru Prasath, Sangjin Ma, Chang-Wan Kim

Abstract:

A butterfly valve is a quarter-turn valve used to control the flow of a fluid through a section of pipe. Butterfly valves are used in a wide range of applications such as water distribution, sewage, and oil and gas plants; in particular, butterfly valves with larger diameters find immense application in hydropower plants for controlling fluid flow. Given the cost and size constraints of laboratory setups, large-diameter valves are mostly studied by computational methods, which offer an effective and inexpensive solution. CFD and FEM software are used to perform the large-scale fluid and structural valve analyses, respectively. To perform such analyses on a butterfly valve, the CAD model has to be recreated and meshed in conventional software for each set of valve dimensions, which is a time-consuming process. To overcome this limitation, a Python script was created that completes the pre-processing setup automatically in the Salome software. Specifying the model dimensions directly in the Python code makes the setup comparatively faster and the valve analysis easier to perform. Hence, in this paper, an attempt is made to study the fluid-structure interaction (FSI) of butterfly valves by varying the valve angles and dimensions using Python code in the pre-processing software, and results are presented.
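The parameter-driven automation idea can be sketched generically. In the sketch below, `build_case` is a hypothetical stand-in for the Salome geometry and meshing calls (which are not shown), and the angle and diameter values are illustrative rather than the paper's actual design space:

```python
from itertools import product

# Illustrative design space; the paper's actual angles/diameters may differ.
valve_angles_deg = [10, 30, 50, 70, 90]
diameters_mm = [500, 1000, 1500]

def build_case(angle, diameter):
    """Hypothetical stand-in for the Salome pre-processing calls.

    In the real script, these parameters would drive geometry creation
    and meshing through the Salome API; here we only record the setup."""
    return {"angle_deg": angle,
            "diameter_mm": diameter,
            "mesh_file": f"valve_d{diameter}_a{angle}.med"}

cases = [build_case(a, d) for a, d in product(valve_angles_deg, diameters_mm)]
print(len(cases))  # one pre-processing setup per parameter combination
```

The point of the design is that adding a new angle or diameter to the lists regenerates every case automatically, with no manual CAD rework per configuration.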

Keywords: butterfly valve, flow coefficient, automatic CFD analysis, FSI analysis

Procedia PDF Downloads 241
36823 The Youth Employment Peculiarities in Post-Soviet Georgia

Authors: M. Lobzhanidze, N. Damenia

Abstract:

The article analyzes the current structural changes in the economy of Georgia and its liberalization and integration processes, and, in light of this analysis, reveals the peculiarities and problems of youth employment. The paper studies the Georgian labor market and its contradictions. Based on the analysis of materials, the socio-economic losses caused by long-term and mass unemployment of young people are revealed, and the objective and subjective circumstances of obtaining higher education are studied. Youth employment and unemployment rates are analyzed, and the factors that increase unemployment are identified. The analysis of youth employment shows that the unemployment share among the economically active population has increased in the younger age group, which demonstrates the labor market's high requirements for workforce quality. It is also highlighted that young people tend to seek highly paid jobs. The following research methods are applied: statistical methods (selection, grouping, observation, trend analysis, etc.) and qualitative research (in-depth interviews), as well as analysis, induction, and comparison. The article draws on data from the National Statistics Office of Georgia and the Ministry of Agriculture of Georgia, policy documents of the Parliament of Georgia, scientific papers by Georgian and foreign scientists, analytical reports, publications, and EU research materials on similar issues. The work assesses the employment problems of students and graduates in relation to the state development strategy and its priorities, and defines measures to overcome the challenges. The article describes the mechanisms of state regulation of youth employment and ways of improving this regulatory base. 
As for the major findings, the main problems are a lack of experience and a mismatch between young people's qualifications and the requirements of the labor market. Accordingly, it is concluded that the unemployment rate of young people in Georgia is increasing.

Keywords: migration of youth, youth employment, migration management, youth employment and unemployment

Procedia PDF Downloads 148
36822 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

Modern experiments in high energy physics impose great demands on the reliability, the efficiency, and the data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments and provides a network-transparent inter-process communication layer. Built on the high-performance, modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes of the iFDAQ during the 2016 run; from the software point of view, this can be considered a significant improvement of the iFDAQ over the previous run. To extend the debugging possibilities, online monitoring of inter-process communication via the DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in detail. Moreover, an efficiency measurement and a comparison with the DIM library with respect to the iFDAQ requirements are provided.

Keywords: data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP

Procedia PDF Downloads 317
36821 Authenticity from the Perspective of Locals: What Prince Edward Islanders Had to Say about Authentic Tourism Experiences

Authors: Susan C. Graham

Abstract:

Authenticity has grown to be ubiquitous within the tourism vernacular. Yet agreement on what authenticity means in relation to tourism remains nebulous. In its simplest form, authenticity in tourism refers to products and experiences that provide insights into the social, cultural, economic, natural, historical, and political life of a place. But this definition is unwieldy in its scope and may not help industry leaders or tourists identify what is authentic. Much of what is projected as authentic is a carefully curated and crafted message developed by marketers to appeal to visitors, and it bears little resemblance to the everyday lives of locals. So perhaps one way to identify authentic tourism experiences is to ask locals themselves. The purpose of this study was to explore the perspectives of locals on what constitutes an authentic tourism experience in Prince Edward Island (PEI), Canada. Over 600 volunteers in a tourism research panel were sent a survey asking them to describe authentic PEI experiences within ten sub-categories relevant to the local tourism industry. To make participation more manageable, each respondent was asked about three of the ten sub-categories. Over 400 individuals responded, providing 1,391 unique responses. The responses were grouped thematically using interpretive phenomenological analysis, whereby participants' responses were clustered into higher-order groups to extract meaning. Two interesting thematic observations emerged: first, respondents clearly articulated and differentiated between intrapersonal and interpersonal ways of authentically experiencing PEI; and second, while respondents explicitly valued unstaged experiences over staged ones, several exceptions to this general rule were expressed. 
Responses could clearly be grouped into those that emphasized “going off the beaten path,” “exploring pristine and untouched corners,” the “lesser known” and the “hidden,” “going solo,” and taking the opportunity to “slow down.” Each of these responses was “self”-centered, focused on the visitor discovering and exploring in search of greater self-awareness and inner peace. In contrast, other responses framed interaction with locals as the means of authentically experiencing the place. Respondents cited “going deep-sea fishing” to learn about local fishers and their communities, stopping by “local farm stands” and speaking with farmers who have “worked the land for generations,” patronizing “local restaurants, pubs, and B&Bs,” and partaking in performances or exhibits by local artists. These kinds of experiences, the respondents claimed, provide an authentic glimpse into a place's character. The second set of observations focused on the distinction between staged and unstaged experiences, with respondents overwhelmingly advocating for the unstaged. Responses were clear in shunning “touristy,” “packaged,” and “fake” offerings as inauthentic misrepresentations of the place as locals view it. Yet many respondents made exceptions for certain “staged” experiences, including (quite literally) the stage production of Anne of Green Gables based on the novel of the same name, the theatrical re-enactment of the founding of Canada, and visits to PEI's many provincial and national parks, all of which respondents considered both staged and authentic at the same time.

Keywords: authentic, local, Prince Edward Island, tourism

Procedia PDF Downloads 267
36820 StockTwits Sentiment Analysis on Stock Price Prediction

Authors: Min Chen, Rubi Gupta

Abstract:

Understanding and predicting stock market movements is a challenging problem. Stock markets are believed to be partially driven by public sentiment, which has led to numerous research efforts to predict market trends using sentiments expressed on social media such as Twitter, but with limited success. Recently, the microblogging website StockTwits has become increasingly popular among users for sharing discussions and sentiments about stocks and the financial market. In this project, we analyze the text content of StockTwits tweets and extract financial sentiment using text featurization and machine learning algorithms. StockTwits tweets are first pre-processed using techniques including stopword removal, special character removal, and case normalization to remove noise. Features are extracted from these preprocessed tweets through a text featurization process using bags of words, N-gram models, TF-IDF (term frequency-inverse document frequency), and latent semantic analysis. Machine learning models are then trained to classify each tweet's sentiment as positive (bullish) or negative (bearish). The correlation between aggregated daily sentiment and daily stock price movement is then investigated using Pearson's correlation coefficient. Finally, the sentiment information is applied together with time-series stock data to predict stock price movement. Experiments on five companies (Apple, Amazon, General Electric, Microsoft, and Target) over a duration of nine months demonstrate the effectiveness of our approach in improving prediction accuracy.
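The correlation step, Pearson's coefficient between aggregated daily sentiment and daily price movement, might look like this on invented data (the 180-day span, the sentiment scale, and the coupling strength 0.5 are all assumptions for illustration):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
# Invented daily aggregated sentiment (mean tweet score in [-1, 1]) and
# daily price returns weakly driven by sentiment plus noise.
sentiment = rng.uniform(-1.0, 1.0, 180)
returns = 0.5 * sentiment + rng.normal(0.0, 0.4, 180)

r, p = pearsonr(sentiment, returns)
print(f"r = {r:.2f}, p = {p:.2g}")
```

A significant positive coefficient of this kind is what would motivate feeding the sentiment series into the price prediction model alongside the time-series stock data.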

Keywords: machine learning, sentiment analysis, stock price prediction, tweet processing

Procedia PDF Downloads 156