Search results for: gradient boosting machine
1686 Interpretation of the Russia-Ukraine 2022 War via N-Gram Analysis
Authors: Elcin Timur Cakmak, Ayse Oguzlar
Abstract:
This study presents an analysis of tweets sent by Twitter users about the Russia-Ukraine war using bigram and trigram methods. On February 24, 2022, Russian President Vladimir Putin declared a military operation against Ukraine, and all eyes turned to this war. Many people living in Russia and Ukraine reacted to the war with protests and expressed deep concern, feeling that the safety of their families and their futures were at stake. Most people, especially those living in Russia and Ukraine, express their views on the war in different ways, and the most popular way to do so is through social media. Many prefer to convey their feelings using Twitter, one of the most frequently used social media tools. Since the beginning of the war, thousands of tweets about it have been posted from many countries of the world. These tweets were extracted from data sources through the Twitter API and analysed with the Python programming language. The aim of the study is to find the word sequences in these tweets by the n-gram method, which is widely used in computational linguistics and natural language processing. The tweet language used in the study is English. The data set consists of data obtained from Twitter between February 24, 2022, and April 24, 2022. Tweets carrying the #ukraine, #russia, #war, #putin, and #zelensky hashtags together were captured as raw data and included in the analysis stage after being cleaned in the preprocessing stage. In the data analysis part, sentiments are identified to characterise what people post about the war on Twitter; negative messages make up the majority of all tweets, at a ratio of 63.6%. Furthermore, the most frequently used bigram and trigram word groups are found.
Regarding the results, the most frequently used word groups are “he, is”, “I, do”, “I, am” for bigrams, and “I, do, not”, “I, am, not”, “I, can, not” for trigrams. In the machine learning phase, classification accuracy is measured with the Classification and Regression Trees (CART) and Naïve Bayes (NB) algorithms, applied separately to bigrams and trigrams. For bigrams, the NB algorithm gave the highest accuracy and F-measure values, while the CART algorithm gave the highest precision and recall. For trigrams, the CART algorithm achieved the highest accuracy, precision, and F-measure, and NB the highest recall.
Keywords: classification algorithms, machine learning, sentiment analysis, Twitter
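The bigram/trigram counting this abstract describes can be sketched in a few lines of Python. The toy tweets below are hypothetical stand-ins for the preprocessed corpus, not data from the study:

```python
from collections import Counter

def ngrams(tokens, n):
    """Return the list of n-grams (as tuples) from a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# Toy corpus standing in for the preprocessed tweets (hypothetical examples).
tweets = [
    "i do not want war",
    "i am not happy about the war",
    "he is talking about the war",
]

bigram_counts = Counter()
trigram_counts = Counter()
for tweet in tweets:
    tokens = tweet.split()
    bigram_counts.update(ngrams(tokens, 2))
    trigram_counts.update(ngrams(tokens, 3))

# The most frequent word groups fall out of Counter.most_common().
top_bigrams = bigram_counts.most_common(3)
```

The resulting n-gram frequency tables are exactly the kind of feature input a CART or NB classifier would then consume.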
Procedia PDF Downloads 73
1685 Probabilistic Approach to the Spatial Identification of the Environmental Sources behind Mortality Rates in Europe
Authors: Alina Svechkina, Boris A. Portnov
Abstract:
In line with a rapid increase in pollution sources and the enforcement of stricter air pollution regulations, which lower pollution levels, it becomes more difficult to identify the actual risk sources behind observed morbidity patterns, and new approaches are required to identify potential risks and take preventive action. In the present study, we discuss a probabilistic approach to the spatial identification of a priori unidentified environmental health hazards. The underlying assumption behind the tested approach is that observed adverse health patterns (morbidity, mortality) can become a source of information on the geographic location of the environmental risk factors that stand behind them. Using this approach, we analyzed sources of environmental exposure using mortality rates available for the year 2015 for NUTS 3 (Nomenclature of Territorial Units for Statistics) subdivisions of the European Union. We identified several areas in the southwestern part of Europe as primary risk sources for the observed mortality patterns. Multivariate regressions, controlled for geographical location, climate conditions, GDP (gross domestic product) per capita, dependency ratios, population density, and the level of road freight, revealed that mortality rates decline as a function of distance from the identified hazard location. We recommend the proposed approach as an exploratory analysis tool for the initial investigation of regional morbidity patterns and the factors behind them.
Keywords: mortality, environmental hazards, air pollution, distance decay gradient, multi regression analysis, Europe, NUTS3
Procedia PDF Downloads 166
1684 Liver and Liver Lesion Segmentation From Abdominal CT Scans
Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid
Abstract:
The interpretation of medical images benefits from anatomical and physiological priors that optimize computer-aided diagnosis applications. Segmentation of the liver and of liver lesions is a major primary step in the computer-aided diagnosis of liver diseases, and precise liver segmentation in abdominal CT images is one of its most important stages. In this paper, a semi-automated method is presented for liver and liver lesion segmentation using mathematical morphology. Our algorithm proceeds in two parts. In the first, we determine the region of interest by applying morphological filters to extract the liver. The second part detects the liver lesions. For this task, we propose a new method for the semi-automatic segmentation of the liver and hepatic lesions, based on anatomical information and the mathematical morphology tools used in image processing. First, we improve the quality of the original image and of the image gradient by applying a spatial filter followed by morphological filters. We then calculate the internal and external markers of the liver and hepatic lesions, and finally segment the liver and hepatic lesions with the marker-controlled watershed transform. The developed algorithm is validated on several images, and the results obtained show its good performance.
Keywords: anisotropic diffusion filter, CT images, hepatic lesion segmentation, liver segmentation, morphological filter, watershed algorithm
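The core of the marker-controlled watershed can be sketched as priority-queue flooding: labelled markers grow outward in order of increasing gradient value, so basins meet at gradient ridges. This is a minimal pure-Python version on a toy "gradient image", not the authors' implementation:

```python
import heapq

def watershed(gradient, markers):
    """Minimal marker-controlled watershed: flood labelled markers outward
    in order of increasing gradient value (priority-queue flooding)."""
    rows, cols = len(gradient), len(gradient[0])
    labels = [row[:] for row in markers]          # 0 = unlabelled
    heap = [(gradient[r][c], r, c)
            for r in range(rows) for c in range(cols) if markers[r][c]]
    heapq.heapify(heap)
    while heap:
        _, r, c = heapq.heappop(heap)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and labels[nr][nc] == 0:
                labels[nr][nc] = labels[r][c]     # grow the nearest basin
                heapq.heappush(heap, (gradient[nr][nc], nr, nc))
    return labels

# Toy gradient image: a high ridge (column 2) separates two flat basins,
# standing in for an organ boundary between internal/external markers.
gradient = [
    [1, 1, 9, 1, 1],
    [1, 1, 9, 1, 1],
    [1, 1, 9, 1, 1],
]
markers = [
    [1, 0, 0, 0, 2],
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
labels = watershed(gradient, markers)
```

In the paper's pipeline, the internal/external markers computed from anatomical priors play the role of the two seed labels here.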
Procedia PDF Downloads 449
1683 Exploring the Synergistic Effects of Aerobic Exercise and Cinnamon Extract on Metabolic Markers in Insulin-Resistant Rats through Advanced Machine Learning and Deep Learning Techniques
Authors: Masoomeh Alsadat Mirshafaei
Abstract:
The present study aims to explore the effect of an 8-week aerobic training regimen combined with cinnamon extract on serum irisin and leptin levels in insulin-resistant rats. Additionally, this research leverages various machine learning (ML) and deep learning (DL) algorithms to model the complex interdependencies between exercise, nutrition, and metabolic markers, offering a groundbreaking approach to obesity and diabetes research. Forty-eight Wistar rats were selected and randomly divided into four groups: control, training, cinnamon, and training-cinnamon. The training protocol was conducted over 8 weeks, with sessions 5 days a week at 75-80% VO2 max. The cinnamon and training-cinnamon groups were injected with 200 ml/kg/day of cinnamon extract. Data analysis included serum data, dietary intake, exercise intensity, and metabolic response variables, with blood samples collected 72 hours after the final training session. The dataset was analyzed using one-way ANOVA (P<0.05) and fed into various ML and DL models, including Support Vector Machines (SVM), Random Forest (RF), and Convolutional Neural Networks (CNN). Traditional statistical methods indicated that aerobic training, with and without cinnamon extract, significantly increased serum irisin and decreased leptin levels. Among the algorithms, the CNN model provided superior performance in identifying specific interactions between cinnamon extract concentration and exercise intensity, optimizing the increase in irisin and the decrease in leptin. The CNN model achieved an accuracy of 92%, outperforming the SVM (85%) and RF (88%) models in predicting the optimal conditions for metabolic marker improvements. The study demonstrated that advanced ML and DL techniques can uncover nuanced relationships and potential cellular responses to exercise and dietary supplements, which are not evident through traditional methods.
These findings advocate for the integration of advanced analytical techniques in nutritional science and exercise physiology, paving the way for personalized health interventions in managing obesity and diabetes.
Keywords: aerobic training, cinnamon extract, insulin resistance, irisin, leptin, convolutional neural networks, exercise physiology, support vector machines, random forest
Procedia PDF Downloads 36
1682 Effect of External Radiative Heat Flux on Combustion Characteristics of Rigid Polyurethane Foam under Piloted-Ignition and Radiative Auto-Ignition Modes
Authors: Jia-Jia He, Lin Jiang, Jin-Hua Sun
Abstract:
Rigid polyurethane foam (RPU) is extensively applied in building insulation systems, yet it is highly flammable, being easily ignited by a high-temperature spark or by radiative heat flux from flaming materials or a surrounding building facade. Using a Fire Testing Technology cone calorimeter and a thermocouple tree, this study systematically investigated the effect of radiative heat flux on ignition time and the characteristic temperature distribution during RPU combustion under a series of heat fluxes (12, 15, 20, 25, 30, 35, 40, 45, and 50 kW/m²) with spark ignition and with ignition by radiation alone. The ignition time decreases with increasing external heat flux; raising the external heat flux also raises the peak heat release rate and strongly affects the vertical temperature distribution. The critical ignition heat flux is found to be 15 kW/m² for spark ignition and 25 kW/m² for radiative ignition. Building on established ignition correlations, a methodology to predict ignition times in both modes was developed theoretically. By analyzing the heat transfer mechanism around the sample, both radiation from the cone calorimeter and convection are considered and calculated theoretically. The experimental ignition times agree well with the theoretical ones in both radiative and convective conditions; however, the observed critical ignition heat flux is higher than the calculated one in the piloted-ignition mode because the heat loss process, especially under lower radiative heat fluxes, is not considered in the developed methodology.
Keywords: rigid polyurethane foam, cone calorimeter, ignition time, external heat flux
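The classical thermally thick ignition correlation, in which ignition time scales with the inverse square of the excess heat flux above the critical value, gives the flavour of the prediction described above. The constants below are illustrative placeholders, not values fitted to RPU in this study:

```python
def ignition_time(q_ext, q_cr, C):
    """Classical thermally thick ignition correlation:
    t_ig = C / (q_ext - q_cr)^2 for q_ext > q_cr.
    C lumps the material's thermal inertia (illustrative constant)."""
    if q_ext <= q_cr:
        return float("inf")     # below the critical flux: no ignition
    return C / (q_ext - q_cr) ** 2

# Hypothetical constants chosen for illustration only (not fitted to RPU data).
Q_CR_PILOT = 15.0    # kW/m^2, the observed critical flux for piloted ignition
C_PILOT = 2000.0     # lumped thermal-inertia constant

times = {q: ignition_time(q, Q_CR_PILOT, C_PILOT) for q in (20, 30, 50)}
```

The monotone decrease of ignition time with heat flux, and the divergence below the critical flux, mirror the trends reported in the abstract.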
Procedia PDF Downloads 206
1681 InSAR Time-Series Phase Unwrapping for Urban Areas
Authors: Hui Luo, Zhenhong Li, Zhen Dong
Abstract:
Multi-temporal InSAR (MTInSAR) analysis, such as the persistent scatterer (PS) and small baseline subset (SBAS) techniques, usually relies on temporal/spatial phase unwrapping (PU). Unfortunately, PU often fails for two reasons: 1) spatial phase jumps between adjacent pixels larger than π, as in layover and highly discontinuous terrain; and 2) temporal phase discontinuities, such as time-varying atmospheric delay. To overcome these limitations, a least-squares-based PU method is introduced in this paper, which incorporates baseline-combination interferograms and a network of adjacent phase gradients. Firstly, permanent scatterers (PS) are selected for study. Starting with the linear baseline-combination method, we obtain equivalent 'small baseline interferograms' to limit the spatial phase difference. Then, phase differencing is conducted between connected PSs (connected by a specific networking rule) to suppress spatially correlated phase errors such as atmospheric artifacts. After that, the interval phase differences along arcs are computed by the least-squares method, followed by an outlier detector that removes arcs with phase ambiguities. The unwrapped phase is then obtained by spatial integration. The proposed method is tested on real TerraSAR-X data, and the results are compared with those obtained by StaMPS (a software package with 3D PU capabilities). The comparison shows that the proposed method can successfully unwrap interferograms in urban areas even when high discontinuities exist, while StaMPS fails. Finally, precise DEM errors can be obtained from the unwrapped interferograms.
Keywords: phase unwrapping, time series, InSAR, urban areas
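The building block of any gradient-based unwrapping scheme, including the arc network described above, is integrating wrapped phase gradients. A 1-D sketch (Itoh's method) shows the principle; the paper's least-squares arc adjustment and outlier rejection are not reproduced here:

```python
import math

def wrap(phi):
    """Wrap a phase value into (-pi, pi]."""
    return math.atan2(math.sin(phi), math.cos(phi))

def unwrap_1d(wrapped):
    """Unwrap a 1-D phase series by integrating wrapped phase gradients.
    This succeeds only while true neighbouring gradients stay below pi,
    which is exactly the condition that fails over layover/discontinuities."""
    out = [wrapped[0]]
    for k in range(1, len(wrapped)):
        out.append(out[-1] + wrap(wrapped[k] - wrapped[k - 1]))
    return out

# Synthetic deformation-like signal: a smooth ramp whose true phase exceeds pi,
# so the raw wrapped values jump but the integrated gradients recover the ramp.
true_phase = [0.4 * k for k in range(12)]
wrapped = [wrap(p) for p in true_phase]
recovered = unwrap_1d(wrapped)
```

In 2-D, the same gradient integration is posed over the PS arc network and solved in a least-squares sense, which is where the robustness to isolated bad arcs comes from.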
Procedia PDF Downloads 147
1680 On Improving Breast Cancer Prediction Using GRNN-CP
Authors: Kefaya Qaddoum
Abstract:
The aim of this study is to predict breast cancer and to construct a supportive model that will stimulate more reliable prediction, a factor that is fundamental for public health. In this study, we utilize general regression neural networks (GRNN) to replace point predictions with prediction intervals that achieve a reasonable level of confidence. The mechanism employed here utilizes a machine learning framework called conformal prediction (CP) in order to assign consistent confidence measures to predictions, which are combined with GRNN. We apply the resulting algorithm to the problem of breast cancer diagnosis. The results show that the predictions constructed by this method are reasonable and could be useful in practice.
Keywords: neural network, conformal prediction, cancer classification, regression
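A minimal sketch of the GRNN-CP combination, under the usual reading that the GRNN is Nadaraya-Watson kernel regression and the CP layer is inductive conformal prediction: pad the GRNN point prediction with a quantile of calibration-set residuals. The toy data and sigma value are illustrative, not the paper's:

```python
import math

def grnn_predict(train_x, train_y, x, sigma=0.5):
    """GRNN in its Nadaraya-Watson form: a Gaussian-kernel
    weighted average of the training targets."""
    weights = [math.exp(-((x - xi) ** 2) / (2 * sigma ** 2)) for xi in train_x]
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)

def conformal_interval(train_x, train_y, cal_x, cal_y, x, alpha=0.2):
    """Inductive conformal prediction: the interval half-width is the
    (1 - alpha) quantile of calibration absolute residuals."""
    residuals = sorted(abs(y - grnn_predict(train_x, train_y, xc))
                       for xc, y in zip(cal_x, cal_y))
    k = min(len(residuals) - 1,
            math.ceil((1 - alpha) * (len(residuals) + 1)) - 1)
    q = residuals[k]
    pred = grnn_predict(train_x, train_y, x)
    return pred - q, pred + q

# Toy 1-D regression data standing in for tumour-feature inputs (hypothetical).
train_x = [0.0, 1.0, 2.0, 3.0, 4.0]
train_y = [0.1, 1.1, 1.9, 3.2, 3.9]   # roughly y = x
cal_x = [0.5, 1.5, 2.5, 3.5]
cal_y = [0.6, 1.4, 2.6, 3.4]

lo, hi = conformal_interval(train_x, train_y, cal_x, cal_y, x=2.0)
```

The conformal guarantee is that, under exchangeability, intervals built this way cover the true value with probability at least 1 - alpha, which is what gives the predictions their "reasonable percentage of confidence".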
Procedia PDF Downloads 290
1679 Regional Pole Placement by Saturated Power System Stabilizers
Authors: Hisham M. Soliman, Hassan Yousef
Abstract:
This manuscript presents new results on the design of saturated power system stabilizers (PSS) that assign system poles within a desired region in order to achieve good dynamic performance. The regional pole placement is accomplished against model uncertainties caused by different load conditions. The design is based on a sufficient condition, in the form of linear matrix inequalities (LMI), which forces the saturated nonlinear controller to lie within its linear zone. The controller's effectiveness is demonstrated on a single machine infinite bus system.
Keywords: power system stabilizer, saturated control, robust control, regional pole placement, linear matrix inequality (LMI)
Procedia PDF Downloads 562
1678 Solving the Economic Load Dispatch Problem Using Differential Evolution
Authors: Alaa Sheta
Abstract:
Economic Load Dispatch (ELD) is one of the vital optimization problems in power system planning. Solving the ELD problem means finding the best mixture of power outputs across all generating units of the power system network such that the total fuel cost is minimized while operational requirement limits are satisfied across all dispatch phases. Many optimization techniques have been proposed to solve this problem. A famous one is Quadratic Programming (QP). QP is a very simple and fast method, but, like other gradient methods, it may become trapped at local minima and cannot handle complex nonlinear functions. A number of metaheuristic algorithms have been used to solve this problem, such as Genetic Algorithms (GAs) and Particle Swarm Optimization (PSO). In this paper, another metaheuristic search algorithm, Differential Evolution (DE), is used to solve the ELD problem. The practicality of the proposed DE-based algorithm is verified for three- and six-generator test cases. The results are compared to existing results based on QP, GAs, and PSO. They show that DE is superior in obtaining a combination of power loads that fulfills the problem constraints and minimizes the total fuel cost. DE was found to converge quickly to the optimal power generation loads and to handle the nonlinearity of the ELD problem. The proposed DE solution is able to minimize the cost of generated power, minimize the total power loss in transmission, and maximize the reliability of the power provided to customers.
Keywords: economic load dispatch, power systems, optimization, differential evolution
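A DE/rand/1/bin sketch for a small ELD instance: quadratic fuel-cost curves, generator limits handled by clipping, and the power-balance constraint handled by a quadratic penalty. The 3-generator coefficients and all DE settings below are hypothetical, not the paper's test cases:

```python
import random

def fuel_cost(p, coeffs):
    """Quadratic fuel-cost model: sum of a + b*P + c*P^2 per generator."""
    return sum(a + b * pi + c * pi * pi for pi, (a, b, c) in zip(p, coeffs))

def eld_de(coeffs, limits, demand, pop_size=30, gens=300, F=0.7, CR=0.9, seed=1):
    """DE/rand/1/bin sketch for economic load dispatch; the power-balance
    constraint is enforced by a quadratic penalty (illustrative settings)."""
    rng = random.Random(seed)
    dim = len(coeffs)

    def clip(p):
        return [min(max(pi, lo), hi) for pi, (lo, hi) in zip(p, limits)]

    def fitness(p):
        return fuel_cost(p, coeffs) + 1e4 * (sum(p) - demand) ** 2

    pop = [clip([rng.uniform(lo, hi) for lo, hi in limits])
           for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([x for j, x in enumerate(pop) if j != i], 3)
            # Mutation a + F*(b - c), binomial crossover with the target.
            trial = [a[d] + F * (b[d] - c[d]) if rng.random() < CR else pop[i][d]
                     for d in range(dim)]
            trial = clip(trial)
            if fitness(trial) <= fitness(pop[i]):   # greedy selection
                pop[i] = trial
    return min(pop, key=fitness)

# Hypothetical 3-generator test case (cost coefficients and MW limits).
coeffs = [(100, 2.0, 0.010), (120, 1.8, 0.012), (80, 2.2, 0.008)]
limits = [(50, 200), (50, 200), (50, 200)]
best = eld_de(coeffs, limits, demand=300)
```

Because DE only compares fitness values, it is indifferent to the nonlinearity of the cost surface, which is the robustness advantage over QP that the abstract highlights.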
Procedia PDF Downloads 281
1677 Switching Losses in Power Electronic Converter of Switched Reluctance Motor
Authors: Ali Asghar Memon
Abstract:
A cautious and astute selection of the switching devices used in the power electronic converter of a switched reluctance (SR) motor is required: it is a matter of choosing the best switching devices with respect to their switching ability rather than merely meeting the required number of switches. This paper highlights the computational determination of switching losses, comprising switch-on, switch-off, and conduction losses, by using experimental data in a simulation model of an SR machine. The findings of this research are helpful for the proper selection of electronic switches and a suitable converter topology for the switched reluctance motor.
Keywords: converter, operating modes, switched reluctance motor, switching losses
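A first-order version of the loss decomposition named above: switching loss from per-cycle datasheet energies scaled to the operating point, conduction loss from the on-state drop over the conduction interval. Every numeric value below is illustrative, not data from the paper:

```python
def igbt_losses(i_load, v_dc, f_sw, duty, e_on, e_off, v_ce_sat):
    """First-order average losses for one converter switch:
    switching loss = f_sw * (E_on + E_off) scaled by current and voltage
    relative to the datasheet test point; conduction loss = V_ce_sat * I * D."""
    I_REF, V_REF = 100.0, 600.0          # hypothetical datasheet test point
    p_switch = f_sw * (e_on + e_off) * (i_load / I_REF) * (v_dc / V_REF)
    p_conduction = v_ce_sat * i_load * duty
    return p_switch, p_conduction

# Illustrative operating point for one SR-converter switch (made-up values).
p_sw, p_cond = igbt_losses(i_load=40.0, v_dc=300.0, f_sw=10e3,
                           duty=0.45, e_on=5e-3, e_off=7e-3, v_ce_sat=1.8)
```

Summing these two terms per device, per operating mode, is the kind of comparison that lets candidate switches be ranked by switching ability rather than count.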
Procedia PDF Downloads 503
1676 Knitting Stitches’ Manipulation for Catenary Textile Structures
Authors: Virginia Melnyk
Abstract:
This paper explores the design of catenary structures using knitted textiles. Grasshopper and Kangaroo parametric software are used to simulate and pre-design the overall form, and the design is then translated into a pattern that can be made with hand-manipulated stitches on a knitting machine. The textile takes advantage of the structure of knitted materials and their ability to stretch. Using different types of stitches to control the amount of stretch that can occur in portions of the textile generates the overall formal design. The textile is then hardened in an upside-down hanging position and flipped right-side-up, becoming a structural catenary form. The resulting design is used as a small cat house for a cat to sit inside and climb on top of.
Keywords: architectural materials, catenary structures, knitting fabrication, textile design
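The hang-then-flip principle rests on the catenary curve y = a*cosh(x/a): a hanging flexible material settles into pure tension, so the inverted, hardened form carries load in pure compression. A small sketch of sampling that curve (the parameter values are arbitrary, not taken from the design):

```python
import math

def catenary_points(a, half_span, n=9):
    """Sample the catenary y = a*cosh(x/a) over [-half_span, half_span].
    Inverted, this is the compression-only form the hardened textile takes."""
    xs = [-half_span + 2 * half_span * k / (n - 1) for k in range(n)]
    return [(x, a * math.cosh(x / a)) for x in xs]

pts = catenary_points(a=1.0, half_span=2.0)
lowest = min(pts, key=lambda p: p[1])   # the sag point, at mid-span
```

Kangaroo's particle-spring relaxation converges to the same family of curves; varying stitch stretch locally is what lets the knit deviate from a single uniform catenary.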
Procedia PDF Downloads 179
1675 The Effect of Corporate Governance to Islamic Banking Performance Using Maqasid Index Approach in Indonesia
Authors: Audia Syafa'atur Rahman, Rozali Haron
Abstract:
The practices of Islamic banking are often more attuned to the goal of profit maximization than to obtaining ethical profit, that is, interest-free earnings that benefit the growth of society and the economy. Good corporate governance practices are needed to assure the sustainability of Islamic banks in achieving Maqasid Shariah, whose main purpose is boosting the well-being of people. Maqasid Shariah performance measurement is used to assess the duties and responsibilities Islamic banks are expected to perform; it covers not a single dimension, such as financial measurement, but many dimensions reflecting the main purposes of Islamic banks. The implementation of good corporate governance is essential because it covers the interests of the stakeholders and facilitates effective monitoring, encouraging Islamic banks to utilize resources more efficiently in order to achieve Maqasid Shariah. This study aims to provide empirical evidence on the Maqasid performance of Islamic banks as measured by a Maqasid performance evaluation model, and to examine the influence of Shariah Supervisory Board (SSB) characteristics and board structures on that performance. Using the simple additive weighting method, the Maqasid index for the Islamic banks in Indonesia from 2012 to 2016 ranged from just above 11% to 28%. Maqasid performance indices above 20% were obtained by Bank Muamalat Indonesia (BMI), Bank Panin Syariah, and Bank BRI Syariah, with BMI consistently achieving above 23%. Other Islamic banks, such as Bank Victoria Syariah, Bank Jabar Banten Syariah, Bank BNI Syariah, Bank Mega Syariah, BCA Syariah, and Maybank Syariah Indonesia, show a Maqasid performance index that fluctuates from year to year.
The impact of SSB characteristics and board structures is tested using random-effects generalized least squares. The findings indicate that SSB characteristics (SSB size, SSB cross membership, SSB education, and SSB reputation) and board structures (board size and board independence) play an essential role in improving the performance of Islamic banks. Specifically, a smaller SSB, a higher proportion of SSB cross membership, fewer SSB members holding doctorate degrees, fewer highly reputable scholars, more members on the board of directors, and fewer independent non-executive directors are associated with enhanced performance of Islamic banks.
Keywords: Maqasid Shariah, corporate governance, Islamic banks, Shariah supervisory board
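The simple additive weighting step used to build the Maqasid index is just a weighted sum of normalized performance ratios. The ratios and weights below are made-up placeholders, not the study's measured values:

```python
def maqasid_index(ratios, weights):
    """Simple additive weighting: the index is the weighted sum of a bank's
    normalized performance ratios, with weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * r for w, r in zip(weights, ratios))

# Hypothetical performance ratios for one bank (e.g. education grants,
# fair returns, zakat distribution) with illustrative objective weights.
ratios = [0.12, 0.30, 0.25, 0.18]
weights = [0.30, 0.41, 0.19, 0.10]

index = maqasid_index(ratios, weights)
```

Computing this index per bank per year is what produces the 11%-28% range reported above, after which the panel regression relates it to the governance variables.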
Procedia PDF Downloads 239
1674 Optimal Image Representation for Linear Canonical Transform Multiplexing
Authors: Navdeep Goel, Salvador Gabarda
Abstract:
Digital images are widely used in computer applications. Storing or transmitting uncompressed images requires considerable storage capacity and transmission bandwidth. Image compression is a means to perform transmission or storage of visual data in the most economical way. This paper explains how images can be encoded to be transmitted over a multiplexing time-frequency domain channel. Multiplexing involves packing together signals whose representations are compact in the working domain. In order to optimize transmission resources, each 4x4 pixel block of the image is transformed, by a suitable polynomial approximation, into a minimal number of coefficients. Keeping fewer than 4x4 coefficients per block spares a significant amount of transmitted information, although some information is lost. Different approximations for the image transformation have been evaluated: polynomial representation (Vandermonde matrix), least squares with gradient descent, 1-D Chebyshev polynomials, 2-D Chebyshev polynomials, and singular value decomposition (SVD). Results have been compared in terms of nominal compression rate (NCR), compression ratio (CR), and peak signal-to-noise ratio (PSNR), minimizing an error function defined as the difference between the original pixel gray levels and the approximated polynomial output. The polynomial coefficients are later encoded and used for generating chirps, at a target rate of about two chirps per 4x4 pixel block, and then submitted to a transmission multiplexing operation in the time-frequency domain.
Keywords: chirp signals, image multiplexing, image transformation, linear canonical transform, polynomial approximation
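The block-approximation idea can be illustrated with the simplest possible polynomial: a least-squares plane fit p(x, y) = c0 + c1*x + c2*y, so a 4x4 block (16 values) is carried by 3 coefficients. This degree-1 sketch is far cruder than the paper's Chebyshev and SVD options, but the compression/PSNR trade-off works the same way:

```python
import math

def fit_plane(block):
    """Least-squares fit of p(x, y) = c0 + c1*x + c2*y to a 4x4 block.
    On the symmetric grid x, y in {-1.5, -0.5, 0.5, 1.5} the normal
    equations decouple, so the coefficients have closed forms."""
    coords = [-1.5, -0.5, 0.5, 1.5]
    s = sum(sum(row) for row in block)
    sx = sum(block[j][i] * coords[i] for j in range(4) for i in range(4))
    sy = sum(block[j][i] * coords[j] for j in range(4) for i in range(4))
    sxx = sum(c * c for c in coords) * 4        # = 20 on this grid
    return s / 16, sx / sxx, sy / sxx

def psnr(block, approx):
    """Peak signal-to-noise ratio of the reconstruction, 8-bit peak."""
    mse = sum((block[j][i] - approx[j][i]) ** 2
              for j in range(4) for i in range(4)) / 16
    return float("inf") if mse == 0 else 10 * math.log10(255 ** 2 / mse)

# A smooth gradient block is captured exactly by 3 coefficients instead of 16.
block = [[10 + 4 * i + 2 * j for i in range(4)] for j in range(4)]
c0, c1, c2 = fit_plane(block)
coords = [-1.5, -0.5, 0.5, 1.5]
approx = [[c0 + c1 * coords[i] + c2 * coords[j] for i in range(4)]
          for j in range(4)]
```

Textured blocks would need more coefficients (higher-degree terms or SVD modes) to keep PSNR acceptable, which is exactly the trade-off the NCR/CR/PSNR comparison in the paper quantifies.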
Procedia PDF Downloads 410
1673 Awarding Copyright Protection to Artificial Intelligence Technology for its Original Works: The New Way Forward
Authors: Vibhuti Amarnath Madhu Agrawal
Abstract:
Artificial Intelligence (AI) and Intellectual Property are two emerging fields that are growing at a fast pace and have the potential of having a huge impact on the economy in the coming times. In simple words, AI is work done by a machine without any human intervention: coded software embedded in a machine which, over a period of time, develops its own intelligence and begins to make its own decisions and judgments by studying patterns of how people think, react to situations, and perform tasks, among others. Intellectual Property, especially Copyright Law, on the other hand, protects the rights of individuals and companies in content creation, which primarily deals with the application of intellect, originality, and the expression of the same in some tangible form. According to reports shared by the media lately, ChatGPT, an AI-powered chatbot, has been involved in the creation of a wide variety of original content, including but not limited to essays, emails, plays, and poetry. Besides, there have been instances wherein AI technology has given creative inputs for background, lights, and costumes, among others, for films. Copyright Law offers protection to all of these different kinds of content and much more. Considering the two key parameters of copyright, application of intellect and originality, the question therefore arises whether awarding copyright protection to a person who has not directly invested his or her intellect in the creation of that content goes against the basic spirit of copyright laws. This study aims to analyze the current scenario and provide answers to the following questions: (a) If the content generated by AI technology satisfies the basic criteria of originality and expression in a tangible form, why should such content be denied protection in the name of its creator, i.e., the specific AI tool or technology? (b) Considering the increasing role and development of AI technology in our lives, should it be given the status of a 'Legal Person' in law? (c) If yes, what should be the modalities of awarding protection to the works of such a Legal Person and of managing the same? Considering current trends and the pace at which AI is advancing, it is not very far off that AI will start functioning autonomously in the creation of new works. Current data and opinions on this issue globally are divided and lack uniformity. In order to fill in the existing gaps, data obtained from the copyright offices of the top economies of the world have been analyzed, and the role and functioning of various copyright societies in these countries has been studied in detail. This paper provides a roadmap that can be adopted to satisfy the various objectives, constraints, and dynamic conditions related to AI technology and its protection under Copyright Law.
Keywords: artificial intelligence technology, copyright law, copyright societies, intellectual property
Procedia PDF Downloads 70
1672 Depth Estimation in DNN Using Stereo Thermal Image Pairs
Authors: Ahmet Faruk Akyuz, Hasan Sakir Bilge
Abstract:
Depth estimation using stereo images is a challenging problem in computer vision, and many different studies have been carried out to solve it. With advances in machine learning, it is now often tackled with neural network-based solutions. The images used in these studies are mostly in the visible spectrum. However, the need to use the infrared (IR) spectrum for depth estimation has emerged, because it gives better results than the visible spectrum in some conditions. We therefore recommend using thermal-thermal (IR) image pairs for depth estimation. In this study, we used two well-known networks (PSMNet, FADNet) with minor modifications to demonstrate the viability of this idea.
Keywords: thermal stereo matching, deep neural networks, CNN, depth estimation
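The underlying matching problem that PSMNet and FADNet learn to solve has a classical baseline: pick, per pixel, the disparity minimising a sum-of-absolute-differences (SAD) window cost along the scanline. A 1-D toy version with made-up "thermal" intensity profiles (not the networks used in the study):

```python
def disparity_sad(left, right, window=1, max_disp=4):
    """Classical stereo baseline: for each pixel of the left scanline, pick
    the disparity whose window minimises the sum of absolute differences
    (SAD) against the right scanline. Borders are clamped."""
    n = len(left)
    disp = [0] * n
    for x in range(n):
        best, best_cost = 0, float("inf")
        for d in range(min(max_disp + 1, x + 1)):
            cost = sum(abs(left[min(max(x + k, 0), n - 1)]
                           - right[min(max(x + k - d, 0), n - 1)])
                       for k in range(-window, window + 1))
            if cost < best_cost:
                best, best_cost = d, cost
        disp[x] = best
    return disp

# Toy 1-D "thermal" scanlines: the right view is the left shifted by 2 px,
# so the true disparity over the textured region is 2.
left = [0, 0, 10, 80, 90, 20, 0, 0, 0, 0]
right = [10, 80, 90, 20, 0, 0, 0, 0, 0, 0]
disp = disparity_sad(left, right)
```

Deep stereo networks replace this hand-crafted cost with learned features and cost-volume aggregation, which matters particularly for thermal imagery, where low texture makes raw SAD unreliable.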
Procedia PDF Downloads 276
1671 Mg and MgN₃ Cluster in Diamond: Quantum Mechanical Studies
Authors: T. S. Almutairi, Paul May, Neil Allan
Abstract:
The geometrical, electronic, and magnetic properties of the neutral Mg center and the MgN₃ cluster in diamond have been studied theoretically in detail by means of the HSE06 Hamiltonian, which includes a fraction of the exact exchange term; this is important for a satisfactory picture of the electronic states of open-shell systems. A further batch of calculations with GGA functionals has also been included for comparison, and these support the HSE06 results. The local perturbations of the lattice caused by the introduced Mg defect are restricted to the first and second shells of atoms, beyond which they die out. The formation energy of the single Mg defect calculated with HSE06 and GGA agrees with the previous result. We find the triplet state with C₃ᵥ symmetry to be the ground state of the Mg center, lower in energy than the C₂ᵥ singlet by ~0.1 eV. The recent experimental ZPL (557.4 nm) of the Mg center in diamond is discussed in view of the present work. Analysis of the band structure of the MgN₃ cluster confirms that the MgN₃ defect introduces a shallow donor level in the gap, lying near the conduction band edge. This observation is supported by the empirical marker method (EMM), which produces n-type levels shallower than the P donor level. The formation energy of MgN₃ calculated from a 2NV defect (~3.6 eV) is a promising value from which to engineer MgN₃ defects inside diamond. Ion implantation followed by heating to about 1200-1600°C might induce migration of N-related defects to the localized Mg center. Temperature control is needed in this process to repair the damage and ensure the mobilities of V and N, which demands a more precise experimental study.
Keywords: empirical marker method, generalised gradient approximation, Heyd-Scuseria-Ernzerhof screened hybrid functional, zero phonon line
Procedia PDF Downloads 114
1670 Introduction to Multi-Agent Deep Deterministic Policy Gradient
Authors: Xu Jie
Abstract:
As a key network security method, cryptographic services must cope with problems such as the wide variety of cryptographic algorithms, high concurrency requirements, random job crossovers, and instantaneous surges in workload. Their complexity and dynamics also make it difficult for traditional static security policies to cope with ever-changing cyber threats and environments, and traditional resource scheduling algorithms are inadequate for complex decision-making problems in dynamic environments. A network cryptographic resource allocation algorithm based on reinforcement learning is proposed, aiming to optimize task energy consumption, migration cost, and the fitness of differentiated services (including user, data, and task security). By modeling the multi-job collaborative cryptographic service scheduling problem as a multi-objective optimized job flow scheduling problem, and using a multi-agent reinforcement learning method, efficient scheduling and optimal configuration of cryptographic service resources are achieved. By introducing reinforcement learning, resource allocation strategies can be adjusted in real time in a dynamic environment, improving resource utilization and achieving load balancing. Experimental results show that this algorithm has significant advantages in path planning length, system delay, and network load balancing, and effectively solves the complex resource scheduling problem in cryptographic services.
Keywords: multi-agent reinforcement learning, non-stationary dynamics, multi-agent systems, cooperative and competitive agents
Procedia PDF Downloads 21
1669 Market Index Trend Prediction using Deep Learning and Risk Analysis
Authors: Shervin Alaei, Reza Moradi
Abstract:
Trading in financial markets is subject to risks due to their high volatility. Here, using an LSTM neural network and risk-based feature engineering, we developed a method that can accurately predict trends of the Tehran stock exchange market index from data of a few days earlier. Our test results show that the proposed method, with an average prediction accuracy of more than 94%, is superior to other common machine learning algorithms. To the best of our knowledge, this is the first work incorporating deep learning and risk factors to accurately predict market trends.
Keywords: deep learning, LSTM, trend prediction, risk management, artificial neural networks
Procedia PDF Downloads 153
1668 Temperature Distribution for Asphalt Concrete-Concrete Composite Pavement
Authors: Tetsya Sok, Seong Jae Hong, Young Kyu Kim, Seung Woo Lee
Abstract:
The temperature distribution in asphalt concrete (AC)-concrete composite pavement is one of the main factors affecting the performance life of the pavement. The temperature gradient in the concrete slab underneath the AC layer produces critical curling stresses and leads to de-bonding of the AC-concrete interface. These stresses, when enhanced by repetitive axial loadings, also contribute to fatigue damage and eventual crack development within the slab. Moreover, temperature changes within the concrete slab cause the slab to contract and expand, which significantly induces reflective cracking in the AC layer. In this paper, the pavement temperature was predicted numerically using a one-dimensional finite difference method (FDM) in a fully explicit scheme. The numerical model provides a fundamental and clear understanding of the heat energy balance, including incoming, outgoing, and dissipated thermal energy in the system. Using reliable meteorological data for daily air temperature, solar radiation, and wind speed, together with variable pavement surface properties, the predicted pavement temperature profile was validated against field-measured data. Additionally, the effects of AC thickness and daily air temperature on the temperature profile in the underlying concrete were investigated. Based on the obtained results, the numerically predicted temperature of AC-concrete composite pavement using the FDM showed good accuracy compared to field measurements, and a thicker AC layer significantly insulates the temperature distribution in the underlying concrete slab.
Keywords: asphalt concrete, finite difference method (FDM), curling effect, heat transfer, solar radiation
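The explicit 1-D FDM scheme named above updates each interior node from its neighbours, subject to the stability limit r = alpha*dt/dx² <= 1/2. A minimal sketch with fixed-temperature boundaries standing in for the paper's radiation/convection surface balance; all physical constants are nominal placeholders:

```python
def fdm_step(T, alpha, dx, dt, t_surface, t_bottom):
    """One explicit finite-difference step of 1-D heat conduction
    dT/dt = alpha * d2T/dx2 with fixed surface and bottom temperatures."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme stability requires r <= 1/2"
    new = T[:]
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
    new[0], new[-1] = t_surface, t_bottom
    return new

# Illustrative 10 cm pavement column discretised at 1 cm: hot surface (45 C)
# above a cool subgrade (15 C); alpha is a nominal diffusivity, not a
# measured AC/concrete value.
alpha, dx, dt = 1.0e-6, 0.01, 40.0      # m^2/s, m, s  ->  r = 0.4
T = [20.0] * 11
for _ in range(2000):
    T = fdm_step(T, alpha, dx, dt, t_surface=45.0, t_bottom=15.0)
```

The real model replaces the fixed surface temperature with the surface energy balance (solar radiation in, convection and re-radiation out), which is what couples the profile to the meteorological inputs.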
Procedia PDF Downloads 267
1667 Application of Vector Representation for Revealing the Richness of Meaning of Facial Expressions
Authors: Carmel Sofer, Dan Vilenchik, Ron Dotsch, Galia Avidan
Abstract:
Studies investigating emotional facial expressions typically reveal consensus among observers regarding the meaning of basic expressions, whose number ranges from 6 to 15 emotional states. Given this limited number of discrete expressions, how is it that the human vocabulary of emotional states is so rich? The present study argues that perceivers use sequences of these discrete expressions as the basis for a much richer vocabulary of emotional states. Mechanisms in which a relatively small number of basic components is expanded into a much larger number of possible combinations of meanings exist in other human communication modalities, such as spoken language and music. In these modalities, letters and notes, the basic components of spoken language and music respectively, are temporally linked, giving rise to the richness of expression. In the current study, participants were presented in each trial with sequences of two images containing facial expressions in different combinations, sampled from the eight static basic expressions (64 in total; 8x8). In each trial, participants were required to judge, using a single word, the 'state of mind' portrayed by the person whose face was presented. Utilizing word embedding methods (Global Vectors for Word Representation) from the field of natural language processing, and relying on machine learning methods, it was found that the perceived meaning of a sequence of facial expressions was a weighted average of the single expressions comprising it, resulting in 22 new emotional states in addition to the eight classic basic expressions. An interaction between the first and second expression in each sequence indicated that each facial expression modulated the effect of the other, leading to a different interpretation ascribed to the sequence as a whole.
These findings suggest that the vocabulary of emotional states conveyed by facial expressions is not restricted to the (small) number of discrete facial expressions. Rather, the vocabulary is rich, as it results from combinations of these expressions. In addition, the present research suggests that word embedding can be a powerful, accurate and efficient tool in social perception studies for capturing explicit and implicit perceptions and intentions. Acknowledgment: The study was supported by a grant from the Ministry of Defense in Israel to GA and CS. CS is also supported by the ABC initiative at Ben-Gurion University of the Negev.
Keywords: GloVe, face perception, facial expression perception, facial expression production, machine learning, word embedding, word2vec
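The weighted-average idea can be made concrete with a toy calculation. The 3-dimensional vectors below are illustrative stand-ins, not real GloVe embeddings, and the unequal weights are an assumed illustration of the order effect the study reports.

```python
import numpy as np

# Toy sketch: the meaning of a two-expression sequence modeled as a weighted
# average of the component expressions' word vectors (values invented).
emb = {
    "happy":     np.array([0.9, 0.1, 0.0]),
    "surprised": np.array([0.1, 0.9, 0.2]),
    "sad":       np.array([-0.8, 0.0, 0.3]),
}

def sequence_meaning(first, second, w_first=0.4, w_second=0.6):
    """Weighted average of two expression vectors, normalized to unit length.
    The second expression is weighted more heavily here purely as an assumed
    illustration of order-dependent interpretation."""
    v = w_first * emb[first] + w_second * emb[second]
    return v / np.linalg.norm(v)

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

blend = sequence_meaning("happy", "surprised")
```

With unequal weights, reversing the order of the two expressions yields a different vector, i.e., a different perceived meaning for the same pair.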
Procedia PDF Downloads 175
1666 Music Listening in Dementia: Current Developments and the Potential for Automated Systems in the Home: Scoping Review and Discussion
Authors: Alexander Street, Nina Wollersberger, Paul Fernie, Leonardo Muller, Ming Hung HSU, Helen Odell-Miller, Jorg Fachner, Patrizia Di Campli San Vito, Stephen Brewster, Hari Shaji, Satvik Venkatesh, Paolo Itaborai, Nicolas Farina, Alexis Kirke, Sube Banerjee, Eduardo Reck Miranda
Abstract:
Escalating neuropsychiatric symptoms (NPS) in people with dementia may lead to earlier care home admission. Music listening has been reported to stimulate cognitive function, potentially reducing agitation in this population. We present a scoping review reporting on current developments and discussing the potential of music listening and related technology for managing agitation in dementia care. Two searches for music listening studies were conducted. The first focused on older people or people living with dementia, where music listening interventions, including technology, were delivered in participants’ homes or in institutions to address neuropsychiatric symptoms, quality of life and independence. The second included any population, focusing on the use of music technology for health and wellbeing. In search one, 70 of 251 full texts were included. The majority reported either statistical significance (6; 8.5%), significance (17; 24.2%) or improvements (26; 37.1%). Agitation was specifically reported in 36 (51.4%). The second search included 51 of 99 full texts, reporting improvement (28; 54.9%), significance (11; 21.5%), statistical significance (1; 1.9%) and no difference compared to the control (6; 11.7%). The majority of studies in the first search focused on mood and agitation, and in the second on mood and psychophysiological responses. Five studies used AI or machine learning systems to select music, all involving healthy controls and reporting benefits. Most studies in both searches were not conducted in a home environment (search 1: 12; 17.1%; search 2: 11; 21.5%). Preferred music listening may help manage NPS in care home settings. Based on these and other data extracted in the review, a reasonable progression would be to co-design and test music listening systems and protocols for NPS in all settings, including people’s homes. Machine learning and automated technology for music selection and arousal adjustment, driven by live biodata, have not been explored in dementia care.
Such approaches may help deliver the right music at the appropriate time in the required dosage, reducing the use of medication and improving quality of life.
Keywords: music listening, dementia, agitation, scoping review, technology
Procedia PDF Downloads 112
1665 Effects of Channel Orientation on Heat Transfer in a Rotating Rectangular Channel with Jet Impingement Cooling and Film Coolant Extraction
Authors: Hua Li, Hongwu Deng
Abstract:
The turbine blade's leading edge is usually cooled by jet impingement because it bears the heaviest heat load. For a rotating turbine blade, however, the channel orientation (β, the angle between the jet direction and the rotating plane) can play an important role in shaping the flow field and heat transfer. Therefore, in this work, the effects of channel orientation (from 90° to 180°) on heat transfer in a jet impingement cooling channel are experimentally investigated under an isothermal boundary condition. Both the jet-to-target surface distance and the jet-to-jet spacing are three times the jet hole diameter. The jet Reynolds number is 5,000, and the maximum jet rotation number reaches 0.24. The results show that the rotation-induced variations of heat transfer differ with channel orientation. In the cases of 90°≤β≤135°, a vortex generated in the low-radius region of the supply channel changes the mass-flowrate distribution among the jet holes. Therefore, the heat transfer in the low-radius region decreases with the rotation number, whereas the heat transfer in the high-radius region increases, indicating that a larger temperature gradient in the radial direction could appear in the turbine blade's leading edge. When 135°<β≤180°, however, the heat transfer of the entire stagnant zone decreases with the rotation number. The rotation-induced jet deflection is the primary factor that weakens the heat transfer, and jets cannot reach the target surface at high rotation numbers. For the downstream regions, however, the heat transfer is enhanced by 50%-80% in every channel orientation because the dead zone is broken up by the rotation-induced secondary flow in the impingement channel.
Keywords: heat transfer, jet impingement cooling, channel orientation, high rotation number, isothermal boundary
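The two dimensionless groups quoted in the abstract can be unpacked with a quick back-of-envelope calculation using their standard definitions (Re = ρVd/μ for the jet, Ro = Ωd/V for rotation). The air properties and hole diameter below are assumed values, so the implied velocity and angular speed are illustrative only.

```python
# Back-of-envelope check of the jet flow parameters quoted in the abstract.
rho = 1.2        # air density, kg/m^3 (assumed)
mu = 1.8e-5      # dynamic viscosity, Pa*s (assumed)
d = 2.0e-3       # jet hole diameter, m (assumed)

Re = 5000.0                     # jet Reynolds number from the abstract
V = Re * mu / (rho * d)         # implied mean jet velocity, m/s

Ro = 0.24                       # maximum jet rotation number from the abstract
omega = Ro * V / d              # implied angular speed, rad/s
```

Rearranging either definition recovers the quoted dimensionless value exactly, which is a useful sanity check when reproducing test conditions.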
Procedia PDF Downloads 104
1664 Deep Learning to Enhance Mathematics Education for Secondary Students in Sri Lanka
Authors: Selvavinayagan Babiharan
Abstract:
This research aims to develop a deep learning platform to enhance mathematics education for secondary students in Sri Lanka. The platform will be designed to incorporate interactive and user-friendly features to engage students in active learning and promote their mathematical skills. The proposed platform will be developed using TensorFlow and Keras, two widely used deep learning frameworks. The system will be trained on a large dataset of math problems, which will be collected from Sri Lankan school curricula. The results of this research will contribute to the improvement of mathematics education in Sri Lanka and provide a valuable tool for teachers to enhance the learning experience of their students.
Keywords: information technology, education, machine learning, mathematics
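The training-loop idea behind such a system can be illustrated in miniature. NumPy stands in for TensorFlow/Keras here so the example is self-contained, and the toy task (a single linear layer learning to add two digits via gradient descent on mean squared error) is an invented stand-in for a real dataset of math problems.

```python
import numpy as np

# Toy "math problem" dataset: pairs of single digits with their sum as target.
rng = np.random.default_rng(42)
X = rng.integers(0, 10, size=(200, 2)).astype(float)
y = X.sum(axis=1)

# A single linear layer trained by plain gradient descent on MSE.
w = np.zeros(2)
b = 0.0
lr = 0.01
for _ in range(2000):
    err = X @ w + b - y               # prediction error on the whole batch
    w -= lr * (X.T @ err) / len(X)    # gradient of mean squared error w.r.t. w
    b -= lr * err.mean()              # gradient w.r.t. the bias

pred = float(np.array([3.0, 4.0]) @ w + b)   # should come out close to 7
```

A Keras version of the same loop would swap the manual update for `model.fit`, but the mechanics — forward pass, error, gradient step — are identical.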
Procedia PDF Downloads 78
1663 Comprehensive Validation of High-Performance Liquid Chromatography-Diode Array Detection (HPLC-DAD) for Quantitative Assessment of Caffeic Acid in Phenolic Extracts from Olive Mill Wastewater
Authors: Layla El Gaini, Majdouline Belaqziz, Meriem Outaki, Mariam Minhaj
Abstract:
In this study, we introduce and validate a high-performance liquid chromatography method with diode-array detection (HPLC-DAD) specifically designed for the accurate quantification of caffeic acid in phenolic extracts obtained from olive mill wastewater. Separation of caffeic acid was effectively achieved using an Acclaim Polar Advantage column (5 µm, 250 x 4.6 mm). A multi-step gradient mobile phase comprising water acidified with phosphoric acid (pH 2.3) and acetonitrile was employed to ensure optimal separation. Diode-array detection was conducted across the UV-VIS spectrum, spanning 200-800 nm, which facilitated precise analytical results. The method underwent comprehensive validation addressing several essential analytical parameters, including specificity, repeatability, linearity, the limits of detection and quantification, and measurement uncertainty. The generated linear standard curves displayed high correlation coefficients, underscoring the method's efficacy and consistency. This validated approach is robust and demonstrates reliability for the analysis of caffeic acid within the intricate matrices of wastewater, offering significant potential for applications in environmental and analytical chemistry.
Keywords: high-performance liquid chromatography (HPLC-DAD), caffeic acid analysis, olive mill wastewater phenolics, analytical method validation
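The linearity and limit-of-detection/quantification arithmetic behind such a validation can be sketched with ICH-style formulas (LOD = 3.3σ/S, LOQ = 10σ/S, with σ the residual standard deviation of the calibration line and S its slope). The concentrations and peak areas below are made-up illustrative numbers, not the study's data.

```python
import numpy as np

# Invented calibration standards: concentration (mg/L) vs. detector response.
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
area = np.array([10.2, 19.8, 50.5, 99.7, 200.4])

# Ordinary least-squares calibration line and its quality metrics.
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
residual_sd = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))
r = np.corrcoef(conc, area)[0, 1]            # correlation coefficient

# ICH-style detection and quantification limits.
lod = 3.3 * residual_sd / slope
loq = 10.0 * residual_sd / slope
```

A correlation coefficient very close to 1 is what the abstract's "high correlation coefficients" refers to; LOD and LOQ then follow directly from the fitted line.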
Procedia PDF Downloads 68
1662 Theoretical Analysis of the Existing Sheet Thickness in the Calendering of Pseudoplastic Material
Authors: Muhammad Zahid
Abstract:
The mechanical process of smoothing and compressing a molten material by passing it through a number of pairs of heated rolls in order to produce a sheet of desired thickness is called calendering. The rolls used in combination are called calenders, a term derived from kylindros, the Greek word for cylinder. Calendering is a finishing process used on cloth, paper, textiles, leather cloth, plastic film, and so on. It is used to strengthen surface properties, minimize sheet thickness, and yield special effects such as a glaze or polish. It has a wide variety of industrial applications in the manufacture of textile fabrics, coated fabrics, and plastic sheeting with the desired surface finish and texture. An analysis is presented for the calendering of a pseudoplastic material. The lubrication approximation theory (LAT) is used to simplify the equations of motion. To investigate the nature of the steady solutions that exist, we use a combination of exact and numerical methods. Expressions for the velocity profile, volumetric flow rate and pressure gradient are found as exact solutions. Furthermore, quantities of engineering interest, such as the pressure distribution, roll-separating force, and power transmitted to the fluid by the rolls, are also computed. Some results are shown graphically, while others are given in tabulated form. It is found that the non-Newtonian parameter and the Reynolds number serve as the controlling parameters for the calendering process.
Keywords: calendering, exact solutions, lubrication approximation theory, numerical solutions, pseudoplastic material
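The shear-thinning behavior underlying a pseudoplastic model can be illustrated with the standard power-law result for pressure-driven slit flow, a simplified stand-in for the nip region between calender rolls (the paper's actual geometry and constitutive model may differ). The values of K, n, G and h below are illustrative assumptions.

```python
import numpy as np

# Power-law (pseudoplastic, n < 1) fluid in pressure-driven flow through a slit.
n = 0.5       # power-law index; n < 1 means shear-thinning
K = 10.0      # consistency index, Pa*s^n (assumed)
G = 1.0e4     # pressure-gradient magnitude -dp/dx, Pa/m (assumed)
h = 1.0e-3    # half-gap between the plates, m (assumed)

# Exact velocity profile: u(y) = n/(n+1) * (G/K)^(1/n) * (h^m - |y|^m), m=(n+1)/n
y = np.linspace(-h, h, 2001)
m = (n + 1.0) / n
u = (n / (n + 1.0)) * (G / K) ** (1.0 / n) * (h ** m - np.abs(y) ** m)

# Flow rate per unit width: numerical quadrature vs. the closed-form expression
dy = y[1] - y[0]
q_numeric = float(np.sum((u[:-1] + u[1:]) * dy / 2.0))   # trapezoidal rule
q_exact = (2.0 * n / (2.0 * n + 1.0)) * h ** 2 * (G * h / K) ** (1.0 / n)
```

Setting n = 1 and K = μ recovers the Newtonian slit-flow result, which is a convenient check that the exact expression is consistent.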
Procedia PDF Downloads 146
1661 Makhraj Recognition Using Convolutional Neural Network
Authors: Zan Azma Nasruddin, Irwan Mazlin, Nor Aziah Daud, Fauziah Redzuan, Fariza Hanis Abdul Razak
Abstract:
This paper focuses on a machine learning system that learns the correct pronunciation of Makhraj Huroofs. Usually, people need to find an expert to learn to pronounce the Huroofs accurately. In this study, the researchers have developed a system that is able to learn the selected Huroofs, namely ha, tsa, zho, and dza, using a convolutional neural network. The researchers present the chosen CNN architecture, designed to learn the data (Huroofs) as quickly as possible and to produce high accuracy during prediction. The researchers experimented with the system to measure the accuracy and the cross entropy during the training process.
Keywords: convolutional neural network, Makhraj recognition, speech recognition, signal processing, TensorFlow
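The two quantities such a training process tracks — a convolution over spectrogram-like input and the cross entropy of softmax class scores — can be sketched in plain NumPy. The input, filter, and logits below are random placeholders for illustration, not real speech features or the paper's architecture.

```python
import numpy as np

def conv2d_valid(x, k):
    """Naive 'valid' 2-D convolution (cross-correlation, as used in CNNs)."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def cross_entropy(logits, label):
    """Softmax cross-entropy loss for one example."""
    z = logits - logits.max()            # subtract max for numerical stability
    p = np.exp(z) / np.exp(z).sum()
    return -np.log(p[label])

rng = np.random.default_rng(1)
spectrogram = rng.standard_normal((8, 8))   # placeholder spectrogram patch
kernel = rng.standard_normal((3, 3))
feat = conv2d_valid(spectrogram, kernel)    # one feature map

logits = rng.standard_normal(4)             # scores for ha, tsa, zho, dza
loss = cross_entropy(logits, label=0)
```

In a framework such as TensorFlow, both operations are single library calls; the versions above only make explicit what the training metrics measure.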
Procedia PDF Downloads 333
1660 Opto-Electronic Properties and Structural Phase Transition of Filled-Tetrahedral NaZnAs
Authors: R. Khenata, T. Djied, R. Ahmed, H. Baltache, S. Bin-Omran, A. Bouhemadou
Abstract:
In this study, we predict the structural, phase-transition and opto-electronic properties of the filled-tetrahedral (Nowotny-Juza) compound NaZnAs. Calculations are carried out by employing the full-potential (FP) linearized augmented plane wave (LAPW) plus local orbitals (lo) scheme developed within the framework of density functional theory (DFT). The exchange-correlation energy/potential (EXC/VXC) functional is treated using the Perdew-Burke-Ernzerhof (PBE) parameterization of the generalized gradient approximation (GGA). In addition, the Tran-Blaha (TB) modified Becke-Johnson (mBJ) potential is incorporated to obtain better precision for the optoelectronic properties. Geometry optimization is carried out to obtain reliable results for the total energy as well as other structural parameters for each phase of the NaZnAs compound. The order of the structural transitions as a function of pressure is found to be: Cu2Sb type → β → α phase. Our calculated electronic band structures for all structural phases, at the level of PBE-GGA as well as the mBJ potential, indicate that NaZnAs is a direct (Γ-Γ) band gap semiconductor. However, compared to PBE-GGA, the mBJ potential reproduces higher values of the fundamental band gap. Regarding the optical properties, the real and imaginary parts of the dielectric function, refractive index, reflectivity coefficient, absorption coefficient and energy-loss function spectra are calculated over a photon energy range of 0.0 to 30.0 eV, with the incident radiation polarized along both the [100] and [001] crystalline directions.
Keywords: NaZnAs, FP-LAPW+lo, structural properties, phase transition, electronic band-structure, optical properties
Procedia PDF Downloads 435
1659 Minimizing Total Completion Time in No-Wait Flowshops with Setup Times
Authors: Ali Allahverdi
Abstract:
The m-machine no-wait flowshop scheduling problem is addressed in this paper. The objective is to minimize total completion time subject to the constraint that the makespan value is not greater than a certain value. Setup times are treated as separate from processing times. Several recent algorithms are adapted and proposed for the problem. An extensive computational analysis has been conducted for the evaluation of the proposed algorithms. The computational analysis indicates that the best proposed algorithm performs significantly better than the earlier existing best algorithm.
Keywords: scheduling, no-wait flowshop, algorithm, setup times, total completion time, makespan
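The objective values involved can be made concrete by computing the schedule a given job sequence induces. The sketch below evaluates total completion time and makespan for a no-wait flowshop with separate setup times, under the assumption (ours, for illustration) that setups are anticipatory, i.e., a machine may be set up for the next job before that job arrives; the toy instance is invented.

```python
def no_wait_schedule(seq, p, s):
    """Completion times of a job sequence in a no-wait flowshop.
    p[j][m]: processing time of job j on machine m.
    s[j][m]: separate, anticipatory setup time of job j on machine m."""
    M = len(p[0])
    free = [0.0] * M                  # time each machine last finished processing
    completions = []
    for j in seq:
        # offsets[m]: start of job j on machine m relative to its start on machine 0
        offsets = [0.0] * M
        for m in range(1, M):
            offsets[m] = offsets[m - 1] + p[j][m - 1]
        # earliest start on machine 0 such that no operation ever waits and
        # every machine has finished its previous job plus this job's setup
        t0 = max(free[m] + s[j][m] - offsets[m] for m in range(M))
        t0 = max(t0, 0.0)
        for m in range(M):
            free[m] = t0 + offsets[m] + p[j][m]
        completions.append(free[M - 1])
    return completions

# Toy instance: 3 jobs, 2 machines.
p = [[2, 3], [4, 1], [3, 2]]
s = [[1, 1], [1, 2], [2, 1]]
c = no_wait_schedule([0, 1, 2], p, s)
total_completion_time = sum(c)        # the objective being minimized
makespan = c[-1]                      # the constrained quantity
```

Heuristics for the problem then amount to searching over sequences, keeping only those whose makespan respects the bound while minimizing the completion-time sum.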
Procedia PDF Downloads 338
1658 Natural Language Processing for the Classification of Social Media Posts in Post-Disaster Management
Authors: Ezgi Şendil
Abstract:
Information extracted from social media has received great attention since it has become an effective alternative for collecting people’s opinions and emotions about specific experiences in a faster and easier way. The paper aims to organize the data in a meaningful way in order to analyze users’ posts and characterize their experiences and opinions during and after natural disasters. The posts collected from Reddit are classified into nine different categories, including injured/dead people, infrastructure and utility damage, missing/found people, donation needs/offers, caution/advice, and emotional support, identified by using labelled Twitter data and four different machine learning (ML) classifiers.
Keywords: disaster, NLP, post-disaster management, sentiment analysis
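The classification step can be illustrated with a minimal text classifier. The sketch below is a multinomial naive Bayes with Laplace smoothing — one simple stand-in for the study's four ML classifiers, which the abstract does not name — and the tiny two-category corpus is invented for illustration.

```python
from collections import Counter, defaultdict
import math

# Invented toy posts for two of the nine categories.
train = [
    ("bridge collapsed and power lines are down", "infrastructure_damage"),
    ("roads flooded and the water main broke", "infrastructure_damage"),
    ("we are collecting blankets and food to donate", "donation"),
    ("please donate water and clothes at the shelter", "donation"),
]

class_counts = Counter(label for _, label in train)
word_counts = defaultdict(Counter)
vocab = set()
for text, label in train:
    for w in text.split():
        word_counts[label][w] += 1
        vocab.add(w)

def predict(text):
    """Multinomial naive Bayes with add-one (Laplace) smoothing."""
    best_label, best_score = None, -math.inf
    for label in class_counts:
        score = math.log(class_counts[label] / len(train))   # log prior
        total = sum(word_counts[label].values())
        for w in text.split():
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

A production pipeline would add tokenization, stop-word handling and a held-out evaluation, but the scoring logic is the same.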
Procedia PDF Downloads 74
1657 The Financial Impact of Covid 19 on the Hospitality Industry in New Zealand
Authors: Kay Fielden, Eelin Tan, Lan Nguyen
Abstract:
In this research project, data was gathered at a Covid 19 Conference held in June 2021 from industry leaders who discussed the impact of the global pandemic on the status of the New Zealand hospitality industry. Panel discussions on financials, human resources, health and safety, and recovery were conducted. The themes explored for the finance panel were customer demographics, hospitality sectors, financial practices, government impact, and cost of compliance. The aim was to see how the hospitality industry has responded to the global pandemic and the steps that have been taken for the industry to recover or sustain their business. The main research question for this qualitative study is: What are the factors that have impacted on finance for the hospitality industry in New Zealand due to Covid 19? For financials, literature has been gathered to study global effects, and this is being compared with the data gathered from the discussion panel through the lens of resilience theory. Resilience theory applied to the hospitality industry suggests that the challenges imposed by Covid 19 have been the catalyst for government initiatives, technical innovation, engaging local communities, and boosting confidence. Transformation arising from these ground shifts have been a move towards sustainability, wellbeing, more awareness of climate change, and community engagement. Initial findings suggest that there has been a shift in customer base that has prompted regional accommodation providers to realign offers and to become more flexible to attract and maintain this realigned customer base. Dynamic pricing structures have been required to meet changing customer demographics. Flexible staffing arrangements include sharing staff between different accommodation providers, owners with multiple properties adopting different staffing arrangements, maintaining a good working relationship with the bank, and conserving cash. 
Uncertain times necessitate changing revenue strategies to cope with external factors. Financial support offered by the government has cushioned the financial downturn for many in the hospitality industry, and managed isolation and quarantine (MIQ) arrangements have offered immediate financial relief for the hotels involved. However, there is concern over the long-term effects. Compliance with mandated health and safety requirements has meant that the hospitality industry has streamlined its approach to meeting those requirements and has invested in customer relations to keep paying customers informed of the health measures in place. Initial findings from this study lie within the resilience theory framework and are consistent with findings from the literature.
Keywords: global pandemic, hospitality industry, New Zealand, resilience
Procedia PDF Downloads 100