Search results for: time series data mining

35129 The Effect of Sowing Time on Phytopathogenic Characteristics and Yield of Sunflower Hybrids

Authors: Adrienn Novák

Abstract:

The field research was carried out at the Látókép AGTC KIT research area of the University of Debrecen in eastern Hungary, on the aeolian loess of the Hajdúság. We examined the effects of sowing time on the phytopathogenic characteristics and yield by applying various fertilizer treatments to two sunflower genotypes (NK Ferti, PR64H42) in 2012 and 2013. We applied three sowing times (early, optimal, late) and two fungicide treatment levels (control = no fungicide applied, double fungicide protection). The studied crop years differed in their optimal sowing time in terms of yield (2012: early, 2013: average). Pearson's correlation analysis showed that delaying the sowing time markedly decreased the extent of infection in both crop years (Diaporthe: r=0.663**, r=0.681**; Sclerotinia: r=0.465**, r=0.622**). The fungicide treatment not only decreased the extent of infection but also had a yield-increasing effect (2012: r=0.498**, 2013: r=0.603**). In 2012, delaying the sowing time increased the yield (r=0.600**), whereas in 2013 it decreased it (r=0.356*).
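As a minimal illustration of the Pearson correlation analysis reported above, a sketch in Python with hypothetical values (not the study data; the sign of r depends on how sowing time and infection are coded):

# Minimal sketch of a Pearson correlation between sowing-time delay and
# infection severity; the values below are hypothetical, not the study data.
from scipy.stats import pearsonr

sowing_delay_days = [0, 0, 7, 7, 14, 14, 21, 21]        # early -> late sowing
infection_severity = [38, 41, 30, 33, 22, 25, 15, 17]   # % of infected plants

r, p_value = pearsonr(sowing_delay_days, infection_severity)
print(f"r = {r:.3f}, p = {p_value:.4f}")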

Keywords: fungicide treatment, genotypes, sowing time, yield, sunflower

Procedia PDF Downloads 195
35128 The Colorectal Cancer in Patients of Eastern Algeria

Authors: S. Tebibel, C. Mechati, S. Messaoudi

Abstract:

Algeria is currently experiencing the same rate of cancer progression as that registered in recent years in Western countries. Colorectal cancer, increasingly a major public health problem, is the most common form of cancer after breast and cervical cancer in women and prostate cancer in men. Our work is based on a retrospective study of colorectal cancer cases across eastern Algeria. Our goal is to carry out an epidemiological, histological, and immunohistochemical study to investigate the different techniques used for the diagnosis of colorectal cancer and their respective value and specificity in detecting the disease. The study includes 110 patients (aged 20 to 87 years) with colorectal cancer, for whom inclusion and exclusion criteria were established. In our study, colorectal cancer shows a male predominance, with a sex ratio of 1.99, and the most affected age group is 50 to 59 years. We noted that the colon cancer rate is higher than the rectal cancer rate, with frequencies of 60.91% and 39.09%, respectively. In the colon cancer series, Lieberkühn adenocarcinoma (ADK) is histologically the most frequent type, accounting for 85.07% of all cases, whereas mucinous (mucoid colloid) ADK represents only 1.49%. Well-differentiated ADKs predominate in our series, representing 83.58% of cases; moderately and poorly differentiated adenocarcinomas account for 2.99% and 0.05%, respectively. Among the histological varieties of rectal ADK, Lieberkühn ADK is the most common form in our series, at 76.74%, while mucoid colloid ADK accounts for 13.95%. Screening for mutations in the gene encoding K-ras, a major step in the targeted therapy of colorectal cancers, is underway in our study. Colorectal cancer is the subject of much promising research: the evaluation of new therapies (anti-angiogenic monoclonal antibodies), the search for predictors of sensitivity to chemotherapy, and new prognostic markers using molecular biology and proteomics techniques.

Keywords: adenocarcinoma, age, colorectal cancer, epidemiology, histological section, sex

Procedia PDF Downloads 328
35127 A Method to Estimate Wheat Yield Using Landsat Data

Authors: Zama Mahmood

Abstract:

With the increasing demand for food, monitoring crop growth and forecasting yield well before harvest are very important for food management. These days, yield assessment, together with monitoring of crop development and growth, is carried out with the help of satellite and remote sensing images. Studies using remote sensing data along with field survey validation have reported high correlations between vegetation indices and yield. With the development of remote sensing techniques, the detection of crops and their growth using remote sensing data on regional or global scales has become a popular topic in remote sensing applications. Punjab, especially the southern Punjab region, is extremely favourable for wheat production, but measuring the exact amount of wheat production is a tedious job for farmers and workers using traditional ground-based measurements, whereas remote sensing can provide near-real-time information. In this study, using the Normalized Difference Vegetation Index (NDVI) derived from Landsat satellite images, the wheat yield was estimated for the 2013-2014 season for the agricultural area around Bahawalpur. The average wheat yield was found to be 35 kg/acre by analysing field survey data. The field survey data are in fair agreement with the NDVI values extracted from the Landsat images. A correlation between wheat production (tons) and the number of wheat pixels has also been calculated, showing a proportional relationship between the two. A strong correlation between NDVI and wheat area was also found (R²=0.71), which demonstrates the effectiveness of remote sensing tools for crop monitoring and production estimation.
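A minimal sketch of the NDVI computation described above, assuming the red and near-infrared bands of a Landsat scene have already been read into NumPy arrays (the array values and the wheat threshold are placeholders, not the study data):

import numpy as np

# Placeholder reflectance arrays for the red and near-infrared (NIR) bands;
# in practice these are read from the Landsat band files.
red = np.array([[0.12, 0.10], [0.15, 0.30]])
nir = np.array([[0.45, 0.50], [0.40, 0.31]])

# NDVI = (NIR - Red) / (NIR + Red); a small epsilon avoids division by zero.
ndvi = (nir - red) / (nir + red + 1e-9)

# Pixels above a chosen NDVI threshold are treated as wheat (threshold illustrative only).
wheat_mask = ndvi > 0.4
print("mean NDVI over wheat pixels:", float(ndvi[wheat_mask].mean()))
print("number of wheat pixels:", int(wheat_mask.sum()))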

Keywords: landsat, NDVI, remote sensing, satellite images, yield

Procedia PDF Downloads 317
35126 Effect of Addition Rate of Expansive Additive on Autogenous Shrinkage and Delayed Expansion of Ultra-High Strength Mortar

Authors: Yulu Zhang, Atushi Teramoto, Taka-Aki Ohkubo

Abstract:

In this study, the effect of an expansive additive on the autogenous shrinkage and delayed expansion of ultra-high strength mortar was explored. The specimens were made of ultra-high strength mortar mixed with an ettringite-lime composite type expansive additive, and two series of experiments were conducted with them. The experimental results confirmed that the autogenous shrinkage of the specimens was effectively decreased by increasing the proportion of the expansive additive. On the other hand, for the specimens with 7% expansive additive that were cured for seven days at a constant temperature of 20°C and then cured for a long time in an underwater, moist (relative humidity: 100%), or dry air (relative humidity: 60%) environment, excessively large expansion strain occurred. Specifically, typical turtle-shell-like swelling expansion cracks were confirmed in the specimens that underwent long-term curing in the underwater and moist environments. According to the results of the hydration analysis, the formation of the expansive substances calcium hydroxide and AFt (alumina-ferric oxide-trisulfate) contributes to the occurrence of delayed expansion.

Keywords: ultra-high strength mortar, expansive additive, autogenous shrinkage, delayed expansion

Procedia PDF Downloads 226
35125 Assessing of Social Comfort of the Russian Population with Big Data

Authors: Marina Shakleina, Konstantin Shaklein, Stanislav Yakiro

Abstract:

The digitalization of modern human life over the last decade has facilitated the acquisition, storage, and processing of data, which are used to detect changes in consumer preferences and to improve the internal efficiency of the production process. This emerging trend has attracted academic interest in the use of big data in research. The study focuses on modeling the social comfort of the Russian population for the period 2010-2021 using big data. Big data provides enormous opportunities for understanding human interactions at the scale of society, with rich spatial and temporal dynamics. One of the most popular big data sources is Google Trends. The methodology for assessing social comfort using big data involves several steps: 1. 574 words were selected based on the Harvard IV-4 Dictionary, adjusted to fit the reality of everyday Russian life. The set of keywords was further cleansed by excluding queries consisting of verbs and words with several lexical meanings. 2. Search queries were processed to ensure comparability of results: transformation of the data to a 10-point scale, elimination of popularity peaks, detrending, and deseasonalization. The proposed methodology for keyword search and Google Trends processing was implemented as a script in the Python programming language. 3. Block and summary integral indicators of social comfort were constructed using the first modified principal component, whose loadings provide the weighting coefficients of the block components. According to the study, social comfort is described by 12 blocks: ‘health’, ‘education’, ‘social support’, ‘financial situation’, ‘employment’, ‘housing’, ‘ethical norms’, ‘security’, ‘political stability’, ‘leisure’, ‘environment’, and ‘infrastructure’. According to the model, the summary integral indicator increased by 54% to 4.631 points; the average annual growth rate was 3.6%, which exceeds the rate of economic growth by 2.7 p.p. The value of the indicator describing social comfort in Russia is determined 26% by ‘social support’, 24% by ‘education’, 12% by ‘infrastructure’, 10% by ‘leisure’, and the remaining 28% by the other blocks. Among the 25% most popular searches, 85% are negative in nature and are mainly related to the blocks ‘security’, ‘political stability’, and ‘health’ (for example, ‘crime rate’, ‘vulnerability’). Among the 25% least popular queries, 99% are positive and mostly related to the blocks ‘ethical norms’, ‘education’, and ‘employment’ (for example, ‘social package’, ‘recycling’). In conclusion, introducing the latent category ‘social comfort’ into the scientific vocabulary deepens quality-of-life theory by studying the involvement of the individual in society and by expanding the subjective aspect of the measurement of various indicators. The integral assessment of social comfort shows the overall development of the phenomenon over time and space and quantitatively evaluates ongoing socio-economic policy. The application of big data to the assessment of latent categories gives stable results, which opens up possibilities for practical implementation.
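The authors' Python script is not reproduced here; a minimal sketch of the kind of pre-processing described in step 2 (rescaling to a 10-point scale, clipping popularity peaks, detrending, and removing a monthly seasonal component), using pandas on a hypothetical weekly Google Trends series:

import pandas as pd
import numpy as np

# Hypothetical weekly Google Trends popularity series (0-100 scale).
idx = pd.date_range("2010-01-03", periods=600, freq="W")
raw = pd.Series(np.random.default_rng(0).integers(10, 90, len(idx)), index=idx)

# 1. Rescale from the 0-100 Trends scale to a 10-point scale.
scaled = raw / 10.0

# 2. Clip popularity peaks (here: cap at the 99th percentile).
clipped = scaled.clip(upper=scaled.quantile(0.99))

# 3. Detrend by subtracting a centred 52-week rolling mean.
trend = clipped.rolling(window=52, center=True, min_periods=1).mean()
detrended = clipped - trend

# 4. Deseasonalize by subtracting the average value of each calendar month.
monthly_means = detrended.groupby(detrended.index.month).transform("mean")
deseasonalized = detrended - monthly_means
print(deseasonalized.describe())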

Keywords: big data, Google trends, integral indicator, social comfort

Procedia PDF Downloads 185
35124 The Application of Conceptual Metaphor Theory to the Treatment of Depression

Authors: Uma Kanth, Amy Cook

Abstract:

Conceptual Metaphor Theory (CMT) proposes that metaphor is fundamental to human thought. CMT draws on embodied cognition, in that emotions are conceptualized as effects on the body through a coupling of one's bodily experiences and one's somatosensory system. Time perception is a function of embodied cognition and conceptual metaphor, in that one's experience of time is inextricably dependent on one's perception of the surrounding world. A hallmark of depressive disorders is distortion of time perception, associated with neurological dysfunction and psychomotor retardation, and yet, to the authors' best knowledge, previous studies have not linked CMT, embodied cognition, and depressive disorders. Therefore, the focus of this paper is to investigate how applications of CMT and embodied cognition (especially regarding time perception) show promise in improving current techniques for treating depressive disorders. Through a thorough review of the literature, this paper aims to extend the theoretical basis required for further research into the application of CMT and embodied cognition in treating time-distortion-related symptoms of depressive disorders. Future research could include the development of brain training technologies that capitalize on the principles of CMT, with the aim of promoting cognitive remediation and cognitive activation to mitigate symptoms of depressive disorders.

Keywords: depression, conceptual metaphor theory, embodied cognition, time

Procedia PDF Downloads 149
35123 Comparison of Interactive Performance of Clicking Tasks Using Cursor Control Devices under Different Feedback Modes

Authors: Jinshou Shi, Xiaozhou Zhou, Yingwei Zhou, Tuoyang Zhou, Ning Li, Chi Zhang, Zhanshuo Zhang, Ziang Chen

Abstract:

In order to select the optimal interaction method for common computer clicking tasks, a click experiment was carried out following the ISO 9241-9 task paradigm, using four common input methods (mouse, trackball, touch, and eye control) under visual feedback, auditory feedback, and no feedback. Analysis of movement time, throughput, and accuracy shows that touch control has the shortest movement time, higher accuracy and throughput than the other methods, and the best overall performance. In addition, the movement time of the click operation with auditory feedback is significantly lower than with the other two feedback modes in each input-method experiment. Regarding the size of the click target, it was found that when the target is too small (less than 14 px), click performance is reduced in all respects, so it is proposed that interface buttons be designed no smaller than 28 px. We discuss in detail the advantages and disadvantages of the input and feedback methods, and the results for the click operation can be applied to the design of buttons in interactive interfaces.
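The abstract does not spell out its throughput computation; the standard ISO 9241-9 effective throughput for a pointing task is sketched below in Python with hypothetical trial values:

import math
import statistics

# Hypothetical trial data for one condition: target distance D and width W
# (pixels), movement times MT (seconds), and endpoint deviations from the
# target centre (pixels). These values are illustrative only.
D, W = 256.0, 28.0
movement_times = [0.61, 0.58, 0.66, 0.70, 0.63]
endpoint_errors = [3.1, -2.4, 1.8, -0.9, 2.6]

# Effective width from the spread of endpoints (ISO 9241-9 convention).
W_e = 4.133 * statistics.stdev(endpoint_errors)

# Effective index of difficulty (bits) and throughput (bits per second).
ID_e = math.log2(D / W_e + 1)
throughput = ID_e / statistics.mean(movement_times)
print(f"W_e = {W_e:.1f} px, ID_e = {ID_e:.2f} bits, TP = {throughput:.2f} bit/s")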

Keywords: cursor control performance, feedback, human computer interaction, throughput

Procedia PDF Downloads 180
35122 Compressible Lattice Boltzmann Method for Turbulent Jet Flow Simulations

Authors: K. Noah, F.-S. Lien

Abstract:

In Computational Fluid Dynamics (CFD) there is a variety of numerical methods, some of which rely on macroscopic model representations that can be solved by finite-volume, finite-element, or finite-difference methods. The lattice Boltzmann method (LBM), in contrast, is considered a mesoscopic particle method, with its scale lying between the macroscopic and microscopic scales. The LBM works well for solving incompressible flow problems, but certain limitations arise when solving compressible flows, particularly at high Mach numbers. An improved lattice Boltzmann model for compressible flow problems is presented in this research study. A higher-order Taylor series expansion of the Maxwell equilibrium distribution function is used to overcome the limitations of the LBM when solving high-Mach-number flows. Large eddy simulation (LES) is implemented in the LBM to simulate turbulent jet flows. The results have been validated against available experimental data for turbulent compressible free jet flow at subsonic speeds.
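The abstract does not reproduce the expansion it uses; a commonly used third-order Taylor expansion of the Maxwell equilibrium distribution, which extends the usual second-order form by two cubic velocity terms, reads

\[
f_i^{\mathrm{eq}} = w_i \rho \left[ 1
+ \frac{\mathbf{c}_i \cdot \mathbf{u}}{c_s^{2}}
+ \frac{(\mathbf{c}_i \cdot \mathbf{u})^{2}}{2 c_s^{4}}
- \frac{\mathbf{u} \cdot \mathbf{u}}{2 c_s^{2}}
+ \frac{(\mathbf{c}_i \cdot \mathbf{u})^{3}}{6 c_s^{6}}
- \frac{(\mathbf{c}_i \cdot \mathbf{u})(\mathbf{u} \cdot \mathbf{u})}{2 c_s^{4}}
\right],
\]

where w_i are the lattice weights, c_i the discrete lattice velocities, u the macroscopic velocity, and c_s the lattice speed of sound; whether the paper truncates at this order or higher is not stated in the abstract.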

Keywords: compressible lattice Boltzmann method, multiple relaxation times, large eddy simulation, turbulent jet flows

Procedia PDF Downloads 258
35121 Predicting Provider Service Time in Outpatient Clinics Using Artificial Intelligence-Based Models

Authors: Haya Salah, Srinivas Sharan

Abstract:

Healthcare facilities use appointment systems to schedule appointments and manage access to their medical services. With the growing demand for outpatient care, it is now imperative to manage physicians' time effectively. However, high variation in consultation duration affects the clinical scheduler's ability to estimate appointment duration and allocate provider time appropriately. Underestimating consultation times can lead to physician burnout, misdiagnosis, and patient dissatisfaction; on the other hand, appointment durations that are longer than required lead to doctor idle time and fewer patient visits. Therefore, a good estimate of consultation duration has the potential to improve timely access to care, resource utilization, quality of care, and patient satisfaction. Although the literature on factors influencing consultation length abounds, little work has been done to predict it using data-driven approaches. This study therefore aims to predict consultation duration using supervised machine learning (ML) algorithms, which predict an outcome variable (e.g., consultation duration) from features that potentially influence the outcome. In particular, ML algorithms learn from a historical dataset without being explicitly programmed and uncover the relationship between the features and the outcome variable. A subset of the data used in this study was obtained from the electronic medical records (EMR) of four outpatient clinics located in central Pennsylvania, USA; publicly available information on doctors' characteristics, such as gender and experience, was also extracted from online sources. This research develops three popular ML algorithms (deep learning, random forest, and gradient boosting machine) to predict the treatment time required for a patient and conducts a comparative analysis of their predictive performance. The findings indicate that ML algorithms have the potential to predict provider service time with superior accuracy. While the clinic's current experience-based estimation of appointment duration resulted in a mean absolute percentage error (MAPE) of 25.8%, the deep learning algorithm developed in this study yielded the best performance with a MAPE of 12.24%, followed by the gradient boosting machine (13.26%) and random forest (14.71%). This research also identified the critical variables affecting consultation duration to be patient type (new vs. established), doctor's experience, zip code, appointment day, and doctor's specialty. Moreover, several practical insights were obtained from the comparative analysis of the ML algorithms. The machine learning approach presented in this study can serve as a decision support tool and could be integrated into the appointment system for effectively managing patient scheduling.
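A minimal sketch of one of the three models compared above (a gradient boosting machine evaluated by MAPE), assuming a feature table has already been assembled from the EMR and the doctor-characteristics data; the file name and column names are hypothetical, not the study dataset:

import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

# Hypothetical assembled dataset; column names are illustrative only.
df = pd.read_csv("visits.csv")  # patient_type, dr_experience, zip, weekday, specialty, duration_min
X = pd.get_dummies(df.drop(columns="duration_min"))
y = df["duration_min"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Gradient boosting machine for consultation-duration regression.
gbm = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
gbm.fit(X_train, y_train)

mape = mean_absolute_percentage_error(y_test, gbm.predict(X_test))
print(f"MAPE: {mape:.2%}")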

Keywords: clinical decision support system, machine learning algorithms, patient scheduling, prediction models, provider service time

Procedia PDF Downloads 108
35120 Neural Network-based Risk Detection for Dyslexia and Dysgraphia in Sinhala Language Speaking Children

Authors: Budhvin T. Withana, Sulochana Rupasinghe

Abstract:

The problem of Dyslexia and Dysgraphia, two learning disabilities that affect reading and writing abilities, respectively, is a major concern for the educational system. Due to the complexity and uniqueness of the Sinhala language, these conditions are especially challenging for children who speak it. Traditional risk detection methods for Dyslexia and Dysgraphia frequently rely on subjective assessments, which makes broad screening difficult and time-consuming; as a result, diagnoses may be delayed and opportunities for early intervention may be lost. The project developed a hybrid model that uses several deep learning techniques to detect the risk of Dyslexia and Dysgraphia. Specifically, ResNet50, VGG16, and YOLOv8 were integrated to detect handwriting issues, and their outputs were fed into an MLP model along with several other inputs. The hyperparameters of the MLP model were fine-tuned using grid search cross-validation (GridSearchCV), which allowed the optimal values to be identified. This approach proved effective in accurately predicting the risk of Dyslexia and Dysgraphia, providing a valuable tool for early detection and intervention of these conditions. The ResNet50 model achieved an accuracy of 0.9804 on the training data and 0.9653 on the validation data; the VGG16 model achieved 0.9991 on the training data and 0.9891 on the validation data. The MLP model achieved a training accuracy of 0.99918 and a testing accuracy of 0.99223, with a loss of 0.01371. These results demonstrate that the proposed hybrid model achieves a high level of accuracy in predicting the risk of Dyslexia and Dysgraphia.
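The exact architecture is not given in the abstract; a minimal sketch of the final stage, an MLP whose hyperparameters are tuned with GridSearchCV over features combining the CNN-derived handwriting scores and other inputs (all feature values here are synthetic placeholders):

import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import GridSearchCV

# Hypothetical feature matrix: outputs of the ResNet50 / VGG16 / YOLOv8
# handwriting analyses plus other inputs, and binary risk labels.
rng = np.random.default_rng(0)
X = rng.random((200, 6))
y = rng.integers(0, 2, 200)

param_grid = {
    "hidden_layer_sizes": [(32,), (64, 32)],
    "alpha": [1e-4, 1e-3],
    "learning_rate_init": [1e-3, 1e-2],
}
search = GridSearchCV(MLPClassifier(max_iter=1000, random_state=0),
                      param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print("best params:", search.best_params_)
print("cross-validated accuracy:", search.best_score_)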

Keywords: neural networks, risk detection system, Dyslexia, Dysgraphia, deep learning, learning disabilities, data science

Procedia PDF Downloads 79
35119 Sync Consensus Algorithm: Trying to Reach an Agreement at Full Speed

Authors: Yuri Zinchenko

Abstract:

Recently, distributed storage systems have been used more and more in various aspects of everyday life. They provide such necessary properties as scalability, fault tolerance, and durability. At the same time, not only reliable but also fast data storage remains one of the most pressing issues in this area. That brings us to the consensus algorithm as one of the most important components, one that has a great impact on the functionality of a distributed system. This paper is the result of an analysis of several well-known consensus algorithms, such as Paxos and Raft. The algorithm it proposes, called Sync, encourages but does not insist on simultaneous writes to the nodes (which positively affects the overall write speed) and tries to minimize the system's idle time. This allows nodes to reach agreement on the system state in a shorter period, which is a critical factor for distributed systems. When developing Sync, much attention was also paid to such criteria as simplicity and intuitiveness, whose importance is difficult to overestimate.

Keywords: sync, consensus algorithm, distributed system, leader-based, synchronization

Procedia PDF Downloads 47
35118 Midface Trauma: Outpatient Follow-Up and Surgical Treatment Times

Authors: Divya Pathak, James Sloane

Abstract:

Surgical treatment of midface fractures should ideally occur within two weeks of injury, after which bony healing and consolidation make the repair more difficult for the operating surgeon. The oral and maxillofacial unit at the Royal Surrey Hospital is the tertiary referral center for maxillofacial trauma from five regional hospitals. This is a complete audit cycle of midface trauma referrals managed over a one-year period. The standard set was that clinical assessment of the midface fracture would take place in a consultant-led outpatient clinic within 7 days and, when indicated, surgical fixation would occur within 10 days of referral. Retrospective data were collected over one year (01/11/2018 - 31/12/2019). Three key changes were implemented: an IT referral mailbox, standardization of an on-call trauma table, and creation of a trauma theatre list. Re-audit was carried out over six months, completing the cycle. 283 midface fracture referrals were received, of which 22 patients needed surgical fixation. The average time from referral to outpatient follow-up improved from 14.5 days to 8.3 days, and the time from referral to surgery improved from 21.5 days to 11.6 days. The changes implemented in this audit significantly improved patient prioritization to appropriate outpatient clinics and shortened the time to surgical intervention.

Keywords: maxillofacial trauma, midface trauma, oral and maxillofacial surgery, surgery fixation

Procedia PDF Downloads 126
35117 150 KVA Multifunction Laboratory Test Unit Based on Power-Frequency Converter

Authors: Bartosz Kedra, Robert Malkowski

Abstract:

This paper describes and presents a laboratory test unit built on a 150 kVA power-frequency converter and the Simulink Real-Time platform. The assumptions and criteria that determine which load and generator types may be simulated using the device are presented, as well as the structure of the control algorithm. As the laboratory setup contains a transformer with a thyristor-controlled tap changer, a wider scope of setup capabilities is presented. Information about the communication interface used, the data maintenance and storage solution, and the Simulink Real-Time features employed is provided, together with a list and description of all measurements. The potential for modifications of the laboratory setup is evaluated. For rapid control prototyping, a dedicated environment, Simulink Real-Time, was used; the load-model functional unit controller is therefore based on a PC with I/O cards and Simulink Real-Time software. Simulink Real-Time was used to create real-time applications directly from Simulink models. In the next step, the applications were loaded onto a target computer connected to physical devices, which provided the opportunity to perform hardware-in-the-loop (HIL) tests as well as the aforementioned rapid control prototyping. With Simulink Real-Time, the Simulink models were extended with I/O card driver blocks, making it possible to generate real-time applications automatically and to perform interactive or automated runs on a dedicated target computer equipped with a real-time kernel, a multicore CPU, and I/O cards. Results of the laboratory tests are presented for different load configurations, including simulation of under-frequency load shedding, frequency- and voltage-dependent characteristics of groups of load units, time characteristics of groups of different load units in a chosen area, and arbitrary active and reactive power regulation based on a defined schedule.

Keywords: MATLAB, power converter, Simulink Real-Time, thyristor-controlled tap changer

Procedia PDF Downloads 306
35116 Remotely Sensed Data Fusion to Extract Vegetation Cover in the Cultural Park of Tassili, South of Algeria

Authors: Y. Fekir, K. Mederbal, M. A. Hammadouche, D. Anteur

Abstract:

The cultural park of the Tassili, occupying a large area of Algeria, is characterized by a rich vegetative biodiversity that must be preserved and managed both in time and space. Managing such a large area (as in the case of the Tassili) is complex and requires large amounts of data, most of which are spatially localized (DEM, satellite images, socio-economic information, etc.), so conventional and traditional methods are quite difficult to apply. Remote sensing, given its efficiency in environmental applications, has become an indispensable solution for this kind of study. Multispectral imaging sensors have been very useful over the last decade in many remote sensing applications; they can aid in several domains, such as the detection and identification of diverse surface targets, topographical details, and geological features. In this work, we try to extract vegetated areas using fusion techniques between data acquired from the sensor on board the Earth Observing-1 (EO-1) satellite and the Landsat ETM+ and TM sensors. We used images acquired over the Oasis of Djanet in the National Park of Tassili in the south of Algeria. Fusion techniques were applied to the obtained image to extract the vegetative fraction of the different land-use classes. We compare the results obtained in vegetation endmember extraction with vegetation indices calculated from both Hyperion and the other multispectral sensors.

Keywords: Landsat ETM+, EO1, data fusion, vegetation, Tassili, Algeria

Procedia PDF Downloads 417
35115 Spatial Behavioral Model-Based Dynamic Data-Driven Diagram Information Model

Authors: Chiung-Hui Chen

Abstract:

Diagrams and drawings are important means of communicating and reproducing architectural design. Due to the development of information and communication technology, professional thinking in architecture and interior design is also changing rapidly. In the design development process, diagrams always play a very important role. Based on diagram theories, this study observes and records the interactions between people and objects, objects and space, and space and time in a modern nuclear family. It constructs a diagram-based method to systematically and visually describe the space plan of a modern nuclear family oriented toward intelligent design, to assist designers in retrieving information and checking/reviewing event patterns of the past and present.

Keywords: digital diagram, information model, context aware, data analysis

Procedia PDF Downloads 323
35114 Infant and Young Child-Feeding Practices in Mongolia

Authors: Otgonjargal Damdinbaljir

Abstract:

Background: Infant feeding practices play a major role in determining the nutritional status of children and are associated with household socioeconomic and demographic factors. In 2010, Mongolia used the WHO 2008 edition of the indicators for assessing infant and young child feeding practices for the first time. Objective: To evaluate the feeding status of infants and young children under 2 years of age in Mongolia. Materials and Methods: The study was conducted by cluster random sampling. Data on the breastfeeding and complementary feeding of 350 infants and young children aged 0-23 months in 21 provinces of the country's 4 economic regions and the capital, Ulaanbaatar, were collected through questionnaires. Feeding status was analyzed according to the WHO 2008 edition of the indicators for assessing infant and young child feeding practices. Analysis of data: Survey data were analysed using PASW Statistics 18.0 and Epi Info 2000 software. For the calculation of overall measures for the entire survey sample, analyses were stratified by region. Age-specific feeding patterns were described using frequencies, proportions, and survival analysis. Logistic regression was carried out with feeding practice as the dependent variable and socio-demographic factors as independent variables. Simple proportions were calculated for each IYCF indicator, and differences in feeding practices between sexes and age groups, if any, were assessed using the chi-square test. Ethics: The Ethics Committee under the auspices of the Ministry of Health approved the study. Results: A total of 350 children aged 0-23 months were investigated. The rate of ever breastfeeding among children aged 0-23 months reached 98.2%, while the percentage of early initiation of breastfeeding was only 85.5%. The rates of exclusive breastfeeding under 6 months, continued breastfeeding at 1 year, and continued breastfeeding at 2 years were 71.3%, 74%, and 54.6%, respectively. The median time of introducing complementary food was the 6th month, and the weaning time was the 9th month. The rate of timely introduction of complementary food at 6-8 months was 80.3%. The rates of minimum dietary diversity, minimum meal frequency, and consumption of iron-rich or iron-fortified foods among children aged 6-23 months were 52.1%, 80.8% (663/813), and 30.1%, respectively. Conclusions: The main problems revealed by the study were the inadequate variety and frequency of complementary food and the low rate of consumption of iron-rich or iron-fortified foods. Our findings highlight the need to encourage mothers to enrich their traditional wheat-based complementary foods by adding more animal-source foods and vegetables.

Keywords: complementary feeding, early initiation of breastfeeding, exclusive breastfeeding, minimum meal frequency

Procedia PDF Downloads 461
35113 Application of Latent Class Analysis and Self-Organizing Maps for the Prediction of Treatment Outcomes for Chronic Fatigue Syndrome

Authors: Ben Clapperton, Daniel Stahl, Kimberley Goldsmith, Trudie Chalder

Abstract:

Chronic fatigue syndrome (CFS) is a condition characterised by chronic disabling fatigue and other symptoms that currently cannot be explained by any underlying medical condition. Although clinical trials support the effectiveness of cognitive behaviour therapy (CBT), the success rate for individual patients is modest. Patients vary in their response, and little is known about which factors predict or moderate treatment outcomes. The aim of the project is to develop a prediction model from patients' baseline characteristics, such as demographic, clinical, and psychological variables, which may predict the likely treatment outcome, provide guidance for clinical decision-making, and help clinicians recommend the best treatment. The project aims to identify subgroups of patients with similar baseline characteristics that are predictive of treatment effects, using modern cluster analyses and data mining machine learning algorithms. The characteristics of these groups will then be used to describe the types of individuals who benefit from a specific treatment. In addition, the results will provide a better understanding of for whom the treatment works. The suitability of different clustering methods for identifying subgroups of CFS patients and their responses to different treatments is compared.

Keywords: chronic fatigue syndrome, latent class analysis, prediction modelling, self-organizing maps

Procedia PDF Downloads 213
35112 Advanced Stability Criterion for Time-Delayed Systems of Neutral Type and Its Application

Authors: M. J. Park, S. H. Lee, C. H. Lee, O. M. Kwon

Abstract:

This paper investigates the stability problem for linear systems of neutral type with time-varying delay. By constructing various Lyapunov-Krasovskii functionals and utilizing some mathematical techniques, sufficient stability conditions for the systems are established in terms of linear matrix inequalities (LMIs), which can be easily solved by various effective optimization algorithms. Finally, some illustrative examples are given to show the effectiveness of the proposed criterion.
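The abstract does not state the system model explicitly; as a sketch, a typical neutral-type formulation that such LMI criteria address is

\[
\dot{x}(t) - C\,\dot{x}(t-\tau) = A\,x(t) + A_{d}\,x(t-h(t)), \qquad 0 \le h(t) \le h_{M}, \quad \dot{h}(t) \le \mu,
\]

where stability is certified by finding matrices P, Q, R \succ 0 such that a Lyapunov-Krasovskii functional of the form

\[
V(x_t) = x^{\top}(t) P x(t) + \int_{t-h(t)}^{t} x^{\top}(s) Q x(s)\,ds + \int_{t-\tau}^{t} \dot{x}^{\top}(s) R \dot{x}(s)\,ds
\]

has a negative-definite derivative along the system trajectories; this condition is then cast as an LMI feasibility problem.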

Keywords: neutral systems, time-delay, stability, Lyapunov method, LMI

Procedia PDF Downloads 335
35111 Comparison of Two Transcranial Magnetic Stimulation Protocols on Spasticity in Multiple Sclerosis - Pilot Study of a Randomized and Blind Cross-over Clinical Trial

Authors: Amanda Cristina da Silva Reis, Bruno Paulino Venâncio, Cristina Theada Ferreira, Andrea Fialho do Prado, Lucimara Guedes dos Santos, Aline de Souza Gravatá, Larissa Lima Gonçalves, Isabella Aparecida Ferreira Moretto, João Carlos Ferrari Corrêa, Fernanda Ishida Corrêa

Abstract:

Objective: To compare two protocols of transcranial magnetic stimulation (TMS) for quadriceps muscle spasticity in individuals diagnosed with multiple sclerosis (MS). Method: Clinical crossover study in which six adults diagnosed with MS and lower-limb spasticity were randomized to receive one session each of high-frequency (≥5 Hz) and low-frequency (≤1 Hz) TMS over the motor cortex (M1) hotspot for the quadriceps muscle, with a one-week interval between sessions. Spasticity was assessed with the Ashworth scale, and the latency (ms) of the motor evoked potential (MEP) and the central motor conduction time (CMCT) of the bilateral quadriceps muscle were analyzed. Assessments were performed before and after each intervention. The difference between groups was analyzed using the Friedman test, with a significance level of 0.05. Results: All statistical analyses were performed using SPSS Statistics version 26, with significance set at p<0.05; normality was checked with the Shapiro-Wilk test. Parametric data were presented as mean and standard deviation, non-parametric variables as median and interquartile range, and categorical variables as frequency and percentage. There was no clinical change in quadriceps spasticity assessed with the Ashworth scale for either limb under the 1 Hz (p=0.813) or 5 Hz (p=0.232) protocol. Motor evoked potential latency: in the 5 Hz protocol there was no significant change for the contralateral side from pre- to post-treatment (p>0.05), while for the ipsilateral side there was a decrease in latency of 0.07 seconds (p<0.05); in the 1 Hz protocol there was an increase of 0.04 seconds in latency (p<0.05) for the side contralateral to the stimulus and a decrease of 0.04 seconds (p<0.05) for the ipsilateral side, with a significant difference between the contralateral (p=0.007) and ipsilateral (p=0.014) groups. Central motor conduction time: in the 1 Hz protocol there was no change for either the contralateral or the ipsilateral side (p>0.05); in the 5 Hz protocol there was a small decrease for the contralateral side (p<0.05) and a decrease of 0.6 seconds for the ipsilateral side (p<0.05), with a significant difference between groups (p=0.019). Conclusion: A single high- or low-frequency session does not change spasticity, but with the low-frequency protocol there was an increase in latency on the stimulated side and a decrease on the non-stimulated side, suggesting that inhibiting the motor cortex increases cortical excitability on the opposite side.

Keywords: multiple sclerosis, spasticity, motor evoked potential, transcranial magnetic stimulation

Procedia PDF Downloads 70
35110 Cross Project Software Fault Prediction at Design Phase

Authors: Pradeep Singh, Shrish Verma

Abstract:

Software fault prediction models are created using source code, processed metrics from the same or a previous version of the code, and related fault data. Some companies do not store or track all the artifacts required for software fault prediction. To construct a fault prediction model for such a company, training data from other projects can be one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can be used as an initial guideline for projects where no previous fault data are available. We analyze seven data sets from the NASA Metrics Data Program, which offer design as well as code metrics. Overall, the results of cross-project learning are comparable to learning on within-company data.
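A minimal sketch of cross-project learning with Naïve Bayes, training on one project's design-phase metrics and testing on another project with no fault history; the file names and metric columns are illustrative stand-ins, not the actual NASA MDP feature set used in the paper:

import pandas as pd
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import classification_report

# Hypothetical per-module design metrics with fault labels for two projects.
design_cols = ["node_count", "edge_count", "branch_count", "call_pairs"]
source = pd.read_csv("project_A_design_metrics.csv")   # training project
target = pd.read_csv("project_B_design_metrics.csv")   # project with no fault history

# Train Naïve Bayes on the source project and evaluate on the target project.
model = GaussianNB()
model.fit(source[design_cols], source["defective"])

pred = model.predict(target[design_cols])
print(classification_report(target["defective"], pred))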

Keywords: software metrics, fault prediction, cross project, within project

Procedia PDF Downloads 325
35109 Determinants of Extra Charges for Container Shipments: A Case Study of Nexus Zone Logistics

Authors: Zety Shakila Binti Mohd Yusof, Muhammad Adib Bin Ishak, Hajah Fatimah Binti Hussein

Abstract:

The international shipping business is subject to numerous controls and regulations on export and import shipments. It is costly and time-consuming, and when something goes wrong or the buyer or seller fails to comply with the regulations, the result can be penalties, delays, and unexpected costs. For the focus of this study, the researchers selected a local forwarder that provides forwarding and clearance services, Nexus Zone Logistics. It was identified that this company currently pays many extra costs, including local and detention charges, which negatively impact the flow of income and reduce overall stability. Two variables were identified as factors behind the extra charges: loaded containers entering the port after the closing time, and late delivery of empty containers to the container yard. This study is qualitative in nature, and the secondary data collected were analyzed using self-administered observation. The findings cover one selected case for each of export and import shipments between July and December 2014. The data were analyzed using frequency analysis based on tables and graphs. The researcher recommends that Nexus Zone Logistics impose a 1% deposit payment per container for each shipment (export and import) on its customers.

Keywords: international shipping, export and import, detention charges, container shipment

Procedia PDF Downloads 366
35108 Monitoring Deforestation Using Remote Sensing And GIS

Authors: Tejaswi Agarwal, Amritansh Agarwal

Abstract:

The forest ecosystem plays a very important role in the global carbon cycle. It stores about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social, and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques over the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990-1995 cite 116,756 km²/yr globally. Remote sensing can be a very useful tool for monitoring forests and the associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, many of which are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the Indian Institute of Remote Sensing (IIRS), Dehradun, in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud-free and did not belong to the dry, leafless season. The Normalized Difference Vegetation Index (NDVI), defined as NDVI = (NIR - Red) / (NIR + Red), has been used as a numerical indicator of the reduction in ground biomass. After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals. With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories.

Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection

Procedia PDF Downloads 1163
35107 Transforming Data Science Curriculum Through Design Thinking

Authors: Samar Swaid

Abstract:

Today, corporations are moving toward the adoption of design thinking techniques to develop products and services, putting the consumer at the heart of the development process. One of the leading companies in design thinking, IDEO (Innovation, Design, Engineering Organization), defines design thinking as an approach to problem-solving that relies on a set of multi-layered skills, processes, and mindsets that help people generate novel solutions to problems. Design thinking may result in new ideas, narratives, objects, or systems. It is about redesigning systems, organizations, infrastructures, processes, and solutions in an innovative fashion based on users' feedback. Tim Brown, president and CEO of IDEO, sees design thinking as a human-centered approach that draws from the designer's toolkit to integrate people's needs, innovative technologies, and business requirements. The application of design thinking has been seen as the road to developing innovative applications, interactive systems, scientific software, and healthcare applications, and even to rethinking business operations, as in the case of Airbnb. Recently, there has been a movement to apply design thinking to machine learning and artificial intelligence to create the "wow" effect for consumers. The Association for Computing Machinery task force on the Data Science program states that "Data scientists should be able to implement and understand algorithms for data collection and analysis. They should understand the time and space considerations of algorithms. They should follow good design principles developing software, understanding the importance of those principles for testability and maintainability." However, this definition hides the user behind the machine who works on data preparation, algorithm selection, and model interpretation. Thus, the data science program includes design thinking to ensure that user demands are met, more usable machine learning tools are generated, and new ways of framing computational thinking are developed. Here, we describe the fundamentals of design thinking and teaching modules for data science programs.

Keywords: data science, design thinking, AI, curriculum, transformation

Procedia PDF Downloads 63
35106 Examination of How Do Smart Watches Influence the Market of Luxury Watches with Particular Regard of the Buying-Reasons

Authors: Christopher Benedikt Jakob

Abstract:

In today's society, there is no need to look at a wristwatch to know the exact time: smartphones, the clock in the car, or the computer clock inform us about the time as well. Yet for hundreds of years luxury watches have held a fascination for human beings. Consumers buy watches that cost thousands of euros, although they could buy much cheaper watches that also fulfill the function of indicating the correct time. This shows that functional value plays a minor role among the buying reasons for luxury watches. In recent years, people have shown an increased demand to track data such as their daily walking distance or their sleep, and smart watches enable consumers to obtain this information. There is a trend for people to optimise parts of their social life and thereby gain the impression that they are able to optimise themselves as human beings; with the help of smart watches, they can optimise parts of their productivity and realise their targets at the same time. These smart watches are also offered as luxury models, and the question is: how will customers of traditional luxury watches react? This study therefore intends to answer the question of why people are willing to spend enormous amounts of money on luxury watches. The self-expression model, the relationship basis model, the functional benefit representation model, and means-end theory are chosen as the methodology to find reasons why human beings purchase specific luxury watches and luxury smart watches. This evaluative approach further discusses these strategies with regard to, for example, whether consumers buy luxury watches or smart watches to express the current self or the ideal self, and whether human beings make decisions based on expected results. The research critically evaluates how relationships are compared on the basis of their advantages: luxury brands offer socio-emotional advantages such as the social function of identification, and the strong brand personality of luxury watches and luxury smart watches helps customers structure and retrieve brand awareness, which simplifies decision-making. One of the goals is to identify whether customers know why they like specific luxury watches and dislike others, although they are produced in the same country and cost comparable prices. The market for luxury watches, and especially for luxury smart watches, is clearly changing much faster than in the past; therefore, the research examines the market-changing parameters in detail.

Keywords: buying-behaviour, brand management, consumer, luxury watch, smart watch

Procedia PDF Downloads 193
35105 A Joint Possibilistic-Probabilistic Tool for Load Flow Uncertainty Assessment-Part I: Formulation

Authors: Morteza Aien, Masoud Rashidinejad, Mahmud Fotuhi-Firuzabad

Abstract:

As energy and environmental issues receive more and more attention around the world, the penetration of distributed energy resources (DERs), mainly those harvesting renewable energies (REs), is rising at an unprecedented rate. This causes more uncertainties to appear in the power system context; hence, uncertainty analysis of system performance is obligatory. The uncertainties of any system can be represented probabilistically or possibilistically. Since sufficient historical data are not available for all system variables, some of them do not have a probability density function (PDF) and must be represented possibilistically. When some of the uncertain system variables are probabilistic and some are possibilistic, neither the conventional pure probabilistic nor the pure possibilistic methods can be applied; hence, a combined solution is called for. The first paper of this two-paper series formulates a new possibilistic-probabilistic tool for load flow uncertainty assessment. The proposed methodology is based on evidence theory and the joint propagation of possibilistic and probabilistic uncertainties. This possibilistic-probabilistic formulation is solved in the second companion paper for an uncertain load flow (ULF) study problem.

Keywords: probabilistic uncertainty modeling, possibilistic uncertainty modeling, uncertain load flow, wind turbine generator

Procedia PDF Downloads 545
35104 Implementation of CNV-CH Algorithm Using Map-Reduce Approach

Authors: Aishik Deb, Rituparna Sinha

Abstract:

We have developed an algorithm to detect abnormal segments (structural variations) in the genome across a number of samples. We have worked on simulated as well as real data from BAM files and have designed a segmentation algorithm that detects abnormal segments. This algorithm aims to improve the accuracy and performance of the existing CNV-CH algorithm. The next-generation sequencing (NGS) approach is very fast and can generate large sequences in a reasonable time, so the huge volume of sequence information gives rise to the need for big data and parallel approaches to segmentation. We have therefore designed a map-reduce approach for the existing CNV-CH algorithm in which large amounts of sequence data can be segmented and structural variations in the human genome detected. We have compared the efficiency of the traditional and map-reduce algorithms with respect to precision, sensitivity, and F-score. The advantages of our algorithm are that it is fast and has better accuracy. It can be applied to detect structural variations within a genome, which in turn can be used to detect various genetic disorders such as cancer. The defects may be caused by new mutations or changes to the DNA and generally result in abnormally high or low base coverage and quantification values.
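The implementation is not shown in the abstract; a minimal map-reduce-style sketch in Python, in which per-chromosome coverage chunks are segmented in parallel (map) and the segment lists are merged (reduce), with a simple z-score rule standing in as a placeholder for the CNV-CH convex-hull segmentation step:

from functools import reduce
from multiprocessing import Pool
import numpy as np

def segment_chunk(item):
    # Map step: flag abnormal coverage positions in one chromosome's profile.
    # The z-score rule is a placeholder for the CNV-CH convex-hull test.
    chrom, coverage = item
    z = (coverage - coverage.mean()) / (coverage.std() + 1e-9)
    abnormal = np.where(np.abs(z) > 2.5)[0]
    return [(chrom, int(pos)) for pos in abnormal]

def merge(acc, segs):
    # Reduce step: concatenate the per-chromosome segment lists.
    return acc + segs

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic per-chromosome coverage profiles (placeholders for BAM-derived data).
    chunks = {f"chr{i}": rng.poisson(30, 10_000).astype(float) for i in range(1, 4)}
    with Pool() as pool:
        mapped = pool.map(segment_chunk, chunks.items())
    segments = reduce(merge, mapped, [])
    print(f"detected {len(segments)} candidate abnormal positions")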

Keywords: cancer detection, convex hull segmentation, map reduce, next generation sequencing

Procedia PDF Downloads 118
35103 NDVI as a Measure of Change in Forest Biomass

Authors: Amritansh Agarwal, Tejaswi Agarwal

Abstract:

The forest ecosystem plays a very important role in the global carbon cycle. It stores about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social, and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques over the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990-1995 cite 116,756 km²/yr globally. Remote sensing can be a very useful tool for monitoring forests and the associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, many of which are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the USGS website in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud- and aerosol-free by making use of the FLAASH atmospheric correction technique. The Normalized Difference Vegetation Index (NDVI), defined as NDVI = (NIR - Red) / (NIR + Red), has been used as a numerical indicator of the reduction in ground biomass. After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals. With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories.

Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection

Procedia PDF Downloads 381
35102 Effect of Measured and Calculated Static Torque on Instantaneous Torque Profile of Switched Reluctance Motor

Authors: Ali Asghar Memon

Abstract:

The simulation modeling of a switched reluctance (SR) machine often relies on three data tables of static characteristics: flux linkage characteristics, co-energy characteristics, and static torque characteristics. It has been noticed that, as far as the literature is concerned, the static torque data used in simulation models are usually calculated rather than measured. This paper presents a simulation model that includes measured and calculated static torque data separately, in order to see the effect of each on the instantaneous torque profile of the machine. As far as the literature review is concerned, this is probably the first time that static torque derived from co-energy information and static torque measured directly from experiments are used separately in the same model. This research is helpful for the accurate modeling of switched reluctance drives.
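For context, the calculated static torque referred to above is conventionally derived from the co-energy as

\[
W'(\theta, i) = \int_{0}^{i} \lambda(\theta, i')\, di', \qquad
T(\theta, i) = \left. \frac{\partial W'(\theta, i)}{\partial \theta} \right|_{i=\text{const}},
\]

where \lambda is the flux linkage, i the phase current, and \theta the rotor position; the measured static torque, by contrast, is typically obtained directly from locked-rotor torque measurements at fixed currents.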

Keywords: static characteristics, current chopping, flux linkage characteristics, switched reluctance motor

Procedia PDF Downloads 276
35101 Time Variance and Spillover Effects between International Crude Oil Price and Ten Emerging Equity Markets

Authors: Murad A. Bein

Abstract:

This paper empirically examines the time-varying relationship and spillover effects between the international crude oil price and ten emerging equity markets, namely three oil-exporting countries (Brazil, Mexico, and Russia) and seven Central and Eastern European (CEE) countries (Bulgaria, Croatia, Czech Republic, Hungary, Poland, Romania, and Slovakia). The results reveal spillover effects from the oil market into almost all the emerging equity markets except Slovakia. Moreover, the oil supply glut had a homogeneous effect on the emerging markets, both the net oil-exporting countries and the oil-importing (CEE) countries. Further, the time variance drastically increased during financial turmoil; indeed, it remained high from 2009 to 2012 in response to aggregate demand shocks (the global financial crisis and the Eurozone debt crisis) and quantitative easing measures. Interestingly, the time variance was slightly higher for the oil-exporting countries than for some of the CEE countries. Decision-makers in emerging economies should therefore seek policy coordination when dealing with financial turmoil.

Keywords: crude oil, spillover effects, emerging equity, time-varying, aggregate demand shock

Procedia PDF Downloads 109
35100 Does Pakistan Stock Exchange Offer Diversification Benefits to Regional and International Investors: A Time-Frequency (Wavelets) Analysis

Authors: Syed Jawad Hussain Shahzad, Muhammad Zakaria, Mobeen Ur Rehman, Saniya Khaild

Abstract:

This study examines the co-movement between the Pakistani, Indian, S&P 500, and Nikkei 225 stock markets using weekly data from 1998 to 2013. The time-frequency relationship between the selected stock markets is analyzed using the continuous wavelet power spectrum, the cross-wavelet transform, and (squared) wavelet coherency. The empirical evidence suggests strong dependence between the Pakistani and Indian stock markets. The co-movement of the Pakistani index with the U.S. and Japanese developed markets varies over time and frequency, with the long-run relationship being dominant. The results of the cross-wavelet and wavelet coherence analysis indicate moderate covariance and correlation between the stock indexes, and the markets are in phase (i.e., cyclical in nature) over varying durations. The Pakistani stock market lagged the Indian stock market during the entire period, corresponding to the 8-32 and then 64-256 week scales. Similar findings are evident for the S&P 500 and Nikkei 225 indexes; however, the relationship occurs during the later part of the study period. All three wavelet indicators suggest strong evidence of higher co-movement during the 2008-09 global financial crisis. The empirical analysis reveals strong evidence that portfolio diversification benefits vary across frequencies and time. This analysis is unique and has several practical implications for regional and international investors when assigning optimal weights to different assets in portfolio formation.
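The squared wavelet coherency used above is not defined in the abstract; in the standard formulation it is

\[
R_n^{2}(s) = \frac{\left| S\!\left( s^{-1} W_n^{XY}(s) \right) \right|^{2}}
                  {S\!\left( s^{-1} \left| W_n^{X}(s) \right|^{2} \right)\,
                   S\!\left( s^{-1} \left| W_n^{Y}(s) \right|^{2} \right)},
\]

where W_n^{X} and W_n^{Y} are the continuous wavelet transforms of the two return series at time index n and scale s, W_n^{XY} = W_n^{X} (W_n^{Y})^{*} is the cross-wavelet transform, and S is a smoothing operator in time and scale; R_n^{2} ranges from 0 to 1 and behaves like a localized squared correlation in time-frequency space.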

Keywords: co-movement, Pakistan stock exchange, S&P 500, Nikkei 225, wavelet analysis

Procedia PDF Downloads 348