Search results for: time series feature extraction
20368 Two-Level Graph Causality to Detect and Predict Random Cyber-Attacks
Authors: Van Trieu, Shouhuai Xu, Yusheng Feng
Abstract:
Tracking attack trajectories can be difficult with limited information about the nature of the attack, and even more so when attack information is collected by Intrusion Detection Systems (IDSs), because current IDSs have limitations in identifying malicious and anomalous traffic. Moreover, IDSs only point out suspicious events; they do not show how the events relate to each other or which event possibly caused another to happen. It is therefore important to investigate new methods capable of tracking attack trajectories quickly, with less attack information and less dependency on IDSs, in order to prioritize actions during incident response. This paper proposes a two-level graph causality framework for tracking attack trajectories in internet networks by leveraging observable malicious behaviors to detect the attack events most likely to cause other events to occur in the system. Technically, given a time series of malicious events, the framework extracts events with useful features, such as attack time and port number, and applies conditional independence tests to detect relationships between attack events. Using academic datasets collected by IDSs, experimental results show that the framework can quickly detect causal pairs that offer meaningful insights into the nature of the internet network, given only reasonable restrictions on network size and structure. Without the framework's guidance, these insights could not be discovered by existing tools such as IDSs, and obtaining them would cost expert human analysts significant time, if it were possible at all. The computational results from the proposed two-level graph network model reveal clear patterns and trends. In fact, more than 85% of causal pairs have an average time difference between the cause and effect events, in both computed and observed data, of within 5 minutes. This result can be used as a preventive measure against future attacks. Although the forecast horizon may be short, from 0.24 seconds to 5 minutes, it is long enough to design a prevention protocol to block those attacks.
Keywords: causality, multilevel graph, cyber-attacks, prediction
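As a rough illustration of the kind of conditional independence test such a framework might apply to event time series, the sketch below tests whether two event-count series are independent given a third, using partial correlation with a Fisher z-transform. All variable names and the toy data are assumptions for illustration, not details from the paper.

```python
import numpy as np
from scipy import stats

def partial_corr_ci_test(x, y, z, alpha=0.05):
    """Test X independent of Y given Z via partial correlation.

    x, y: 1-D arrays of event features (e.g., per-interval attack counts);
    z: 2-D array of conditioning variables (may be empty).
    Returns (p_value, judged_independent).
    """
    n = len(x)
    if z.size:
        # Residualize x and y on z, then correlate the residuals.
        beta_x, *_ = np.linalg.lstsq(z, x, rcond=None)
        beta_y, *_ = np.linalg.lstsq(z, y, rcond=None)
        rx, ry = x - z @ beta_x, y - z @ beta_y
        k = z.shape[1]
    else:
        rx, ry, k = x, y, 0
    r = np.corrcoef(rx, ry)[0, 1]
    # Fisher z-transform: sqrt(n-k-3)*atanh(r) ~ N(0,1) under independence.
    fisher_z = np.sqrt(n - k - 3) * np.arctanh(r)
    p = 2 * (1 - stats.norm.cdf(abs(fisher_z)))
    return p, p > alpha

# Toy usage: counts of two event types per 1-minute bin, conditioned on a third.
rng = np.random.default_rng(0)
cause = rng.poisson(3, 500).astype(float)
effect = cause + rng.normal(0, 1, 500)      # 'effect' depends on 'cause'
other = rng.poisson(2, 500).astype(float)   # unrelated event series
print(partial_corr_ci_test(cause, effect, other.reshape(-1, 1)))
```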
Procedia PDF Downloads 156
20367 A Weighted Approach to Unconstrained Iris Recognition
Authors: Yao-Hong Tsai
Abstract:
This paper presents a weighted approach to unconstrained iris recognition. Nowadays, commercial systems are usually characterized by strong acquisition constraints that rely on the subject's cooperation. However, such cooperation is not always achievable in real scenarios of daily life. Researchers have therefore focused on reducing these constraints while maintaining system performance through new techniques. To cope with large variation in the environment, the proposed iris recognition system incorporates two main improvements. First, to handle extremely uneven lighting conditions, statistics-based illumination normalization is applied to the eye region to increase the accuracy of iris features; detection of the iris image is based on the AdaBoost algorithm. Second, a weighting scheme is designed using Gaussian functions of the distance to the center of the iris. A local binary pattern (LBP) histogram is then applied to texture classification with these weights. Experiments showed that the proposed system provides users a more flexible and feasible way to interact with the verification system through iris recognition.
Keywords: authentication, iris recognition, adaboost, local binary pattern
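A minimal sketch of the Gaussian-weighted LBP histogram idea, assuming a grey-scale iris patch and a known iris center; the weighting function, parameters, and synthetic input are illustrative, not the authors' implementation.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def weighted_lbp_histogram(iris_gray, center, sigma, P=8, R=1):
    """LBP histogram where each pixel's vote is Gaussian-weighted by its
    distance to the iris center (a sketch of the weighting idea)."""
    lbp = local_binary_pattern(iris_gray, P, R, method="uniform")
    n_bins = P + 2                              # uniform LBP codes: 0..P+1
    rows, cols = np.indices(iris_gray.shape)
    d2 = (rows - center[0]) ** 2 + (cols - center[1]) ** 2
    w = np.exp(-d2 / (2.0 * sigma ** 2))        # Gaussian weight per pixel
    hist = np.zeros(n_bins)
    for b in range(n_bins):
        hist[b] = w[lbp == b].sum()             # weighted vote per LBP code
    return hist / hist.sum()

# Hypothetical usage with a synthetic 64x64 patch centered at (32, 32).
patch = (np.random.rand(64, 64) * 255).astype(np.uint8)
print(weighted_lbp_histogram(patch, center=(32, 32), sigma=20.0).round(3))
```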
Procedia PDF Downloads 223
20366 A Study on the Application of Machine Learning and Deep Learning Techniques for Skin Cancer Detection
Authors: Hritwik Ghosh, Irfan Sadiq Rahat, Sachi Nandan Mohanty, J. V. R. Ravindra
Abstract:
In the rapidly evolving landscape of medical diagnostics, the early detection and accurate classification of skin cancer remain paramount for effective treatment outcomes. This research delves into the transformative potential of Artificial Intelligence (AI), specifically Deep Learning (DL), as a tool for discerning and categorizing various skin conditions. Utilizing a diverse dataset of 3,000 images representing nine distinct skin conditions, we confront the inherent challenge of class imbalance. This imbalance, where conditions like melanomas are over-represented, is addressed by incorporating class weights during the model training phase, ensuring an equitable representation of all conditions in the learning process. Our pioneering approach introduces a hybrid model, amalgamating the strengths of two renowned Convolutional Neural Networks (CNNs), VGG16 and ResNet50. These networks, pre-trained on the ImageNet dataset, are adept at extracting intricate features from images. By synergizing these models, our research aims to capture a holistic set of features, thereby bolstering classification performance. Preliminary findings underscore the hybrid model's superiority over individual models, showcasing its prowess in feature extraction and classification. Moreover, the research emphasizes the significance of rigorous data pre-processing, including image resizing, color normalization, and segmentation, in ensuring data quality and model reliability. In essence, this study illuminates the promising role of AI and DL in revolutionizing skin cancer diagnostics, offering insights into its potential applications in broader medical domains.
Keywords: artificial intelligence, machine learning, deep learning, skin cancer, dermatology, convolutional neural networks, image classification, computer vision, healthcare technology, cancer detection, medical imaging
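A hedged sketch of the described hybrid: VGG16 and ResNet50, pre-trained on ImageNet, used as frozen feature extractors whose pooled outputs are concatenated, with class weights passed at training time. Layer sizes, per-class counts, and preprocessing details are assumptions; the paper's exact architecture may differ.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import VGG16, ResNet50

NUM_CLASSES = 9  # nine skin conditions, as in the abstract

inputs = layers.Input(shape=(224, 224, 3))
vgg = VGG16(weights="imagenet", include_top=False, pooling="avg")
res = ResNet50(weights="imagenet", include_top=False, pooling="avg")
vgg.trainable = False   # both backbones act as frozen feature extractors
res.trainable = False   # (per-backbone preprocessing omitted for brevity)

# Concatenate the two ImageNet feature vectors and classify.
features = layers.Concatenate()([vgg(inputs), res(inputs)])
x = layers.Dense(256, activation="relu")(features)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Class weights counteract the imbalance (these counts are placeholders).
counts = {c: n for c, n in enumerate([800, 120, 260, 300, 150, 400, 220, 180, 570])}
total = sum(counts.values())
class_weight = {c: total / (len(counts) * n) for c, n in counts.items()}
# model.fit(train_ds, epochs=10, class_weight=class_weight)
```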
Procedia PDF Downloads 85
20365 TMIF: Transformer-Based Multi-Modal Interactive Fusion for Rumor Detection
Authors: Jiandong Lv, Xingang Wang, Cuiling Shao
Abstract:
The rapid development of social media platforms has made them an important news source. While they provide convenient real-time communication channels, fake news and rumors also spread rapidly through social media, misleading the public and even causing harmful social impact, while manual rumor detection remains slow and inconsistent. We propose an end-to-end rumor detection model, TMIF, which captures the dependencies between multimodal data based on an interactive attention mechanism, uses a transformer for cross-modal feature sequence mapping, and combines hybrid fusion strategies to obtain decision results. Experiments on two multimodal rumor detection datasets verify the superior performance and early-detection capability of the proposed model.
Keywords: hybrid fusion, multimodal fusion, rumor detection, social media, transformer
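A small PyTorch sketch of one plausible reading of the interactive attention idea: each modality attends to the other, and the two attended views are fused for the decision. Dimensions, pooling, and the two-class head are assumptions, not the published TMIF architecture.

```python
import torch
import torch.nn as nn

class CrossModalFusion(nn.Module):
    """Sketch of interactive cross-modal attention: text attends to image
    features and vice versa, then both views are fused for classification."""
    def __init__(self, d_model=256, n_heads=4):
        super().__init__()
        self.txt2img = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.img2txt = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.classifier = nn.Linear(2 * d_model, 2)  # rumor / non-rumor

    def forward(self, text_seq, image_seq):
        # Each modality queries the other (the "interactive" direction).
        t_att, _ = self.txt2img(text_seq, image_seq, image_seq)
        i_att, _ = self.img2txt(image_seq, text_seq, text_seq)
        fused = torch.cat([t_att.mean(dim=1), i_att.mean(dim=1)], dim=-1)
        return self.classifier(fused)

# Hypothetical shapes: batch of 8, 32 text tokens, 49 image patches, dim 256.
model = CrossModalFusion()
logits = model(torch.randn(8, 32, 256), torch.randn(8, 49, 256))
print(logits.shape)  # torch.Size([8, 2])
```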
Procedia PDF Downloads 244
20364 Towards Real-Time Classification of Finger Movement Direction Using Encephalography Independent Components
Authors: Mohamed Mounir Tellache, Hiroyuki Kambara, Yasuharu Koike, Makoto Miyakoshi, Natsue Yoshimura
Abstract:
This study explores the practicality of using electroencephalographic (EEG) independent components to predict eight-direction finger movements in pseudo-real-time. Six healthy participants with individual-head MRI images performed finger movements in eight directions with two different arm configurations. The analysis was performed in two stages. The first stage consisted of using independent component analysis (ICA) to separate the signals representing brain activity from non-brain activity signals and to obtain the unmixing matrix. The resulting independent components (ICs) were checked, and those reflecting brain activity were selected. Finally, the time series of the selected ICs were used to predict the eight finger-movement directions using sparse logistic regression (SLR). The second stage consisted of using the previously obtained unmixing matrix, the selected ICs, and the model obtained by applying SLR to classify a different EEG dataset. This method was applied in two different settings, namely the single-participant level and the group level. At the single-participant level, the EEG datasets used in the first and second stages originated from the same participant. At the group level, the EEG datasets used in the first stage were constructed by temporally concatenating each combination, without repetition, of the EEG datasets of five participants out of six, whereas the EEG dataset used in the second stage originated from the remaining participant. The average test classification results across datasets (mean ± S.D.) were 38.62 ± 8.36% for the single-participant level, significantly higher than the chance level (12.50 ± 0.01%), and 27.26 ± 4.39% for the group level, also significantly higher than the chance level (12.49 ± 0.01%). The classification accuracy within [–45°, 45°] of the true direction was 70.03 ± 8.14% for the single-participant level and 62.63 ± 6.07% for the group level, which may be promising for some real-life applications. Clustering and contribution analyses further revealed the brain regions involved in finger movement and the temporal aspect of their contribution to the classification. These results show the possibility of using the ICA-based method in combination with other methods to build a real-time system to control prostheses.
Keywords: brain-computer interface, electroencephalography, finger motion decoding, independent component analysis, pseudo real-time motion decoding
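The two-stage pipeline can be sketched with scikit-learn, substituting FastICA for the ICA stage and L1-regularized logistic regression for SLR (the paper's SLR is a Bayesian sparse method; the L1 penalty is only a stand-in). The selected IC indices and the synthetic data are placeholders.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression

# Stage 1: unmix EEG channels into independent components (ICs).
rng = np.random.default_rng(0)
eeg_train = rng.standard_normal((2000, 32))      # samples x channels (synthetic)
ica = FastICA(n_components=16, random_state=0)
ics_train = ica.fit_transform(eeg_train)         # time series of the ICs
selected = [0, 2, 5, 7]                          # indices of brain-activity ICs

# Sparse logistic regression (L1 penalty) over the selected IC features.
labels = rng.integers(0, 8, 2000)                # eight movement directions
clf = LogisticRegression(penalty="l1", solver="saga", C=0.5, max_iter=5000)
clf.fit(ics_train[:, selected], labels)

# Stage 2: reuse the SAME unmixing matrix on a new EEG dataset.
eeg_test = rng.standard_normal((500, 32))
ics_test = ica.transform(eeg_test)               # apply stored unmixing
print(clf.score(ics_test[:, selected], rng.integers(0, 8, 500)))
```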
Procedia PDF Downloads 137
20363 Change Point Analysis in Average Ozone Layer Temperature Using Exponential Lomax Distribution
Authors: Amjad Abdullah, Amjad Yahya, Bushra Aljohani, Amani Alghamdi
Abstract:
Change point detection is an important part of data analysis. The presence of a change point refers to a significant change in the behavior of a time series. In this article, we examine the detection of multiple change points in the parameters of the exponential Lomax distribution, which is broad and flexible compared with other distributions when fitting data. We used the Schwarz information criterion and binary segmentation to detect multiple change points in publicly available data on the average temperature in the ozone layer. The change points were successfully located.
Keywords: binary segmentation, change point, exponential Lomax distribution, information criterion
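A compact sketch of binary segmentation driven by the Schwarz information criterion (BIC). For brevity it scores segments with a Gaussian likelihood rather than the exponential Lomax distribution used in the article; the penalty value and minimum segment size are illustrative choices.

```python
import numpy as np

def gaussian_bic(x):
    """BIC (Schwarz information criterion) of a single-segment Gaussian fit."""
    n = len(x)
    sigma2 = max(x.var(), 1e-12)
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return -2 * loglik + 2 * np.log(n)   # two parameters: mean, variance

def binseg(x, offset=0, min_size=10, penalty=10.0):
    """Recursive binary segmentation: split where the summed BIC of the two
    sub-segments beats the single-segment BIC by more than `penalty`."""
    n = len(x)
    best_gain, best_k = 0.0, None
    whole = gaussian_bic(x)
    for k in range(min_size, n - min_size):
        gain = whole - (gaussian_bic(x[:k]) + gaussian_bic(x[k:]))
        if gain > best_gain:
            best_gain, best_k = gain, k
    if best_k is None or best_gain <= penalty:
        return []
    return (binseg(x[:best_k], offset, min_size, penalty)
            + [offset + best_k]
            + binseg(x[best_k:], offset + best_k, min_size, penalty))

# Synthetic series with mean shifts at 100 and 250.
rng = np.random.default_rng(1)
series = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 150),
                         rng.normal(1, 1, 100)])
print(binseg(series))   # change points detected near [100, 250]
```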
Procedia PDF Downloads 172
20362 Maxillofacial Trauma: A Case of Diacapitular Condylar Fracture
Authors: Krishna Prasad Regmi, Jun-Bo Tu, Cheng-Qun Hou, Li-Feng Li
Abstract:
Maxillofacial trauma in pediatric patients is particularly challenging, as these patients differ significantly from adults as far as the facial skeleton is concerned. Mandibular condylar fractures are common presentations to hospitals across the globe and remain the most important cause of temporomandibular joint (TMJ) ankylosis. The etiology and epidemiology of pediatric trauma involving diacapitular condylar fractures (DFs) have been reported in large series of patients. Nevertheless, little is known about treatment protocols for DFs in children, and the treatment modalities for the management of pediatric fractures differ accordingly. We suggest following the PDA and intracapsular ABC classifications of condylar fracture to increase the overall postoperative satisfaction level, accounting for the change in patients' subjective feelings from the preoperative to the postoperative condition. At the same time, the use of 3-D technology and surgical navigation may also increase treatment accuracy.
Keywords: maxillofacial trauma, diacapitular fracture, condylar fracture, PDA classification
Procedia PDF Downloads 268
20361 4D Monitoring of Subsurface Conditions in Concrete Infrastructure Prior to Failure Using Ground Penetrating Radar
Authors: Lee Tasker, Ali Karrech, Jeffrey Shragge, Matthew Josh
Abstract:
Monitoring for the deterioration of concrete infrastructure is an important assessment tool for an engineer, yet monitoring for deterioration within a structure can be difficult. If a failure crack, or fluid seepage through such a crack, is observed from the surface, the source location of the deterioration is often not known. Geophysical methods are used to assist engineers in assessing the subsurface conditions of materials. Techniques such as Ground Penetrating Radar (GPR) provide information on the location of buried infrastructure such as pipes and conduits, the positions of reinforcements within concrete blocks, and regions of voids/cavities behind tunnel linings. This experiment underlines the application of GPR as an infrastructure-monitoring tool to highlight and monitor regions of possible deterioration within a concrete test wall due to an increase in the generation of fractures, in particular during a period of applied load up to and including structural failure. A three-point load was applied to a concrete test wall of dimensions 1700 x 600 x 300 mm³ in increments of 10 kN until the wall structurally failed at 107.6 kN. At each increment of applied load, the load was kept constant and the wall was scanned using GPR along profile lines across the wall surface. The measured radar amplitude responses of the GPR profiles at each applied load interval were reconstructed into depth-slice grids and presented at fixed depth-slice intervals. The corresponding depth-slices were subtracted from each data set to compare the radar amplitude responses between datasets and to monitor for changes. At lower values of applied load (0-60 kN), few changes were observed in the difference of radar amplitude responses between data sets. At higher values of applied load (100 kN), closer to structural failure, larger differences in radar amplitude response were highlighted in the GPR data, up to a 300% increase at some locations between the 0 kN and 100 kN radar datasets. Distinct regions were observed in the 100 kN difference dataset (i.e., 100 kN minus 0 kN) close to the location of the final failure crack. The key regions were a conical feature located between approximately 3.0-12.0 cm depth from the surface and a vertical linear feature located at approximately 12.1-21.0 cm depth from the surface. These key regions have been interpreted as locations exhibiting an increased change in pore space due to increased mechanical loading, an increase in the volume of micro-cracks, or the development of a larger macro-crack. The experiment showed that GPR is a useful geophysical monitoring tool to assist engineers in highlighting and monitoring regions of large changes in radar amplitude response that may be associated with significant internal structural change (e.g., crack development). GPR is a non-destructive technique that is fast to deploy in a production setting. GPR can assist with reducing risk and costs in future infrastructure maintenance programs by highlighting and monitoring locations within the structure that exhibit large changes in radar amplitude over calendar time.
Keywords: 4D GPR, engineering geophysics, ground penetrating radar, infrastructure monitoring
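The core depth-slice differencing step reduces to simple grid arithmetic; the sketch below flags cells whose amplitude response grows by 300% or more between the 0 kN and 100 kN grids. Grid shapes and values are synthetic placeholders.

```python
import numpy as np

def amplitude_change(baseline, loaded, eps=1e-9):
    """Per-cell percent change in radar amplitude between two depth-slice
    grids reconstructed at the same depth intervals (e.g., 0 kN vs 100 kN)."""
    return 100.0 * (loaded - baseline) / (np.abs(baseline) + eps)

# Hypothetical 3-D grids: (depth slices x rows x cols) of amplitude response.
rng = np.random.default_rng(0)
grid_0kN = rng.random((20, 50, 30))
grid_100kN = grid_0kN.copy()
grid_100kN[3:12, 20:25, 10:15] *= 4.0    # simulated fracture zone (+300%)

diff = amplitude_change(grid_0kN, grid_100kN)
flagged = np.argwhere(diff >= 300.0)     # cells with >= 300% increase
print(flagged[:5], f"... {len(flagged)} cells flagged")
```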
Procedia PDF Downloads 178
20360 On the Impact of Oil Price Fluctuations on Stock Markets: A Multivariate Long-Memory GARCH Framework
Authors: Manel Youssef, Lotfi Belkacem
Abstract:
This paper employs multivariate long-memory GARCH models to simultaneously estimate mean and conditional variance spillover effects between oil prices and different financial markets. Since different financial assets are traded based on these market sector returns, it is important for financial market participants to understand the volatility transmission mechanism over time and across these series in order to make optimal portfolio allocation decisions. We examine weekly returns from January 1, 2003 to November 30, 2012 and find evidence of significant transmission of shocks and volatilities between oil prices and some of the examined financial markets. The findings support the idea of cross-market hedging and the sharing of common information by investors.
Keywords: oil prices, stock indices returns, oil volatility, contagion, DCC-multivariate (FI) GARCH
Procedia PDF Downloads 530
20359 A Biomimetic Approach for the Multi-Objective Optimization of Kinetic Façade Design
Authors: Do-Jin Jang, Sung-Ah Kim
Abstract:
A kinetic façade responds to user requirements and environmental conditions. In designing a kinetic façade, kinetic patterns play a key role in determining its performance. This paper proposes a biomimetic method for the multi-objective optimization of kinetic façade design. An autonomous decentralized control system is combined with a flocking algorithm. The flocking agents react autonomously to sensor values and bring about kinetic patterns that change over time. A series of experiments was conducted to verify the potential and limitations of the flocking-based decentralized control. As a result, it showed the highest performance among the comparison group in balancing multiple objectives such as solar radiation and openness.
Keywords: biomimicry, flocking algorithm, autonomous decentralized control, multi-objective optimization
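A toy version of the flocking-based decentralized control: each agent combines the classic cohesion, separation, and alignment terms with a reaction to its own sensor reading. All weights, radii, and the "solar" sensor field are invented for the sketch.

```python
import numpy as np

def flocking_step(pos, vel, sensor, r=2.0, w_coh=0.01, w_sep=0.05,
                  w_ali=0.05, w_env=0.1):
    """One update of flocking agents driving façade panels: cohesion,
    separation, and alignment toward neighbours within radius r, plus a
    nudge from each agent's local environmental sensor reading."""
    new_vel = vel.copy()
    for i in range(len(pos)):
        d = np.linalg.norm(pos - pos[i], axis=1)
        nbr = (d > 0) & (d < r)
        if nbr.any():
            new_vel[i] += w_coh * (pos[nbr].mean(axis=0) - pos[i])   # cohesion
            sep = ((pos[i] - pos[nbr]) / (d[nbr][:, None] ** 2)).sum(axis=0)
            new_vel[i] += w_sep * sep                                # separation
            new_vel[i] += w_ali * (vel[nbr].mean(axis=0) - vel[i])   # alignment
        new_vel[i] += w_env * sensor[i]                              # sensor term
    return pos + new_vel, new_vel

rng = np.random.default_rng(0)
pos = rng.random((50, 2)) * 10           # 50 agents on a 10x10 façade grid
vel = np.zeros((50, 2))
sun = rng.random((50, 2)) - 0.5          # hypothetical solar-gain gradient
for _ in range(100):                     # kinetic patterns emerge over time
    pos, vel = flocking_step(pos, vel, sun)
print(pos[:3].round(2))
```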
Procedia PDF Downloads 514
20358 Quantitative Texture Analysis of Shoulder Sonography for Rotator Cuff Lesion Classification
Authors: Chung-Ming Lo, Chung-Chien Lee
Abstract:
In many countries, the lifetime prevalence of shoulder pain is up to 70%. In the United States, the health care system spends 7 billion dollars per year on health issues related to shoulder pain. With respect to its origin, up to 70% of shoulder pain is attributed to rotator cuff lesions. This study proposed a computer-aided diagnosis (CAD) system to assist radiologists in classifying rotator cuff lesions with less operator dependence. Quantitative features were extracted from shoulder ultrasound images acquired using an ALOKA alpha-6 US scanner (Hitachi-Aloka Medical, Tokyo, Japan) with a linear array probe (scan width: 36 mm) ranging from 5 to 13 MHz. During examination, the examined patients were in the standard sitting position and followed the regular routine. After acquisition, the shoulder US images were exported from the scanner and stored as 8-bit images with pixel values ranging from 0 to 255. Based on the sonographic appearance, the boundary of each lesion was delineated by a physician to indicate the specific pattern for analysis. The three lesion categories for classification were 20 cases of tendon inflammation, 18 cases of calcific tendonitis, and 18 cases of supraspinatus tear. For each lesion, second-order statistics were quantified in the feature extraction step; these are texture features describing the correlations between adjacent pixels in a lesion. Because echogenicity patterns are expressed in grey scale, grey-scale co-occurrence matrices with four angles of adjacent pixels were used. The texture metrics included the mean and standard deviation of energy, entropy, correlation, inverse difference moment, inertia, cluster shade, cluster prominence, and Haralick correlation. The quantitative features were then combined in a multinomial logistic regression classifier to generate a prediction model of rotator cuff lesions. The multinomial logistic regression classifier is widely used in classification with more than two categories, such as the three lesion types in this study. In the classifier, backward elimination was used to select the most relevant feature subset, chosen from the trained classifier with the lowest error rate. Leave-one-out cross-validation was used to evaluate the performance of the classifier: each case in turn was left out and used to test the model trained on the remaining cases. Performance was measured as accuracy against the physician's assessment. As a result, the proposed system achieved an accuracy of 86%. A CAD system based on statistical texture features interpreting echogenicity values in shoulder musculoskeletal ultrasound was thus established to generate a prediction model for rotator cuff lesions. Clinically, it is difficult to distinguish some kinds of rotator cuff lesions, especially partial-thickness tears of the rotator cuff. Based on the available literature, shoulder orthopaedic surgeons and musculoskeletal radiologists report greater diagnostic test accuracy than general radiologists or ultrasonographers. Consequently, the proposed CAD system, which was developed according to the expertise of the shoulder orthopaedic surgeon, can provide reliable suggestions to general radiologists or ultrasonographers. More quantitative features related to the specific patterns of different lesion types will be investigated in further studies to improve the prediction.
Keywords: shoulder ultrasound, rotator cuff lesions, texture, computer-aided diagnosis
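The feature-extraction and classification steps map naturally onto scikit-image and scikit-learn, as sketched below with grey-level co-occurrence features (a subset of the properties listed above), a multinomial logistic regression, and leave-one-out cross-validation. The ROIs and labels are random stand-ins for the delineated lesions.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

def glcm_features(lesion_roi):
    """Second-order texture statistics at four adjacency angles, summarized
    as (mean, std) per property -- a subset of those used in the paper."""
    glcm = graycomatrix(lesion_roi, distances=[1],
                        angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                        levels=256, symmetric=True, normed=True)
    feats = []
    for prop in ("energy", "correlation", "homogeneity", "contrast"):
        vals = graycoprops(glcm, prop).ravel()   # one value per angle
        feats += [vals.mean(), vals.std()]
    return np.array(feats)

# Hypothetical data: 56 delineated 8-bit ROIs and their lesion labels
# (0 = inflammation, 1 = calcific tendonitis, 2 = supraspinatus tear).
rng = np.random.default_rng(0)
rois = [rng.integers(0, 256, (40, 40), dtype=np.uint8) for _ in range(56)]
y = rng.integers(0, 3, 56)
X = np.array([glcm_features(r) for r in rois])

clf = LogisticRegression(max_iter=2000)          # multinomial by default
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.2f}")
```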
Procedia PDF Downloads 284
20357 Evaluation of Opposite Type Heterologous MAT Genes Transfer in the Filamentous Fungi Neofusicoccum mediterraneum and Verticillium dahliae
Authors: Stavros Palavouzis, Alexandra Triantafyllopoulou, Aliki Tzima, Epaminondas Paplomatas
Abstract:
Mating-type genes are present in most filamentous fungi, even though teleomorphs have not been recorded for all species. Our study explores the effect of different growth conditions on the expression of MAT genes in Neofusicoccum mediterraneum. Selected isolates were grown in potato dextrose broth or on water agar supplemented with pine needles, under a 12 h photoperiod as well as in constant darkness. Mycelia and spores were collected at different time points, and RNA extraction was performed, with the extracted product used for cDNA synthesis. New primers for MAT gene expression were designed, and qPCR results are underway. The second part of the study involved the isolation and cloning, in a pGEM-T vector, of the Botryosphaeria dothidea MAT1-1-1 and MAT1-2-1 mating genes, including flanking regions. As a next step, the genes were amplified using newly designed primers with engineered restriction sites. Amplicons were excised and subsequently sub-cloned into appropriate binary vectors. The constructs were afterwards inserted into Agrobacterium tumefaciens and utilized for Agrobacterium-mediated transformation (ATMT) of Neofusicoccum mediterraneum; transformation of a Verticillium dahliae tomato race 1 strain (70V) was performed in parallel as a control. While the procedure was successful with regard to V. dahliae, transformed strains of N. mediterraneum could not be obtained. At present, a new transformation protocol, which utilizes a combination of protoplast and Agrobacterium transformation, is being evaluated.
Keywords: anamorph, heterothallism, perithecia, pycnidia, sexual stage
Procedia PDF Downloads 179
20356 Cadmium Removal from Aqueous Solution Using Chitosan Beads Prepared from Shrimp Shell Extracted Chitosan
Authors: Bendjaballah Malek; Makhlouf Mohammed Rabeh; Boukerche Imane; Benhamza Mohammed El Hocine
Abstract:
In this study, chitosan was derived from Parapenaeus longirostris shrimp shells sourced from a local market in Annaba, eastern Algeria. The extraction process entailed four chemical stages: demineralization, deproteinization, decolorization, and deacetylation; the degree of deacetylation was calculated to be 80.86%. The extracted chitosan was physically modified to synthesize chitosan beads and characterized via FTIR and XRD analysis. These beads were employed to remove cadmium ions from synthetic water. The batch adsorption process was optimized by analyzing the impact of contact time, pH, adsorbent dose, and temperature. The adsorption capacity of Cd²⁺ on the chitosan beads was found to be 6.83 mg/g and 7.94 mg/g. The adsorption kinetics of Cd²⁺ conformed to the pseudo-first-order model, while the isotherm study indicated that the Langmuir isotherm model describes the adsorption of cadmium well. A thermodynamic analysis demonstrated that the adsorption of Cd²⁺ on chitosan beads is spontaneous and exothermic.
Keywords: Cd, chitosan, chitosan beads, bioadsorbent
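The two named models can be fitted directly by nonlinear least squares; the sketch below fits the pseudo-first-order uptake curve q(t) = qe(1 − exp(−k1·t)) and the Langmuir isotherm qe = qmax·KL·Ce/(1 + KL·Ce). The data points are invented for illustration, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Pseudo-first-order kinetics: q(t) = qe * (1 - exp(-k1 * t))
def pfo(t, qe, k1):
    return qe * (1 - np.exp(-k1 * t))

# Langmuir isotherm: qe(Ce) = qmax * KL * Ce / (1 + KL * Ce)
def langmuir(ce, qmax, kl):
    return qmax * kl * ce / (1 + kl * ce)

# Illustrative (not measured) uptake-vs-time and isotherm data for Cd2+.
t = np.array([5, 10, 20, 40, 60, 90, 120.0])        # min
qt = np.array([2.1, 3.5, 5.0, 6.1, 6.5, 6.7, 6.8])  # mg/g
ce = np.array([2, 5, 10, 20, 40, 80.0])             # mg/L at equilibrium
qe = np.array([1.9, 3.6, 5.2, 6.4, 7.2, 7.7])       # mg/g

(qe_fit, k1), _ = curve_fit(pfo, t, qt, p0=(7, 0.05))
(qmax, kl), _ = curve_fit(langmuir, ce, qe, p0=(8, 0.1))
print(f"PFO: qe={qe_fit:.2f} mg/g, k1={k1:.3f} 1/min")
print(f"Langmuir: qmax={qmax:.2f} mg/g, KL={kl:.3f} L/mg")
```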
Procedia PDF Downloads 97
20355 Modeling the Demand for the Healthcare Services Using Data Analysis Techniques
Authors: Elizaveta S. Prokofyeva, Svetlana V. Maltseva, Roman D. Zaitsev
Abstract:
Rapidly evolving modern data analysis technologies play a large role in understanding the operation of a healthcare system and its characteristics. One of the key tasks in urban healthcare today is optimizing resource allocation; the application of data analysis to solve optimization problems in medical institutions thus determines the significance of this study. The purpose of this research was to establish the dependence between indicators of the effectiveness of a medical institution and its resources. Hospital discharges by diagnosis, hospital days of in-patients, and in-patient average length of stay were selected as the performance indicators and the demand of the medical facility. Hospital beds by type of care, medical technology (magnetic resonance tomography, gamma cameras, angiographic complexes, and lithotripters), and physicians characterized the resource provision of medical institutions in the developed models. The data source for the research was the open database of the statistical service Eurostat. This source was chosen because the database contains complete and open information necessary for research tasks in the field of public health, and it has a user-friendly interface that allows analytical reports to be built quickly. The study covers 28 European countries for the period from 2007 to 2016. For all countries included in the study with the most accurate and complete data for the period under review, predictive models were developed based on historical panel data. An attempt to improve the quality and interpretability of the models was made through cluster analysis of the investigated set of countries. The main idea was to assess the similarity of the joint behavior of the variables throughout the time period under consideration, in order to identify groups of similar countries and construct separate regression models for them. The original time series were therefore used as the objects of clustering, and the k-medoids algorithm was applied. The sampled objects were used as the centers of the obtained clusters, since determining a centroid when working with time series involves additional difficulties. The number of clusters was chosen using the silhouette coefficient. The cluster analysis made it possible to significantly improve the predictive power of the models: for example, in one of the clusters the MAPE was only 0.82%, which makes it possible to conclude that the forecast is highly reliable in the short term. The predicted values of the developed models have a relatively low level of error and can be used to make decisions on the resource provision of hospitals with medical personnel. The research displays strong dependencies between the demand for medical services and the modern medical equipment variable, which highlights the importance of the technological component for the successful development of a medical facility. Data analysis currently has huge potential to significantly improve health services, and medical institutions that are the first to introduce these technologies will certainly have a competitive advantage.
Keywords: data analysis, demand modeling, healthcare, medical facilities
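A plain k-medoids sketch over a correlation distance between country time series, so that cluster centers are actual sampled objects, as described above. The panel data, the distance choice, and k are illustrative assumptions.

```python
import numpy as np

def k_medoids(dist, k, n_iter=100, seed=0):
    """PAM-style k-medoids on a precomputed distance matrix, so cluster
    centers are actual time series (the 'sampled objects')."""
    rng = np.random.default_rng(seed)
    n = dist.shape[0]
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(dist[:, medoids], axis=1)
        new = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if len(members):
                within = dist[np.ix_(members, members)].sum(axis=1)
                new[c] = members[np.argmin(within)]   # most central member
        if np.array_equal(new, medoids):
            break
        medoids = new
    labels = np.argmin(dist[:, medoids], axis=1)
    return medoids, labels

# Hypothetical panel: 28 countries x 10 yearly observations of an indicator.
rng = np.random.default_rng(1)
panel = rng.standard_normal((28, 10)).cumsum(axis=1)
# Correlation distance captures similarity of joint behavior over time.
dist = 1 - np.corrcoef(panel)
medoids, labels = k_medoids(dist, k=3)
print("medoid countries:", medoids, "cluster sizes:", np.bincount(labels))
```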
Procedia PDF Downloads 144
20354 Satellite Image Classification Using Firefly Algorithm
Authors: Paramjit Kaur, Harish Kundra
Abstract:
In recent years, the swarm-intelligence-based firefly algorithm has become a focus for researchers solving real-time optimization problems. Here, the firefly algorithm is used for satellite image classification. For experimentation, the Alwar area is considered, as it contains multiple land features such as vegetation, barren land, hilly terrain, residential areas, and water surfaces. The Alwar dataset consists of seven-band satellite images. The firefly algorithm is based on the attraction of less bright fireflies towards brighter ones. For the evaluation of the proposed concept, accuracy assessment parameters are calculated using an error matrix: the Kappa coefficient, overall accuracy, and the feature-wise parameters of user's accuracy and producer's accuracy. Overall results are compared with BBO, PSO, hybrid FPAB/BBO, hybrid ACO/SOFM, and hybrid ACO/BBO on the basis of the Kappa coefficient and overall accuracy.
Keywords: image classification, firefly algorithm, satellite image classification, terrain classification
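The core firefly update, brightness-ranked attraction with attractiveness decaying as exp(−γr²) plus a shrinking random walk, can be sketched as below; the toy objective standing in for a band-space class-separability score, and all parameter values, are assumptions.

```python
import numpy as np

def firefly_optimize(objective, dim, n_fireflies=20, n_iter=100,
                     beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    """Canonical firefly algorithm: each firefly moves toward every brighter
    one with attractiveness beta0*exp(-gamma*r^2), plus a random walk.
    Brightness = objective value (to be maximized), e.g., a separability
    measure for candidate class centers in seven-band feature space."""
    rng = np.random.default_rng(seed)
    x = rng.random((n_fireflies, dim))
    light = np.array([objective(f) for f in x])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if light[j] > light[i]:          # j is brighter: i moves to j
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(dim) - 0.5)
            light[i] = objective(x[i])
        alpha *= 0.97                            # cool the random walk
    best = np.argmax(light)
    return x[best], light[best]

# Toy objective standing in for a 7-band class-separability score.
best_x, best_val = firefly_optimize(lambda v: -np.sum((v - 0.3) ** 2), dim=7)
print(best_x.round(2), round(best_val, 4))
```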
Procedia PDF Downloads 397
20353 Laser Data Based Automatic Generation of Lane-Level Road Map for Intelligent Vehicles
Authors: Zehai Yu, Hui Zhu, Linglong Lin, Huawei Liang, Biao Yu, Weixin Huang
Abstract:
With the development of intelligent vehicle systems, a high-precision road map is increasingly needed in many aspects. Automatic lane line extraction and modeling are the most essential steps for the generation of a precise lane-level road map. In this paper, an automatic lane-level road map generation system is proposed. To extract the road markings on the ground, the multi-region Otsu thresholding method is applied, which calculates the intensity value of the laser data that maximizes the variance between background and road markings. The extracted road marking points are then projected to the raster image and clustered using a two-stage clustering algorithm. Lane lines are subsequently recognized from these clusters by the shape features of their minimum bounding rectangles. To ensure the storage efficiency of the map, the lane lines are approximated by cubic polynomial curves using a Bayesian estimation approach. The proposed lane-level road map generation system has been tested under urban and expressway conditions in Hefei, China. The experimental results show that our method achieves excellent extraction and clustering performance, and the fitted lines reach a high position accuracy with an error of less than 10 cm.
Keywords: curve fitting, lane-level road map, line recognition, multi-thresholding, two-stage clustering
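Two steps of the pipeline are easy to sketch: Otsu thresholding of intensity within one region (the multi-region variant tiles the data and applies this per tile) and cubic-polynomial approximation of a clustered lane line (ordinary least squares here, whereas the paper uses Bayesian estimation). The inputs are synthetic.

```python
import numpy as np
from skimage.filters import threshold_otsu

def extract_markings(intensity_patch):
    """Otsu threshold on laser intensity: the value that maximizes the
    between-class variance of background vs. road markings."""
    t = threshold_otsu(intensity_patch)
    return t, intensity_patch > t

# Synthetic intensity patch with one bright painted marking stripe.
rng = np.random.default_rng(0)
patch = rng.random((64, 64)) * 255
patch[30:34, :] += 600
t, mask = extract_markings(patch)
print("Otsu threshold:", round(float(t), 1), "marking pixels:", int(mask.sum()))

# Cubic approximation of one clustered lane line for compact storage.
xs = np.linspace(0, 50, 200)                      # metres along the road
ys = 0.001*xs**3 - 0.05*xs**2 + 0.2*xs + rng.normal(0, 0.05, 200)
coeffs = np.polyfit(xs, ys, deg=3)                # cubic polynomial curve
print("stored lane-line coefficients:", coeffs.round(4))
```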
Procedia PDF Downloads 127
20352 Recycling of End of Life Concrete Based on C2CA Method
Authors: Somayeh Lotfi, Manuel Eggimann, Eckhard Wagner, Radosław Mróz, Jan Deja
Abstract:
One of the main environmental challenges in the construction industry is a strong social force to decrease the bulk transport of building materials in urban environments. Considering this, applying more in-situ recycling technologies for Construction and Demolition Waste (CDW) is an urgent need. The European C2CA project develops a novel concrete recycling technology that can be performed purely mechanically and in situ. The technology consists of a combination of smart demolition, gentle grinding of the crushed concrete in an autogenous mill, and a novel dry classification technology called ADR to remove the fines. The feasibility of this recycling process was examined in demonstration projects involving, in total, 20,000 tons of End of Life (EOL) concrete from two office towers in Groningen, The Netherlands. This paper concentrates on the second demonstration project of C2CA, where EOL concrete was recycled on an industrial site. After recycling, the properties of the produced Recycled Aggregate (RA) were investigated, and the results are presented. An experimental study was carried out on the mechanical and durability properties of the produced Recycled Aggregate Concrete (RAC) compared with those of Natural Aggregate Concrete (NAC). The aim was to understand the importance of the RA substitution level, the w/c ratio, and the type of cement for the properties of RAC. In this regard, two series of reference concrete with strength classes of C25/30 and C45/55 were produced using natural coarse aggregates (rounded and crushed) and natural sand. The RAC series were created by replacing parts of the natural aggregate, resulting in concretes with 0%, 20%, 50%, and 100% RA. Results show that the concrete mix design and the type of cement have a decisive effect on the properties of RAC. On the other hand, the substitution of RA, even at a high replacement level, has a minor and manageable impact on the performance of RAC. This result is a good indication of the feasibility of using RA in structural concrete by modifying the mix design and using a proper type of cement.
Keywords: C2CA, ADR, concrete recycling, recycled aggregate, durability
Procedia PDF Downloads 389
20351 Circadian Clock and Subjective Time Perception: A Simple Open Source Application for the Analysis of Induced Time Perception in Humans
Authors: Agata M. Kołodziejczyk, Mateusz Harasymczuk, Pierre-Yves Girardin, Lucie Davidová
Abstract:
Subjective time perception implies a connection to cognitive functions, attention, memory, and awareness, but little is known about its connections with the homeostatic states of the body coordinated by the circadian clock. In this paper, we present results from an experimental study of subjective time perception in volunteers performing physical activity on a treadmill in various phases of their circadian rhythms. Subjects were exposed to several time illusions simulated by programmed timing systems. This study brings a better understanding for further improvement of work quality in isolated areas.
Keywords: biological clock, light, time illusions, treadmill
Procedia PDF Downloads 333
20350 Impact of Climate on Sugarcane Yield over Belagavi District, Karnataka Using Statistical Model
Authors: Girish Chavadappanavar
Abstract:
The impact of climate on agriculture could result in problems with food security and may threaten the livelihood activities upon which much of the population depends. In the present study, a statistical yield forecast model has been developed for sugarcane production over Belagavi district, Karnataka, using weather variables of the crop growing season and past observed yield data for the period 1971 to 2010. The study shows that this type of statistical yield forecast model can efficiently forecast sugarcane yield 5 weeks and even 10 weeks in advance of the harvest, within an acceptable limit of error. The performance of the model in predicting yields at the district level for sugarcane crops is quite satisfactory for both validation (2007 and 2008) and forecasting (2009 and 2010). In addition, the climate variability of the area has been studied, with the data series tested using the Mann-Kendall rank test. The maximum and minimum temperatures were found to have significant but opposite trends (a decreasing trend in maximum and an increasing trend in minimum temperature), while the other three variables were found insignificant, with differing trends (rainfall and evening relative humidity increasing, and morning relative humidity decreasing).
Keywords: climate impact, regression analysis, yield and forecast model, sugar models
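A self-contained Mann-Kendall rank test of the kind applied to the weather series might look as follows (tie corrections omitted for brevity; the sample series is invented).

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x, alpha=0.05):
    """Mann-Kendall rank test for a monotonic trend in a climate series.
    Returns the S statistic, the two-sided p-value, and a trend verdict."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance of S, ignoring ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    trend = "no significant trend"
    if p < alpha:
        trend = "increasing" if z > 0 else "decreasing"
    return s, p, trend

# Illustrative 40-season minimum-temperature series with a mild upward drift.
rng = np.random.default_rng(0)
tmin = 18 + 0.03 * np.arange(40) + rng.normal(0, 0.4, 40)
print(mann_kendall(tmin))
```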
Procedia PDF Downloads 69
20349 An Algebraic Geometric Imaging Approach for Automatic Dairy Cow Body Condition Scoring System
Authors: Thi Thi Zin, Pyke Tin, Ikuo Kobayashi, Yoichiro Horii
Abstract:
Today, dairy farm experts and farmers recognize the importance of the dairy cow Body Condition Score (BCS), since these scores can be used to optimize milk production and manage feeding systems, serve as an indicator of health abnormalities, and even be utilized to manage healthy calving times and processes. Traditionally, BCS measurement is done by animal experts or trained technicians based on visual observations focusing on the pin bones, the pin, thurl, and hook areas, tail head shape, hook angles, and the short and long ribs. Since the traditional technique is manual and subjective, the results can vary between assessors, and the process is not cost-effective. This paper therefore proposes an algebraic geometric imaging approach for an automatic dairy cow BCS system. The proposed system consists of three functional modules. In the first module, significant landmarks or anatomical points in the cow image region are automatically extracted using image processing techniques. Specifically, there are 23 anatomical points in the regions of the ribs, hook bones, pin bone, thurl, and tail head, extracted using block-region-based vertical and horizontal histogram methods. According to animal experts, body condition scores depend mainly on the shape structure of these regions. The second module therefore investigates algebraic and geometric properties of the extracted anatomical points. Specifically, second-order polynomial regression is applied to a subset of anatomical points to produce regression coefficients, which are utilized as part of the feature vector in the scoring process. In addition, the angles at the thurl, pin, tail head, and hook bone areas are computed to extend the feature vector. Finally, in the third module, the extracted feature vectors are classified using a Markov classification process to assign a BCS to each individual cow. The assigned BCSs are then revised using a multiple regression method to produce the final BCS for each dairy cow. To confirm the validity of the proposed method, a monitoring video camera was set up at the milk rotary parlor to take top-view images of cows. The proposed method extracts the key anatomical points and the corresponding feature vectors for each individual cow, and the multiple regression calculator and Markov chain classification process are utilized to produce the estimated body condition score. Experimental results on 100 dairy cows from a self-collected dataset and a public benchmark dataset are very promising, with an accuracy of 98%.
Keywords: algebraic geometric imaging approach, body condition score, Markov classification, polynomial regression
Procedia PDF Downloads 156
20348 Robustness of the Deep Chroma Extractor and Locally-Normalized Quarter Tone Filters in Automatic Chord Estimation under Reverberant Conditions
Authors: Luis Alvarado, Victor Poblete, Isaac Gonzalez, Yetzabeth Gonzalez
Abstract:
In MIREX 2016 (http://www.music-ir.org/mirex), the deep neural network (DNN)-based Deep Chroma Extractor, proposed by Korzeniowski and Wiedmer, reached the highest score in an audio chord recognition task. In the present paper, this tool is assessed under acoustically reverberant environments and distinct source-microphone distances. The evaluation dataset comprises The Beatles and Queen datasets. These datasets are sequentially re-recorded with a single microphone in a real reverberant chamber at four reverberation times (0 s, i.e., anechoic, and approximately 1, 2, and 3 s), as well as four source-microphone distances (32, 64, 128, and 256 cm). It is expected that the performance of the trained DNN will decrease dramatically under these acoustic conditions, with signals degraded by room reverberation and distance to the source. Recently, the effect of the bio-inspired Locally-Normalized Cepstral Coefficients (LNCC) has been assessed in a text-independent speaker verification task using speech signals degraded by additive noise at different signal-to-noise ratios with variations of recording distance, and it has also been assessed under reverberant conditions with variations of recording distance. LNCC showed performance as high as that of state-of-the-art Mel Frequency Cepstral Coefficient filters. Based on these results, this paper proposes a variation of locally-normalized triangular filters called Locally-Normalized Quarter Tone (LNQT) filters. By using the LNQT spectrogram, robustness improvements of the trained Deep Chroma Extractor are expected compared with classical triangular filters, thus compensating for the music signal degradation and improving the accuracy of the chord recognition system.
Keywords: chord recognition, deep neural networks, feature extraction, music information retrieval
Procedia PDF Downloads 231
20347 Kinetic Study on Extracting Lignin from Black Liquor Using Deep Eutectic Solvents
Authors: Fatemeh Saadat Ghareh Bagh, Srimanta Ray, Jerald Lalman
Abstract:
Lignin, the largest inventory of organic carbon with a high caloric energy value, is a major component of woody and non-woody biomass. In pulping mills, a large amount of the lignin is burned for energy. At the same time, the phenolic structure of lignin enables it to be converted to value-added compounds. This study focused on extracting lignin from black liquor using deep eutectic solvents (DESs). Three choline chloride (ChCl) DESs, paired with lactic acid (LA) (1:11), oxalic acid dihydrate (OX) (1:4), and malic acid (MA) (1:3), were synthesized at 90°C and atmospheric pressure. The kinetics of lignin recovery from black liquor using DESs were investigated at three moderate temperatures (338, 353, and 368 K) at time intervals from 30 to 210 min. The extracted lignin (acid-soluble lignin plus Klason lignin) was characterized by Fourier transform infrared spectroscopy (FTIR), including comparison with a model Kraft lignin. The acid-soluble lignin (ASL) fraction was determined spectrophotometrically [TAPPI UM 250], and Klason lignin was determined gravimetrically using TAPPI T 222 om-02. The lignin extraction reaction using DESs was modeled by first-order reaction kinetics, and the activation energy of the process was determined. The lignin recovered with the ChCl:LA DES was 79.7±2.1% at 368 K and a DES:BL ratio of 4:1 (v/v); the quantity of lignin extracted with the control solvent, [emim][OAc], was 77.5±2.2%. The activation energy measured for the LA-DES system was 22.7 kJ·mol⁻¹, while the activation energies for the OX-DES and MA-DES systems were 7.16 kJ·mol⁻¹ and 8.66 kJ·mol⁻¹, with total lignin recoveries of 75.4±0.9% and 62.4±1.4%, respectively.
Keywords: black liquor, deep eutectic solvents, kinetics, lignin
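Given first-order rate constants at the three study temperatures, the activation energy follows from an Arrhenius fit of ln k against 1/T; the rate constants below are hypothetical placeholders, so the resulting Ea is not the paper's value.

```python
import numpy as np

R = 8.314  # J mol^-1 K^-1

def activation_energy(temps_K, k_values):
    """Arrhenius fit: ln k = ln A - Ea/(R*T). Linear regression of ln k
    against 1/T gives slope = -Ea/R, hence Ea in kJ/mol."""
    slope, intercept = np.polyfit(1.0 / np.asarray(temps_K),
                                  np.log(k_values), deg=1)
    return -slope * R / 1000.0, np.exp(intercept)  # (Ea in kJ/mol, A)

# Illustrative first-order rate constants at the three study temperatures
# (338, 353, 368 K); the k values below are made up for the sketch.
temps = [338.0, 353.0, 368.0]
ks = [0.011, 0.019, 0.031]   # min^-1, hypothetical
ea, pre_exp = activation_energy(temps, ks)
print(f"Ea ~ {ea:.1f} kJ/mol, A ~ {pre_exp:.3g} 1/min")
```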
Procedia PDF Downloads 144
20346 Detection of Parkinsonian Freezing of Gait
Authors: Sang-Hoon Park, Yeji Ho, Gwang-Moon Eom
Abstract:
Fast and accurate detection of freezing of gait (FOG) is desirable for the appropriate application of cueing, which has been shown to ameliorate FOG. Using the frequency spectrum of leg acceleration to derive the freeze index requires extensive computation, which would lead to delayed cueing. We hypothesized that FOG can be reasonably detected from the time-domain amplitude of foot acceleration. A time instant was recognized as FOG if the mean amplitude of the acceleration in the time window surrounding that instant was within a specific FOG range. The parameters required for FOG detection were optimized by simulated annealing. The suggested time-domain method showed performance comparable to that of frequency-domain methods.
Keywords: freezing of gait, detection, Parkinson's disease, time-domain method
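The detector described above reduces to a windowed mean-amplitude test; a sketch follows, with the window length and FOG amplitude range as placeholders for the values the authors tuned by simulated annealing.

```python
import numpy as np

def detect_fog(acc, fs, win_s=2.0, amp_lo=0.3, amp_hi=1.5):
    """Time-domain FOG detector: flag a sample when the mean amplitude of
    foot acceleration in the surrounding window falls inside the FOG range
    [amp_lo, amp_hi]. Window length and range are placeholder parameters."""
    half = int(win_s * fs / 2)
    rectified = np.abs(acc - acc.mean())
    flags = np.zeros(len(acc), dtype=bool)
    for i in range(half, len(acc) - half):
        m = rectified[i - half:i + half].mean()
        flags[i] = amp_lo <= m <= amp_hi
    return flags

# Synthetic signal: normal gait, then a low-amplitude trembling episode.
fs = 100
t = np.arange(0, 20, 1 / fs)
gait = 3.0 * np.sin(2 * np.pi * 1.0 * t)
gait[800:1200] = 0.8 * np.sin(2 * np.pi * 6.0 * t[800:1200])  # FOG-like
flags = detect_fog(gait, fs)
print(f"flagged {flags.mean() * 100:.1f}% of samples as FOG")
```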
Procedia PDF Downloads 443
20345 Multisensory Science, Technology, Engineering and Mathematics Learning: Combined Hands-on and Virtual Science for Distance Learners of Food Chemistry
Authors: Paulomi Polly Burey, Mark Lynch
Abstract:
It has been shown that laboratory activities can help cement understanding of theoretical concepts, but it is difficult to deliver such an activity to an online cohort and issues such as occupational health and safety in the students' learning environment need to be considered. Chemistry, in particular, is one of the sciences where practical experience is beneficial for learning, however typical university experiments may not be suitable for the learning environment of a distance learner. Food provides an ideal medium for demonstrating chemical concepts, and along with a few simple physical and virtual tools provided by educators, analytical chemistry can be experienced by distance learners. Food chemistry experiments were designed to be carried out in a home-based environment that 1) Had sufficient scientific rigour and skill-building to reinforce theoretical concepts; 2) Were safe for use at home by university students and 3) Had the potential to enhance student learning by linking simple hands-on laboratory activities with high-level virtual science. Two main components of the resources were developed, a home laboratory experiment component, and a virtual laboratory component. For the home laboratory component, students were provided with laboratory kits, as well as a list of supplementary inexpensive chemical items that they could purchase from hardware stores and supermarkets. The experiments used were typical proximate analyses of food, as well as experiments focused on techniques such as spectrophotometry and chromatography. Written instructions for each experiment coupled with video laboratory demonstrations were used to train students on appropriate laboratory technique. Data that students collected in their home laboratory environment was collated across the class through shared documents, so that the group could carry out statistical analysis and experience a full laboratory experience from their own home. For the virtual laboratory component, students were able to view a laboratory safety induction and advised on good characteristics of a home laboratory space prior to carrying out their experiments. Following on from this activity, students observed laboratory demonstrations of the experimental series they would carry out in their learning environment. Finally, students were embedded in a virtual laboratory environment to experience complex chemical analyses with equipment that would be too costly and sensitive to be housed in their learning environment. To investigate the impact of the intervention, students were surveyed before and after the laboratory series to evaluate engagement and satisfaction with the course. Students were also assessed on their understanding of theoretical chemical concepts before and after the laboratory series to determine the impact on their learning. At the end of the intervention, focus groups were run to determine which aspects helped and hindered learning. It was found that the physical experiments helped students to understand laboratory technique, as well as methodology interpretation, particularly if they had not been in such a laboratory environment before. The virtual learning environment aided learning as it could be utilized for longer than a typical physical laboratory class, thus allowing further time on understanding techniques.
Keywords: chemistry, food science, future pedagogy, STEM education
Procedia PDF Downloads 167
20344 Qualitative Narrative Framework as Tool for Reduction of Stigma and Prejudice
Authors: Anastasia Schnitzer, Oliver Rehren
Abstract:
Mental health has become an increasingly important topic in society in recent years, not least due to the challenges posed by the coronavirus pandemic. Along with this, the public has become increasingly aware that a lack of education and proper coping mechanisms may result in a notable risk of developing mental disorders. Yet there are still many biases against those affected, which are further connected to issues of stigmatization and societal exclusion. One of the main strategies to combat these forms of prejudice and stigma is to induce intergroup contact. More specifically, the Intergroup Contact Theory states that engaging in certain types of contact with members of marginalized groups may be an effective way to improve attitudes towards these groups. However, due to persistent prejudice and stigmatization, affected individuals often do not dare to speak openly about their mental disorders, so that intergroup contact often goes unnoticed. As a result, many people only experience conscious contact with individuals with a mental disorder through media. As an analogy to the Intergroup Contact Theory, the Parasocial Contact Hypothesis proposes that repeatedly being exposed to positive media representations of outgroup members can lead to a reduction of negative prejudices and attitudes towards this outgroup. While there is a growing body of research on the merit of this mechanism, measurements often only consist of 'positive' or 'negative' parasocial contact conditions (or examine the valence or quality of previous contact with the outgroup), while more specific conditions are often neglected. The current study aims to tackle this shortcoming. By scrutinizing the potential of contemporary series as a narrative framework of high quality, we strive to elucidate more detailed aspects of beneficial parasocial contact, for the sake of reducing prejudice and stigma towards individuals with mental disorders. Thus, a two-factorial between-subjects online panel study with three measurement points was conducted (N = 95). Participants were randomly assigned to one of two groups, watching episodes of either a series with a narrative framework of high quality (Quality TV) or low quality (Continental TV), with a one-week interval between the episodes. Prejudice and stigma towards people with mental disorders were measured at the beginning of the study, before and after each episode, and in a final follow-up one week after the last two episodes. Additionally, parasocial interaction (PSI), quality of contact (QoC), and transportation were measured several times. Based on these data, multivariate multilevel analyses were performed in R using the lavaan package. Latent growth models showed moderate to high increases in QoC and PSI as well as small to moderate decreases in stigma and prejudice over time. Multilevel path analysis with individual and group levels further revealed that a qualitative narrative framework leads to a higher-quality contact experience, which then leads to lower prejudice and stigma, with effects ranging from moderate to high.
Keywords: prejudice, quality of contact, parasocial contact, narrative framework
Procedia PDF Downloads 83
20343 Atmospheric Circulation Drivers of Nationally-Aggregated Wind Energy Production over Greece
Authors: Kostas Philippopoulos, Chris G. Tzanis, Despina Deligiorgi
Abstract:
Climate change adaptation requires the exploitation of renewable energy sources such as wind. However, climate variability can affect the regional wind energy potential and, consequently, the available wind power production. The goal of this research project is to examine the impact of atmospheric circulation on wind energy production over Greece. In the context of synoptic climatology, the proposed novel methodology employs Self-Organizing Maps for grouping and classifying the atmospheric circulation and the nationally-aggregated capacity factor time series for a 30-year period. The results indicate the critical effect of atmospheric circulation on nationally-aggregated wind energy production and therefore address the issue of the optimum distribution of wind farms for a specific region.
Keywords: wind energy, atmospheric circulation, capacity factor, self-organizing maps
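A sketch of the SOM-based classification using the third-party minisom package: daily circulation fields are mapped to SOM nodes (circulation types), and the capacity factor is profiled per type. Grid size, field dimensions, and all data are assumptions.

```python
import numpy as np
from minisom import MiniSom  # third-party package 'minisom'

# Hypothetical inputs: daily circulation fields (e.g., pressure anomalies)
# flattened to vectors (n_days x n_gridpoints), plus the matching daily
# nationally-aggregated capacity factor.
rng = np.random.default_rng(0)
fields = rng.standard_normal((10957, 200))   # ~30 years of daily fields
cf = rng.random(10957)                        # capacity factor in [0, 1]

# 4x3 SOM -> 12 circulation types.
som = MiniSom(4, 3, input_len=200, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(fields, num_iteration=5000)

# Assign each day to its best-matching node and profile wind production.
nodes = np.array([som.winner(f) for f in fields])
node_id = nodes[:, 0] * 3 + nodes[:, 1]
for nid in range(12):
    days = node_id == nid
    print(f"type {nid:2d}: {days.sum():5d} days, mean CF = {cf[days].mean():.3f}")
```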
Procedia PDF Downloads 160
20342 Study of the Physical Aging of Polyvinyl Chloride (PVC)
Authors: Mohamed Ouazene
Abstract:
The insulating properties of polymers are widely used in electrical engineering for the production of insulators and various supports, as well as for the insulation of electric cables for medium and high voltage. These polymeric materials have significant advantages, both technical and economic. However, although insulation with polymeric materials has advantages, there are also certain disadvantages, such as the influence of heat, which can have a detrimental effect on these materials. Polyvinyl chloride (PVC) is one of the polymers used, in a plasticized state, in cable insulation for medium and high voltage. The studied material is polyvinyl chloride (PVC 4000 M) from the Algerian national oil company; industrial PVC 4000 M is in the form of a white powder. The test sample is a pastille 1 mm thick and 1 cm in diameter. The consequences of increasing the temperature of a polymer are modifications, some of which are reversible and others irreversible [1]. The reversible changes do not affect the chemical composition of the polymer or its structure; they are characterized by transitions and relaxations. The glass transition temperature is an important characteristic of a polymer. Physical aging of PVC consists of maintaining the material for a longer or shorter time at its glass transition temperature. The aim of this paper is to study this phenomenon by the method of thermally stimulated depolarization currents. Relaxations within the polymer were recorded in the form of current peaks, and we found that their intensity decreases with longer residence time at the glass transition temperature. Furthermore, it is inferred from this work that the phenomenon of physical aging can have important consequences for the properties of the polymer: it leads to a more compact rearrangement of the material and a reconstruction or reinforcement of structural connections.
Keywords: depolarization currents, glass transition temperature, physical aging, polyvinyl chloride (PVC)
Procedia PDF Downloads 386
20341 A Comparative Study of Malware Detection Techniques Using Machine Learning Methods
Authors: Cristina Vatamanu, Doina Cosovan, Dragos Gavrilut, Henri Luchian
Abstract:
In the past few years, the amount of malicious software has increased exponentially; machine learning algorithms have therefore become instrumental in identifying clean and malware files through semi-automated classification. When working with very large datasets, the major challenge is to reach both a very high malware detection rate and a very low false positive rate. Another challenge is to minimize the time needed for the machine learning algorithm to do so. This paper presents a comparative study of machine learning techniques such as linear classifiers, ensembles, decision trees, and various hybrids thereof. The training dataset consists of approximately 2 million clean files and 200,000 infected files, a realistic quantitative mixture. The paper investigates the above-mentioned methods with respect to both their performance (detection rate and false positive rate) and their practicability.
Keywords: ensembles, false positives, feature selection, one side class algorithm
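A miniature version of such a comparison, reporting detection rate, false positive rate, and training time per model family on synthetic features (the real feature set and the 2.2-million-file corpus are not public):

```python
import time
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import Perceptron
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

# Synthetic stand-in for static file features; label 1 = malware, 0 = clean,
# with a clean-heavy mixture echoing the paper's dataset proportions.
rng = np.random.default_rng(0)
X = rng.standard_normal((50000, 40))
y = (X[:, :5].sum(axis=1) + rng.normal(0, 1, 50000) > 2.5).astype(int)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.2, stratify=y,
                                      random_state=0)

models = {
    "linear (perceptron)": Perceptron(max_iter=1000),
    "decision tree": DecisionTreeClassifier(max_depth=12),
    "ensemble (RF)": RandomForestClassifier(n_estimators=100, n_jobs=-1),
    "ensemble (GBT)": GradientBoostingClassifier(),
}
for name, clf in models.items():
    t0 = time.time()
    clf.fit(Xtr, ytr)
    tn, fp, fn, tp = confusion_matrix(yte, clf.predict(Xte)).ravel()
    det = tp / (tp + fn)          # detection rate on malware
    fpr = fp / (fp + tn)          # false positive rate on clean files
    print(f"{name:20s} det={det:.3f} fpr={fpr:.4f} train={time.time()-t0:.1f}s")
```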
Procedia PDF Downloads 291
20340 Synchronous Reference Frame and Instantaneous P-Q Theory Based Control of Unified Power Quality Conditioner for Power Quality Improvement of Distribution System
Authors: Ambachew Simreteab Gebremedhn
Abstract:
Context: The paper explores the use of synchronous reference frame theory (SRFT) and instantaneous reactive power theory (IRPT) based control of a Unified Power Quality Conditioner (UPQC) for improving power quality in distribution systems. Research Aim: To investigate the performance of different control configurations of the UPQC using SRFT and IRPT for mitigating power quality issues in distribution systems. Methodology: The study compares three control techniques (SRFT-IRPT, SRFT-SRFT, IRPT-IRPT) implemented in the series and shunt active filters of the UPQC. Data are collected under various control algorithms to analyze UPQC performance. Findings: Results indicate the effectiveness of SRFT and IRPT based control techniques in addressing power quality problems such as voltage sags, swells, unbalance, and voltage and current harmonics in distribution systems. Theoretical Importance: The study provides insights into the application of SRFT and IRPT to improving power quality, specifically in mitigating unbalanced voltage sags, where conventional methods fall short. Data Collection: Data are collected under various control algorithms via simulation in MATLAB Simulink, with real-time operation executed and experimental results obtained using RT-LAB. Analysis Procedures: Performance analysis of the UPQC under different control algorithms is conducted to evaluate the effectiveness of SRFT and IRPT based control techniques in mitigating power quality issues. Questions Addressed: How do SRFT and IRPT based control techniques compare in improving power quality in distribution systems? What is the impact of using different control configurations on the performance of the UPQC? Conclusion: The study demonstrates the efficacy of SRFT and IRPT based control of the UPQC in mitigating power quality issues in distribution systems, highlighting their potential for enhancing voltage and current quality.
Keywords: power quality, UPQC, shunt active filter, series active filter, non-linear load, RT-LAB, MATLAB
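At the heart of SRFT control is the Park (abc→dq0) transform; a sketch of the transform and of why it helps, assuming a balanced 50 Hz load current with an added fifth harmonic (amplitudes and sign convention are illustrative):

```python
import numpy as np

def abc_to_dq0(ia, ib, ic, theta):
    """Park transform used by SRFT control: project three-phase quantities
    onto a frame rotating at the PLL angle theta. In the synchronous frame
    the fundamental becomes DC, so a low-pass filter isolates it and the
    residual ripple gives the harmonic/compensation reference."""
    k = 2.0 / 3.0
    d = k * (ia * np.cos(theta) + ib * np.cos(theta - 2*np.pi/3)
             + ic * np.cos(theta + 2*np.pi/3))
    q = -k * (ia * np.sin(theta) + ib * np.sin(theta - 2*np.pi/3)
              + ic * np.sin(theta + 2*np.pi/3))
    z = k * 0.5 * (ia + ib + ic)
    return d, q, z

# Balanced 50 Hz load current with a 5th-harmonic component.
t = np.linspace(0, 0.1, 5000)
w = 2 * np.pi * 50
ia = np.cos(w*t) + 0.2*np.cos(5*w*t)
ib = np.cos(w*t - 2*np.pi/3) + 0.2*np.cos(5*(w*t - 2*np.pi/3))
ic = np.cos(w*t + 2*np.pi/3) + 0.2*np.cos(5*(w*t + 2*np.pi/3))
d, q, _ = abc_to_dq0(ia, ib, ic, theta=w*t)
print("mean d (fundamental -> DC in dq):", d.mean().round(3))
print("d-axis ripple std (harmonic content):", d.std().round(3))
```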
Procedia PDF Downloads 6
20339 Comparison of Machine Learning Models for the Prediction of System Marginal Price of Greek Energy Market
Authors: Ioannis P. Panapakidis, Marios N. Moschakis
Abstract:
The Greek energy market is structured as a mandatory pool where producers make their bid offers on a day-ahead basis. The System Operator solves an optimization routine aiming at the minimization of the cost of produced electricity. The solution of the optimization problem leads to the calculation of the System Marginal Price (SMP). Accurate forecasts of the SMP can lead to increased profits and more efficient portfolio management from the producer's perspective. The aim of this study is to provide a comparative analysis of various machine learning models, such as artificial neural networks and neuro-fuzzy models, for the prediction of the SMP of the Greek market. Machine learning algorithms are favored in prediction problems since they can capture and simulate the volatilities of complex time series.
Keywords: deregulated energy market, forecasting, machine learning, system marginal price
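A minimal sketch of one of the compared model families, an artificial neural network (MLP) forecasting next-hour SMP from lagged prices, with MAPE on a chronological hold-out; the synthetic price series and network sizes are assumptions, not the study's data or models.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_absolute_percentage_error

def make_lagged(series, n_lags=48):
    """Turn an hourly SMP series into (lag-vector -> next-hour price) pairs."""
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

# Synthetic hourly SMP with daily and weekly seasonality as a stand-in.
rng = np.random.default_rng(0)
hours = np.arange(24 * 365)
smp = (55 + 12 * np.sin(2 * np.pi * hours / 24)
       + 5 * np.sin(2 * np.pi * hours / (24 * 7))
       + rng.normal(0, 3, len(hours)))

X, y = make_lagged(smp)
split = int(0.8 * len(X))                 # chronological split, no shuffling
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(64, 32),
                                   max_iter=500, random_state=0))
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
print(f"MAPE: {mean_absolute_percentage_error(y[split:], pred) * 100:.2f}%")
```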
Procedia PDF Downloads 214