Search results for: task cycles

875 Design of a Virtual Reality System for Children with Developmental Coordination Disorder

Authors: Ya-Ju Ju, Li-Chen Yang, Yi-Chun Du, Rong-Ju Cherng

Abstract:

Introduction: It is estimated that 5-6% of school-aged children may be diagnosed with developmental coordination disorder (DCD). Children with DCD are characterized by motor skill difficulties that cannot be explained by medical or intellectual causes. Such motor difficulties limit children’s participation in sports activities, which in turn affects their physical fitness, cardiopulmonary function and balance, and may lead to obesity. The purpose of the project was to develop an exergaming system for children with DCD aimed at improving their physical fitness, cardiopulmonary function and balance ability. Methods: This study took five steps to build the system: system planning, task selection, task programming, system integration and usability testing. The system adopted virtual reality techniques to integrate self-developed training programs, which were developed through brainstorming among team members and a literature review. The selected training tasks were a combination of fundamental movement and motor skills. Results and Discussion: Based on the theory of motor development, we designed the training tasks to progress from easy to hard and from single tasks to dual tasks. The tasks included walking, sit-to-stand, jumping, kicking, weight shifting, side jumping and their combinations. A preliminary study showed that the tasks presented a developmental order. Further study is needed to examine the system’s effect on motor skill and cardiovascular fitness in children with DCD.

Keywords: virtual reality, virtual reality system, developmental coordination disorder, children

Procedia PDF Downloads 97
874 Accelerating Quantum Chemistry Calculations: Machine Learning for Efficient Evaluation of Electron-Repulsion Integrals

Authors: Nishant Rodrigues, Nicole Spanedda, Chilukuri K. Mohan, Arindam Chakraborty

Abstract:

A crucial objective in quantum chemistry is the computation of the energy levels of chemical systems. This task requires electron-repulsion integrals as inputs, and the steep computational cost of evaluating these integrals poses a major numerical challenge in the efficient implementation of quantum chemical software. This work presents a moment-based machine-learning approach for the efficient evaluation of electron-repulsion integrals. These integrals were approximated using linear combinations of a small number of moments, and machine learning algorithms were applied to estimate the coefficients in the linear combination. A random forest with recursive feature elimination was used to identify promising features; it performed best for learning the sign of each coefficient but not the magnitude. A neural network with two hidden layers was then used to learn the coefficient magnitudes, together with an iterative feature-masking approach for input vector compression that identifies a small subset of orbitals whose coefficients are sufficient for the quantum state energy computation. Finally, a small ensemble of neural networks (with a median rule for decision fusion) was shown to improve results compared to a single network.
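
As a rough illustration of the pipeline described in this abstract (random forest with recursive feature elimination for the coefficient signs, a two-hidden-layer network for the magnitudes, and a median-rule ensemble), the following sketch uses synthetic data; the feature dimensions, network sizes and targets are illustrative assumptions, not the authors' actual setup.

```python
# Minimal sketch: RF + RFE for coefficient signs, small NN ensemble for magnitudes.
# All data here is synthetic and only shows the shape of the pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 40))                       # stand-in for moment-based descriptors
coeffs = X[:, :5] @ rng.normal(size=5) + 0.1 * rng.normal(size=500)

# Sign of each coefficient: random forest + recursive feature elimination
sign_model = RFE(RandomForestClassifier(n_estimators=200, random_state=0),
                 n_features_to_select=10)
sign_model.fit(X, (coeffs > 0).astype(int))

# Magnitude: two-hidden-layer networks on the selected features
selected = sign_model.transform(X)
mag_models = [MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000,
                           random_state=seed).fit(selected, np.abs(coeffs))
              for seed in range(5)]

# Small ensemble with a median rule for decision fusion
pred_magnitude = np.median([m.predict(selected) for m in mag_models], axis=0)
pred_sign = np.where(sign_model.predict(X) == 1, 1.0, -1.0)
print(pred_sign[:3] * pred_magnitude[:3])
```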

Keywords: quantum energy calculations, atomic orbitals, electron-repulsion integrals, ensemble machine learning, random forests, neural networks, feature extraction

Procedia PDF Downloads 88
873 Conceptual Solution and Thermal Analysis of the Final Cooling Process of Biscuits in One Confectionary Factory in Serbia

Authors: Duško Salemović, Aleksandar Dedić, Matilda Lazić, Dragan Halas

Abstract:

The paper presents a conceptual solution for the final cooling of the chocolate dressing on biscuits in a confectionary factory in Serbia. The proposed conceptual solution was derived from the desired technological process of final cooling of biscuits and the required process parameters to be achieved, which were an integral part of the project task. The desired process parameters for achieving proper hardening and coating formation are the amount of heat exchanged per unit time between the two media (air and chocolate dressing), the air speed inside the tunnel cooler, and the surface area of all biscuits in contact with the air. These parameters were calculated in the paper. The final cooling of the chocolate dressing on biscuits could be optimized by varying the process parameters and the dimensions of the tunnel cooler and looking for appropriate values for them. Accurate temperature predictions and fluid flow analysis could be conducted using heat balance and flow balance equations, bearing in mind the theory of similarity. Furthermore, some parameters, such as the inlet temperature of the biscuits and the inlet air temperature, were adopted from preceding technological processes. A thermal calculation was carried out, and it demonstrated that the percentage error between the contact surface of the air and the chocolate biscuit topping obtained from the heat balance and that obtained geometrically from the proposed conceptual solution does not exceed 0.67%, which is very good agreement. This ensured the quality of the cooling process of the chocolate dressing applied to the biscuits and the hardness of its coating.
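
As a hedged illustration of the kind of heat-balance relation used to size such a tunnel cooler (generic convective-cooling notation with a log-mean temperature difference; these are not the paper's actual equations, symbols or values):

```latex
% Illustrative heat balance for convective cooling of the chocolate dressing
% (generic heat-transfer notation, not taken from the paper).
\dot{Q} \;=\; \dot{m}_{c}\, c_{p,c}\,\bigl(T_{c,\mathrm{in}} - T_{c,\mathrm{out}}\bigr)
       \;=\; h\, A\, \Delta T_{lm},
\qquad
\Delta T_{lm} \;=\;
  \frac{\bigl(T_{c,\mathrm{in}}-T_{a,\mathrm{out}}\bigr)-\bigl(T_{c,\mathrm{out}}-T_{a,\mathrm{in}}\bigr)}
       {\ln\dfrac{T_{c,\mathrm{in}}-T_{a,\mathrm{out}}}{T_{c,\mathrm{out}}-T_{a,\mathrm{in}}}}
```

In such a balance, the required contact area A follows from the heat duty and the heat-transfer coefficient h, and can be compared with the geometric area given by the proposed cooler dimensions, which is the kind of comparison behind the 0.67% error reported above.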

Keywords: chocolate dressing, air, cooling, heat balance

Procedia PDF Downloads 56
872 Impact of Education on Levels of Physical Activity and Depression in Taiwanese Vegetarians and Omnivores

Authors: Ya-Lin Chang, Chia Chen Chang, Yu-Ru Liang, Joyce Chen, You-Kang Chang, Tina Chiu

Abstract:

Physical activity and mental health status are important for overall health. The purpose of this study was to examine levels of physical activity and depression in Taiwanese vegetarians (VEG) and omnivores (OMNI). Sixty-three vegetarians (20 males) and 56 omnivores (23 males) with an average age of 51 years were recruited for a food frequency validation study at Taipei Tzu Chi Hospital from July to September 2016. Upon recruitment, participants filled out a validated Chinese-version International Physical Activity Questionnaire-Short Form (IPAQ), the Beck Depression Inventory-II Chinese version (BDI), a food frequency questionnaire (FFQ) and a questionnaire on demographics and medical history. Total BDI scores were calculated for depression, and the metabolic equivalent of task (MET) was calculated for physical activity levels. Mann-Whitney U tests and Chi-square tests were used to compare demographics, physical activity levels and depression scores. VEG and OMNI did not differ significantly on MET (1441.9 ± 3387.3 vs. 1605.8 ± 2486.1, p=0.2652), and VEG scored slightly lower on BDI than OMNI without statistical significance (5.6 ± 5.7 vs. 7.4 ± 6.3, p=0.06). In addition, we found that regardless of diet practice, those who held a college degree or above scored better on MET (1788.1 ± 2532.6 vs. 1215.5 ± 3425.5, p=0.0014) and BDI (5.2 ± 5.1 vs. 7.8 ± 6.7, p=0.03). In this cross-sectional study, Taiwanese vegetarians and omnivores scored comparably on physical activity levels and depression; however, education was a significant determinant of both physical activity and depression.
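
For reference, a minimal sketch of the group comparison described above (Mann-Whitney U tests on MET and BDI scores); the arrays below are synthetic placeholders with roughly the reported group sizes and means, not the study data.

```python
# Sketch of the Mann-Whitney U comparisons between vegetarians and omnivores.
# The data are synthetic placeholders, not the study's measurements.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
met_veg = rng.gamma(shape=1.2, scale=1200, size=63)    # MET-min/week, vegetarians
met_omni = rng.gamma(shape=1.3, scale=1230, size=56)   # MET-min/week, omnivores
bdi_veg = rng.poisson(5.6, size=63)                    # BDI scores, vegetarians
bdi_omni = rng.poisson(7.4, size=56)                   # BDI scores, omnivores

for label, a, b in [("MET", met_veg, met_omni), ("BDI", bdi_veg, bdi_omni)]:
    stat, p = mannwhitneyu(a, b, alternative="two-sided")
    print(f"{label}: U={stat:.1f}, p={p:.4f}")
```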

Keywords: BDI, diet, education, physical activity

Procedia PDF Downloads 369
871 Multi-Stream Graph Attention Network for Recommendation with Knowledge Graph

Authors: Zhifei Hu, Feng Xia

Abstract:

In recent years, graph neural networks have been widely used in knowledge graph recommendation. Existing recommendation methods based on graph neural networks extract information from the knowledge graph through entities and relations, which may not be an efficient way of extracting information. In order to better surface useful entity information in the knowledge graph for the current recommendation task, we propose an end-to-end neural network model based on a multi-stream graph attention mechanism (MSGAT), which can effectively integrate the knowledge graph into the recommendation system by evaluating the importance of entities from both the user and item perspectives. Specifically, we use an attention mechanism from the user's perspective to distill the neighborhood node information of the predicted item in the knowledge graph, enhance the user's information on items, and generate the feature representation of the predicted item. Since a user's historically clicked items reflect the user's interest distribution, we propose a multi-stream attention mechanism that, based on the user's preference for entities and relations and on the similarity between the items to be predicted and entities, aggregates the neighborhood entity information of the user's historically clicked items in the knowledge graph and generates the user's feature representation. We evaluate our model on three real recommendation datasets: MovieLens-1M (ML-1M), LFM-1B 2015 (LFM-1B), and Amazon-Book (AZ-book). Experimental results show that, compared with state-of-the-art models, our proposed model better captures the entity information in the knowledge graph, which demonstrates the validity and accuracy of the model.
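
To make the core operation concrete, the following is a minimal sketch of attention-weighted aggregation of knowledge-graph neighbor entities into an item representation. It shows a single "stream" only; the full multi-stream MSGAT architecture, its user-side streams and its training objective are not reproduced, and all dimensions are illustrative.

```python
# Single-stream sketch: score each neighboring entity against the item embedding,
# softmax the scores, and aggregate the neighbors into an enriched item vector.
import torch
import torch.nn as nn

class NeighborAttention(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(2 * dim, 1)

    def forward(self, item_emb, neighbor_embs):
        # item_emb: (dim,), neighbor_embs: (num_neighbors, dim)
        expanded = item_emb.expand_as(neighbor_embs)
        logits = self.score(torch.cat([expanded, neighbor_embs], dim=-1)).squeeze(-1)
        weights = torch.softmax(logits, dim=0)                 # importance of each entity
        aggregated = (weights.unsqueeze(-1) * neighbor_embs).sum(dim=0)
        return item_emb + aggregated                           # enriched item representation

dim = 16
attn = NeighborAttention(dim)
item = torch.randn(dim)
neighbors = torch.randn(8, dim)       # embeddings of 8 connected entities (placeholder)
print(attn(item, neighbors).shape)
```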

Keywords: graph attention network, knowledge graph, recommendation, information propagation

Procedia PDF Downloads 96
870 An Examination of the Impact of Sand Dunes on Soils, Vegetation and Water Resources as the Major Means of Livelihood in Gada Local Government Area of Sokoto State, Nigeria

Authors: Abubakar Aminu

Abstract:

Sand dunes, as a major product of desertification, are well known to affect soil resources, water resources and vegetation, especially in arid and semi-arid regions; this scenario disrupts the livelihood security of people in the affected areas. The research assessed the impact of sand dune accumulation on water resources, soil and vegetation in Gada local government area of Sokoto State, Nigeria. In this paper, both qualitative and quantitative methods were used to generate data, which were analyzed and discussed. The findings show that livelihoods were affected by the accumulation of sand dunes, as water resources and soils were negatively affected, thereby reducing crop yields and making livestock domestication a very difficult and expensive task; the findings also show that 60% of the respondents agreed that planting trees is the major solution to combat sand dune accumulation. The soil parameters tested indicated low organic carbon, nitrogen, potassium, calcium and phosphorus, while higher values were recorded for sodium and cation exchange capacity, which is evidence of the strongly arid nature of the soil in the area. In line with the above, the researcher recommended a massive tree planting campaign to curtail desertification, as well as the use of organic manures for higher agricultural yields and, in turn, improved livelihood security.

Keywords: soils, vegetation, water, desertification

Procedia PDF Downloads 49
869 Lane-Change Path Planning of Autonomous Driving Using Model-Based Optimization, Deep Reinforcement Learning and 5G Vehicle-to-Vehicle Communications

Authors: William Li

Abstract:

Lane-change path planning is a crucial and yet complex task in autonomous driving. The traditional path planning approach, based on a system of carefully crafted rules to cover various driving scenarios, becomes unwieldy as more and more rules are added to deal with exceptions and corner cases. This paper proposes to divide the entire path planning into two stages. In the first stage, the ego vehicle travels longitudinally in the source lane to reach a safe state. In the second stage, the ego vehicle makes a lateral lane-change maneuver to the target lane. The paper derives the safe-state conditions based on the lateral lane-change maneuver calculation to ensure collision-free operation in the second stage. To determine the acceleration sequence that minimizes the time to reach a safe state in the first stage, the paper proposes three schemes: kinetic model-based optimization, deep reinforcement learning, and 5G vehicle-to-vehicle (V2V) communications. The paper investigates these schemes via simulation. The model-based optimization is sensitive to the model assumptions. The deep reinforcement learning is more flexible in handling scenarios beyond the model assumed by the optimization. The 5G V2V scheme eliminates uncertainty in predicting the future behavior of surrounding vehicles by sharing driving intents and enabling cooperative driving.
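
As a hedged illustration of the first stage, the sketch below searches a small set of constant accelerations for the one that reaches a longitudinal safe gap fastest under a simple kinematic model. The safety rule, gap threshold and all numbers are assumptions for illustration, not the paper's derived safe-state conditions or optimization scheme.

```python
# First-stage illustration: under a constant-acceleration kinematic model, find the
# smallest time at which the gap to the vehicle ahead in the target lane exceeds a
# safe threshold. All values and the safety rule are illustrative assumptions.
import numpy as np

def time_to_safe_state(gap0, ego_v, lead_v, accel, safe_gap=30.0, horizon=15.0, dt=0.1):
    """Return the first time the gap exceeds safe_gap, or None within the horizon."""
    for t in np.arange(dt, horizon, dt):
        ego_pos = ego_v * t + 0.5 * accel * t**2
        lead_pos = gap0 + lead_v * t
        if lead_pos - ego_pos >= safe_gap:
            return t
    return None

# Search candidate accelerations for the quickest safe state.
candidates = np.arange(-3.0, 3.5, 0.5)                       # m/s^2
times = {a: time_to_safe_state(20.0, 25.0, 27.0, a) for a in candidates}
feasible = {a: t for a, t in times.items() if t is not None}
best = min(feasible, key=feasible.get)
print(f"best acceleration {best:+.1f} m/s^2 reaches a safe gap in {feasible[best]:.1f} s")
```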

Keywords: lane change, path planning, autonomous driving, deep reinforcement learning, 5G, V2V communications, connected vehicles

Procedia PDF Downloads 201
868 DNA Hypomethylating Agents Induced Histone Acetylation Changes in Leukemia

Authors: Sridhar A. Malkaram, Tamer E. Fandy

Abstract:

Purpose: 5-Azacytidine (5AC) and decitabine (DC) are DNA hypomethylating agents. We recently demonstrated that both drugs increase the enzymatic activity of the histone deacetylase enzyme SIRT6. Accordingly, we compared the genome-wide H3K9 acetylation changes induced by both drugs in leukemia cells. Description of Methods & Materials: Mononuclear cells from the bone marrow of six de-identified treatment-naive acute myeloid leukemia (AML) patients were cultured with 500 nM of either DC or 5AC for 72 h, followed by ChIP-Seq analysis using a ChIP-validated acetylated-H3K9 (H3K9ac) antibody. ChIP-Seq libraries were prepared from treated and untreated cells using the SMARTer ThruPLEX DNA-seq kit (Takara Bio, USA) according to the manufacturer’s instructions. Libraries were purified and size-selected with AMPure XP beads at a 1:1 (v/v) ratio. All libraries were pooled prior to sequencing on an Illumina HiSeq 1500. The dual-indexed single-read Rapid Run was performed with 1x120 cycles at a 5 pM final concentration of the library pool. Sequence reads with average Phred quality < 20, with length < 35 bp, PCR duplicates, and reads aligning to blacklisted regions of the genome were filtered out using Trim Galore v0.4.4 and cutadapt v1.18. Reads were aligned to the reference human genome (hg38) using Bowtie v2.3.4.1 in end-to-end alignment mode. H3K9ac-enriched (peak) regions were identified using diffReps v1.55.4, with input samples used for background correction. The statistical significance of differential peak counts was assessed with a negative binomial test using all individuals as replicates. Data & Results: The data from the six patients showed significant (Padj<0.05) acetylation changes at 925 loci after 5AC treatment versus 182 loci after DC treatment. Both drugs induced H3K9 acetylation changes at different chromosomal regions, including promoters, coding exons, introns, and distal intergenic regions. Ten common genes showed H3K9 acetylation changes with both drugs. Approximately 84% of the genes showed an H3K9 acetylation decrease with 5AC versus only 54% with DC. Figures 1 and 2 show the heatmaps for the top 100 genes and the 99 genes showing an H3K9 acetylation decrease after 5AC treatment and DC treatment, respectively. Conclusion: Despite the similarity in hypomethylating activity and chemical structure, the effects of the two drugs on H3K9 acetylation were significantly different, with more H3K9 acetylation changes observed after 5AC treatment than after DC treatment. The impact of these changes on gene expression and on the clinical efficacy of these drugs requires further investigation.

Keywords: DNA methylation, leukemia, decitabine, 5-Azacytidine, epigenetics

Procedia PDF Downloads 130
867 Application of Griddization Management to Construction Hazard Management

Authors: Lingzhi Li, Jiankun Zhang, Tiantian Gu

Abstract:

Hazard management, which can prevent fatal accidents and property losses, is a fundamental process during the construction stage of buildings. However, due to a lack of safety supervision resources and to operational pressures, the implementation of hazard management in China is poor and ineffective. In order to improve the quality of construction safety management, it is critical to explore the use of information technologies to ensure that the process of hazard management is efficient and effective. After exploring the existing problems of construction hazard management in China, this paper develops a griddization management model for construction hazard management. First, following the knowledge grid infrastructure, a griddization computing infrastructure for construction hazard management is designed, which includes five layers: a resource entity layer, an information management layer, a task management layer, a knowledge transformation layer and an application layer. This infrastructure serves as the technical support for realizing grid management. Second, this study divides construction hazards into grids at the city, district and construction-site levels according to grid principles. Last, a griddization management process including hazard identification, assessment and control is developed, in which all stakeholders of construction safety management, such as owners, contractors, supervision organizations and government departments, take the corresponding responsibilities. Finally, a case study based on actual construction hazard identification, assessment and control is used to validate the effectiveness and efficiency of the proposed griddization management model. The advantage of the designed model is that it realizes information sharing and cooperative management between the various safety management departments.

Keywords: construction hazard, griddization computing, grid management, process

Procedia PDF Downloads 256
866 Mobile Augmented Reality for Collaboration in Operation

Authors: Chong-Yang Qiao

Abstract:

Mobile augmented reality (MAR) tracks targets in the surroundings and aids operators through interactive visualization of data and procedures, making equipment and systems easier to understand. Operators remotely communicate and coordinate with each other for continuous tasks and for information and data exchange between the control room and the work site. In routine work, distributed control system (DCS) monitoring and work-site manipulation require operators to interact in real time. The critical question is how to improve the user experience in cooperative work by applying augmented reality in traditional industrial fields. The purpose of this exploratory study is to find a cognitive model for multiple-task performance with MAR. In particular, the focus is on the comparison between different tasks and environmental factors that influence information processing. Three experiments used interface and interaction designs with start-up, maintenance and stop content embedded in the mobile application. With time demands and human errors as evaluation criteria, and with analysis of the mental processes and behavioral actions during the multiple tasks, heuristic evaluation was used to assess operator performance under different situational factors and to record information processing in recognition, interpretation, judgment and reasoning. The research identifies the functional properties of MAR and constrains the development of the cognitive model. Conclusions can be drawn suggesting that MAR is easy to use and useful for operators in remote collaborative work.

Keywords: mobile augmented reality, remote collaboration, user experience, cognition model

Procedia PDF Downloads 181
865 Plant Identification Using Convolution Neural Network and Vision Transformer-Based Models

Authors: Virender Singh, Mathew Rees, Simon Hampton, Sivaram Annadurai

Abstract:

Plant identification is a challenging task that aims to identify the family, genus, and species according to plant morphological features. Automated deep learning-based computer vision algorithms are widely used for identifying plants and can help users narrow down the possibilities. However, numerous morphological similarities between and within species render correct classification difficult. In this paper, we tested custom convolution neural network (CNN) and vision transformer (ViT) based models using the PyTorch framework to classify plants. We used a large dataset of 88,000 images provided by the Royal Horticultural Society (RHS) and a smaller dataset of 16,000 images from the PlantClef 2015 dataset for classifying plants at the genus and species levels, respectively. Our results show that for classifying plants at the genus level, ViT models perform better than the CNN-based models ResNet50 and ResNet-RS-420 and other state-of-the-art CNN-based models suggested in previous studies on a similar dataset; the ViT model achieved a top accuracy of 83.3% at the genus level. For classifying plants at the species level, ViT models again perform better than ResNet50 and ResNet-RS-420, with a top accuracy of 92.5%. We show that the correct set of augmentation techniques plays an important role in classification success. In conclusion, these results could help end users, professionals and the general public alike in identifying plants more quickly and with improved accuracy.
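
For readers who want to reproduce the general setup, the following is a minimal PyTorch/torchvision fine-tuning sketch for a genus-level classifier. The dataset path, number of classes, augmentations and hyperparameters are placeholders, not the authors' RHS or PlantClef 2015 configuration.

```python
# Minimal fine-tuning sketch with torchvision: swap the classification head of a
# pretrained ResNet50 (or ViT) and train on an image-folder dataset (placeholder path).
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

num_classes = 100                               # e.g. number of genera (placeholder)
train_tf = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(0.2, 0.2, 0.2),
    transforms.ToTensor(),
])

train_ds = datasets.ImageFolder("data/plants/train", transform=train_tf)
loader = torch.utils.data.DataLoader(train_ds, batch_size=32, shuffle=True)

# Either a CNN baseline ...
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, num_classes)
# ... or a vision transformer, e.g.:
# model = models.vit_b_16(weights=models.ViT_B_16_Weights.DEFAULT)
# model.heads.head = nn.Linear(model.heads.head.in_features, num_classes)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()
model.train()
for images, labels in loader:                    # one pass shown; loop over epochs in practice
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```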

Keywords: plant identification, CNN, image processing, vision transformer, classification

Procedia PDF Downloads 77
864 Reinforcement Learning for Robust Missile Autopilot Design: TRPO Enhanced by Schedule Experience Replay

Authors: Bernardo Cortez, Florian Peter, Thomas Lausenhammer, Paulo Oliveira

Abstract:

Designing missile autopilot controllers has been a complex task, given the extensive flight envelope and the nonlinear flight dynamics. A solution that can excel both in nominal performance and in robustness to uncertainties is still to be found. While control theory often resorts to parameter-scheduling procedures, reinforcement learning has presented interesting results in ever more complex tasks, going from videogames to robotic tasks with continuous action domains; however, it still lacks clear insights on how to find adequate reward functions and exploration strategies. To the best of our knowledge, this work is a pioneer in proposing reinforcement learning as a framework for flight control. In fact, it aims at training a model-free agent that can control the longitudinal nonlinear flight dynamics of a missile, achieving the target performance and robustness to uncertainties. To that end, under TRPO’s methodology, the collected experience is augmented according to HER, stored in a replay buffer and sampled according to its significance. Not only does this work enhance the concept of prioritized experience replay into BPER, but it also reformulates HER, activating both only when the training progress converges to suboptimal policies, in what is proposed as the SER methodology. The results show that it is possible both to achieve the target performance and to improve the agent’s robustness to uncertainties (with little damage to nominal performance) by further training it in non-nominal environments, therefore validating the proposed approach and encouraging future research in this field.
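
To illustrate the "sampled according to its significance" idea in the simplest possible form, the sketch below shows a replay buffer with plain proportional prioritization. The BPER and SER specifics of the paper (when HER and prioritization are activated, how significance is defined) are not reproduced; the scoring and constants are illustrative assumptions.

```python
# Sketch of significance-weighted replay sampling: transitions are drawn with
# probability proportional to a stored priority score. Illustrative only; not the
# paper's BPER/SER logic.
import random
from collections import deque

class PrioritizedReplayBuffer:
    def __init__(self, capacity=10000, alpha=0.6, eps=1e-3):
        self.buffer = deque(maxlen=capacity)
        self.priorities = deque(maxlen=capacity)
        self.alpha, self.eps = alpha, eps

    def add(self, transition, td_error):
        self.buffer.append(transition)
        self.priorities.append((abs(td_error) + self.eps) ** self.alpha)

    def sample(self, batch_size):
        # Sample transitions with probability proportional to their priority.
        return random.choices(list(self.buffer), weights=list(self.priorities),
                              k=batch_size)

buf = PrioritizedReplayBuffer()
for i in range(100):
    buf.add(("state", "action", float(i), "next_state"), td_error=i * 0.01)
batch = buf.sample(8)
print(len(batch))
```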

Keywords: Reinforcement Learning, flight control, HER, missile autopilot, TRPO

Procedia PDF Downloads 247
863 Structural Health Assessment of a Masonry Bridge Using Wireless Sensors

Authors: Nalluri Lakshmi Ramu, C. Venkat Nihit, Narayana Kumar, Dillep

Abstract:

Masonry bridges are iconic heritage transportation infrastructure throughout the world. The continuous increase in traffic loads and speeds has kept engineers in a dilemma about their structural performance and capacity. Hence, the research community has an urgent need to propose an effective methodology and validate it on real bridges. The presented research aims to assess the structural health of an eighty-year-old masonry railway bridge in India using wireless accelerometer sensors. The bridge consists of 44 spans, each 24.2 m long, and each pier is 13 m tall and laid on a well foundation. To calculate the dynamic characteristics of the bridge, ambient vibrations were recorded from the moving traffic at various speeds, and these were compared with a three-dimensional numerical model developed in finite element-based software. Conclusions about the weaker or deteriorated piers are drawn from the comparison of the frequencies obtained from the experimental tests conducted on alternate spans. Masonry is a heterogeneous anisotropic material made up of incoherent materials (such as bricks, stones, and blocks); it is most likely the earliest widely used construction material. Masonry bridges, which were typically constructed of brick and stone, are still a key feature of the world's highway and railway networks. There are 147,523 railway bridges across India, and about 15% of them are masonry bridges that are around 80 to 100 years old. The cultural significance of masonry bridges cannot be overstated. These bridges are considered complicated structures due to the presence of arches, spandrel walls, piers, foundations, and soils. Traffic loads and vibrations, wind, rain, frost attack, high/low temperature cycles, moisture, earthquakes, river overflows, floods, scour, and soil movement under their foundations may cause material deterioration, opening of joints and ring separation in arch barrels, cracks in piers, loss of brick-stones and mortar joints, and distortion of the arch profile. A few NDT tests, such as the flat jack test, are employed to assess the homogeneity and durability of masonry structures; however, they have many drawbacks. A modern approach to the structural health assessment of masonry structures through vibration analysis, frequencies and stiffness properties is explored in this paper.
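
As a hedged sketch of the kind of processing behind "frequencies obtained from the experimental tests", the example below estimates dominant modal frequencies from an ambient-vibration acceleration record using Welch's power spectral density and simple peak picking. The synthetic signal, sampling rate and peak criteria are illustrative, not the bridge measurements or the authors' identification procedure.

```python
# Sketch: pick candidate modal frequencies from an ambient-vibration record via
# Welch's PSD. The signal here is synthetic with two artificial "modes".
import numpy as np
from scipy.signal import welch, find_peaks

fs = 200.0                                        # sampling rate (Hz), assumed
t = np.arange(0, 120, 1 / fs)
accel = (0.5 * np.sin(2 * np.pi * 2.4 * t) +      # synthetic first mode
         0.2 * np.sin(2 * np.pi * 7.1 * t) +      # synthetic second mode
         0.1 * np.random.default_rng(0).normal(size=t.size))

freqs, psd = welch(accel, fs=fs, nperseg=4096)
peaks, _ = find_peaks(psd, height=np.max(psd) * 0.05)
print("candidate modal frequencies (Hz):", np.round(freqs[peaks], 2))
```

Comparing such spectra across alternate spans is one way frequency shifts can point to weaker or deteriorated piers.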

Keywords: masonry bridges, condition assessment, wireless sensors, numerical analysis, modal frequencies

Procedia PDF Downloads 154
862 Cartilage Mimicking Coatings to Increase the Life-Span of Bearing Surfaces in Joint Prosthesis

Authors: L. Sánchez-Abella, I. Loinaz, H-J. Grande, D. Dupin

Abstract:

Aseptic loosening remains the principal cause of revision in total hip arthroplasty (THA). In long-term implantations, submicron particles are generated in vivo due to the inherent wear of the prosthesis. When this occurs, macrophages undergo phagocytosis and secrete bone-resorptive cytokines, inducing osteolysis and hence loosening of the implanted prosthesis. Therefore, new technologies are required to reduce the wear of the bearing materials and thus increase the life span of the prosthesis. Our strategy focuses on surface modification of the bearing materials with a hydrophilic coating based on cross-linked water-soluble (meth)acrylic monomers to improve their tribological behavior. These coatings are biocompatible, with high swelling capacity and antifouling properties, mimicking the properties of natural cartilage, i.e., wear resistance with a permanently hydrated layer that prevents prosthesis damage. Cartilage-mimicking coatings may also be used to protect medical device surfaces from damage and scratches that would compromise their integrity and hence their safety. However, there are only a few reports on the mechanical and tribological characteristics of this type of coating. Clear beneficial advantages of this coating have been demonstrated under different conditions and on different materials, such as ultra-high molecular weight polyethylene (UHMWPE), cross-linked polyethylene (XLPE), carbon-fiber-reinforced polyetheretherketone (CFR-PEEK), cobalt-chromium (CoCr), stainless steel, zirconia-toughened alumina (ZTA) and alumina. Using routine tribological experiments, the wear of the UHMWPE substrate was decreased by 75% against alumina, ZTA and stainless steel. For the coated CFR-PEEK substrate, the amount of material lost against ZTA and CoCr was at least 40% lower. Experiments on a hip simulator allowed coated ZTA femoral heads and coated UHMWPE cups to be validated with an 80% decrease in material loss. Further hip simulator experiments with abrasive particles (1 micron-sized alumina particles) added during 3 million cycles, out of a total of 6 million, demonstrated a decrease of around 55% in wear compared to uncoated UHMWPE and uncoated XLPE. In conclusion, CIDETEC‘s hydrogel coating technology is versatile and can be adapted to protect a large range of surfaces, even in abrasive conditions.

Keywords: cartilage, hydrogel, hydrophilic coating, joint

Procedia PDF Downloads 103
861 Using an Empathy Intervention Model to Enhance Empathy and Socially Shared Regulation in Youth with Autism Spectrum Disorder

Authors: Yu-Chi Chou

Abstract:

The purpose of this study was to establish a logical path for an instructional model of empathy and social regulation and to provide feasibility evidence on the model's implementation with students with autism spectrum disorder (ASD). The newly developed Emotional Bug-Out Bag (BoB) curriculum was designed to enhance the empathy and socially shared regulation of students with ASD. The BoB model encompasses three instructional phases: basic theory lessons (BTL), action plan practices (APP), and final theory practices (FTP). In addition, a learning flow (teacher-directed instruction, student self-directed problem-solving, group-based task completion, group-based reflection) was infused into the progression of the instructional phases to deliberately promote the social regulatory process in group-work activities. A total of 23 junior high school students with ASD received the BoB curriculum. To examine the logical path of the model implementation, data were collected from the participating students' self-reported scores on the learning nodes and understanding questions. Path analysis using structural equation modeling (SEM) was utilized to analyze the scores on 10 learning nodes and 41 understanding questions across the three phases of the BoB model. Results showed that (a) all participants progressed throughout the implementation of the BoB model, and (b) the paths among learning nodes and phases were positive and significant as expected, confirming the hypothesized logical path of this curriculum.

Keywords: autism spectrum disorder, empathy, regulation, socially shared regulation

Procedia PDF Downloads 46
860 Classifier for Liver Ultrasound Images

Authors: Soumya Sajjan

Abstract:

Liver cancer is the most common cancer worldwide in men and women and is one of the few cancers still on the rise. Liver disease is the fourth leading cause of death. According to new NHS (National Health Service) figures, deaths from liver diseases have reached record levels, rising by 25% in less than a decade; heavy drinking, obesity, and hepatitis are believed to be behind the rise. In this study, we focus on the development of a diagnostic classifier for ultrasound liver lesions. Ultrasound (US) sonography is an easy-to-use and widely popular imaging modality because of its ability to visualize many human soft tissues and organs without any harmful effects. This paper provides an overview of the underlying concepts, along with algorithms for processing liver ultrasound images. Ultrasound liver lesion images naturally contain considerable speckle noise, so developing a classifier for them is a challenging task. We take a fully automatic machine learning approach to developing this classifier. First, we segment the liver image and calculate textural features from the gray-level co-occurrence matrix and the run-length method. For classification, a Support Vector Machine (SVM) is used, based on the risk bounds of statistical learning theory. The textural features from the different feature methods are given as input to the SVM individually. Performance analysis on training and test datasets is carried out separately using the SVM model. Whenever an ultrasonic liver lesion image is given to the SVM classifier system, the features are calculated and the image is classified as a normal or diseased liver lesion. We hope the results will help physicians identify liver cancer non-invasively.
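
A minimal sketch of the co-occurrence-texture-plus-SVM pipeline described above is shown below; the image patches and labels are synthetic placeholders rather than ultrasound data, and the feature set and SVM settings are illustrative choices (newer scikit-image naming is assumed for the GLCM functions).

```python
# Sketch: grey-level co-occurrence (GLCM) texture features per patch, fed to an SVM.
# Patches and labels are synthetic placeholders, not liver ultrasound images.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def glcm_features(patch):
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, prop).ravel()
                      for prop in ("contrast", "homogeneity", "energy", "correlation")])

rng = np.random.default_rng(0)
patches = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)
labels = rng.integers(0, 2, size=40)              # 0 = normal, 1 = diseased (dummy)

X = np.array([glcm_features(p) for p in patches])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:5]))
```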

Keywords: segmentation, Support Vector Machine, ultrasound liver lesion, co-occurrence matrix

Procedia PDF Downloads 392
859 Simulation of Utility Accrual Scheduling and Recovery Algorithm in Multiprocessor Environment

Authors: A. Idawaty, O. Mohamed, A. Z. Zuriati

Abstract:

This paper presents the development of an event-based Discrete Event Simulation (DES) for a recovery algorithm known as Backward Recovery Global Preemptive Utility Accrual Scheduling (BR_GPUAS). This algorithm implements the Backward Recovery (BR) mechanism as a fault recovery solution under the existing Time/Utility Function/Utility Accrual (TUF/UA) scheduling domain for multiprocessor environments. The BR mechanism attempts to take faulty tasks back to their initial safe state and then re-executes the affected section of the faulty tasks to enable recovery. Considering that faults may occur in the components of any system, a fault-tolerance mechanism that can nullify their erroneous effects needs to be developed. Current TUF/UA scheduling algorithms use an abortion recovery mechanism and simply abort the erroneous task as their fault recovery solution. None of the existing algorithms in the TUF/UA scheduling domain for multiprocessor environments have considered transient faults or implemented the BR mechanism as a fault recovery mechanism to nullify the erroneous effects and solve the recovery problem in this domain. The developed BR_GPUAS simulator derives its set of parameters, events and performance metrics from a detailed analysis of the base model. Simulation results revealed that the BR_GPUAS algorithm can save almost 20-30% of the accumulated utility, making it reliable and efficient for real-time applications in multiprocessor scheduling environments.
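
For orientation, the following is a generic discrete-event simulation skeleton of the kind used for such scheduling studies: a time-ordered event queue processed in a loop, here with a toy time/utility function. It is an illustrative sketch only, not the BR_GPUAS simulator's actual event set, parameters or recovery logic.

```python
# Generic DES skeleton: events are kept in a heap ordered by time and handled in turn.
# The utility rule below (linear decay after the deadline) is a toy assumption.
import heapq

events = []                                   # (time, sequence, name, payload)
seq = 0

def schedule(time, name, payload=None):
    global seq
    heapq.heappush(events, (time, seq, name, payload))
    seq += 1

accrued_utility = 0.0

def handle(name, time, payload):
    global accrued_utility
    if name == "task_release":
        schedule(time + payload["exec_time"], "task_complete", payload)
    elif name == "task_complete":
        tardiness = max(0.0, time - payload["deadline"])
        accrued_utility += max(0.0, payload["utility"] - tardiness)

schedule(0.0, "task_release", {"exec_time": 2.0, "deadline": 3.0, "utility": 10.0})
schedule(1.0, "task_release", {"exec_time": 4.0, "deadline": 4.0, "utility": 8.0})

while events:
    time, _, name, payload = heapq.heappop(events)
    handle(name, time, payload)
print(f"accrued utility: {accrued_utility:.1f}")
```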

Keywords: real-time system (RTS), time utility function/ utility accrual (TUF/UA) scheduling, backward recovery mechanism, multiprocessor, discrete event simulation (DES)

Procedia PDF Downloads 289
858 Optical Assessment of Marginal Sealing Performance around Restorations Using Swept-Source Optical Coherence Tomography

Authors: Rima Zakzouk, Yasushi Shimada, Yasunori Sumi, Junji Tagami

Abstract:

Background and purpose: Resin composite has become the main material for caries restorations in recent years due to its aesthetic characteristics, especially with the development of adhesive techniques. The quality of adhesion to tooth structures depends on an exchange process between inorganic tooth material and synthetic resin and on micromechanical retention promoted by resin infiltration into partially demineralized dentin. Optical coherence tomography (OCT) is a noninvasive diagnostic method for obtaining high-resolution cross-sectional images of biological tissue at the micron scale. The aim of this study was to evaluate gap formation at the adhesive/tooth interface of a two-step self-etch adhesive applied with or without phosphoric acid pre-etching in different regions of teeth using SS-OCT. Materials and methods: Round tapered cavities (2×2 mm) were prepared in the cervical part of bovine incisors and divided into two groups (n=10): in the SE group, the self-etch adhesive (Clearfil SE Bond) was applied directly, while in the PA group the cavities were treated with acid etching before applying the self-etch adhesive. Subsequently, both groups were restored with Estelite Flow Quick flowable composite resin and observed under OCT. Following 5000 thermal cycles, the same section was imaged again for each cavity using OCT at a 1310 nm wavelength, and scanning was repeated after two months to monitor gap progression. The gap length was then measured using image analysis software, and statistical analysis between the two groups was performed using SPSS software. After that, the cavities were sectioned and observed under a confocal laser scanning microscope (CLSM) to confirm the OCT results. Results: Gaps formed at the bottom of the cavity were longer than those formed at the margin and the dento-enamel junction (DEJ) in both groups. On the other hand, the pre-etching treatment damaged the DEJ regions, creating longer gaps. After two months, the results showed significant progression of the gap length in the bottom regions of both groups. In conclusion, phosphoric acid etching did not reduce the gap length in most regions of the cavity. Significance: The bottom region of the cavity was more prone to gap formation than the margin and DEJ regions, and the DEJ was damaged by the phosphoric acid treatment.

Keywords: optical coherence tomography, self-etch adhesives, bottom, dento enamel junction

Procedia PDF Downloads 204
857 Study and Solving High Complex Non-Linear Differential Equations Applied in the Engineering Field by Analytical New Approach AGM

Authors: Mohammadreza Akbari, Sara Akbari, Davood Domiri Ganji, Pooya Solimani, Reza Khalili

Abstract:

In this paper, three complicated nonlinear differential equations (PDEs and ODEs) in the fields of engineering and vibration have been analyzed and solved completely by a new method that we have named Akbari-Ganji's Method (AGM). According to previously published papers, investigating this kind of equation is a very hard task, and the obtained solutions are not accurate and reliable; this issue emerges when comparing the achieved solutions with those of numerical methods. Based on the comparisons made between the solutions obtained by AGM and by a numerical method (fourth-order Runge-Kutta), it is possible to indicate that AGM can be successfully applied to various differential equations, particularly difficult ones. Furthermore, the advantages of this method over other approaches can be summarized as follows: the results indicate that the approach is very effective and easy, so it can be applied to other kinds of nonlinear equations, not only in vibrations but also in a wide variety of scientific fields such as fluid mechanics, solid mechanics and chemical engineering, and a solution with high precision is acquired. With regard to the aforementioned explanations, the process of solving nonlinear equations is very easy and convenient in comparison with other methods. Another important point explored in this paper is that, for trigonometric and exponential terms in the differential equation, AGM has no need to use a Taylor series expansion to enhance the precision of the result.

Keywords: new method (AGM), complex non-linear partial differential equations, damping ratio, energy lost per cycle

Procedia PDF Downloads 447
856 Judicial Review of Indonesia's Position as the First Archipelagic State to Implement the Traffic Separation Scheme to Establish Maritime Safety and Security

Authors: Rosmini Yanti, Safira Aviolita, Marsetio

Abstract:

Indonesia has several straits that are very important as shipping lanes, including the Sunda Strait and the Lombok Strait, which are part of the Indonesian Archipelagic Sea Lane (IASL). An increase in traffic in the archipelagic sea makes the task of monitoring sea routes increasingly difficult. Indonesia has proposed the establishment of a Traffic Separation Scheme (TSS) in the Sunda Strait and the Lombok Strait, and the country now has the right to conceptualize the TSS as well as the obligation to regulate it. Indonesia has the right to maintain national safety and sovereignty. In setting the TSS, Indonesia needs to issue national regulations that are in accordance with international law, and the general provisions of the IMO (International Maritime Organization) can then be used as guidelines for maritime safety and security in the Sunda Strait and the Lombok Strait. The research method used is a qualitative method based on linguistic and visual data collection, with documents and regulations analyzed as data sources. The results show that the determination of the TSS is justified by international law, in accordance with Articles 22, 41, and 53 of the 1982 United Nations Convention on the Law of the Sea (UNCLOS). The determination of the TSS by the Indonesian government would also be in accordance with Rule 10 of COLREG (the International Regulations for Preventing Collisions at Sea), which has been designed to follow the IASL. Thus, the TSS can serve as a safety and monitoring measure to minimize ship accidents or collisions, including those involving the warships and aircraft of other countries that cross the IASL.

Keywords: archipelago state, maritime law, maritime security, traffic separation scheme

Procedia PDF Downloads 109
855 Using Deep Learning for the Detection of Faulty RJ45 Connectors on a Radio Base Station

Authors: Djamel Fawzi Hadj Sadok, Marrone Silvério Melo Dantas, Pedro Henrique Dreyer, Gabriel Fonseca Reis de Souza, Daniel Bezerra, Ricardo Souza, Silvia Lins, Judith Kelner

Abstract:

A radio base station (RBS), part of the radio access network, is a particular type of equipment that supports the connection between a wide range of cellular user devices and an operator's network access infrastructure. Nowadays, most RBS maintenance is carried out manually, making it a time-consuming and costly task. A suitable candidate for RBS maintenance automation is repairing faulty links between devices caused by missing or unplugged connectors. This paper proposes and compares two deep learning solutions to identify attached RJ45 connectors on network ports: connector detection, the solution based on object detection, and connector classification, the one based on object classification. With connector detection, we obtain an accuracy of 0.934 and a mean average precision of 0.903; connector classification reaches a maximum accuracy of 0.981 and an AUC of 0.989. Although connector detection was outperformed in this study, this should not be viewed as a general result, as connector detection is more flexible in scenarios where there is no precise information about the environment and the possible devices, whereas connector classification requires that information to be well defined.

Keywords: radio base station, maintenance, classification, detection, deep learning, automation

Procedia PDF Downloads 176
854 A Framework for Secure Information Flow Analysis in Web Applications

Authors: Ralph Adaimy, Wassim El-Hajj, Ghassen Ben Brahim, Hazem Hajj, Haidar Safa

Abstract:

Huge amounts of data and personal information are sent to and retrieved from web applications on a daily basis. Every application has its own confidentiality and integrity policies. Violating these policies can have a broad negative impact on the involved company's financial status, while enforcing them is very hard even for developers with a good security background. In this paper, we propose a framework that enforces security-by-construction in web applications. Minimal developer effort is required, in the sense that the developer only needs to annotate database attributes with a security class. The web application code is then converted into an intermediary representation, called the Extended Program Dependence Graph (EPDG). Using the EPDG, the provided annotations are propagated to the application code and run against generic security enforcement rules that were carefully designed to detect insecure information flows as early as they occur. As a result, any violation of the data's confidentiality or integrity policies is reported. As a proof of concept, two PHP web applications, Hotel Reservation and Auction, were used for testing and validation. The proposed system was able to catch all the existing insecure information flows at their source. Moreover, to highlight the simplicity of the suggested approach compared with existing approaches, two professional web developers assessed the annotation tasks needed in the presented case studies and provided very positive feedback on the simplicity of the annotation task.

Keywords: web applications security, secure information flow, program dependence graph, database annotation

Procedia PDF Downloads 452
853 Localization of Geospatial Events and Hoax Prediction in the UFO Database

Authors: Harish Krishnamurthy, Anna Lafontant, Ren Yi

Abstract:

Unidentified Flying Objects (UFOs) have been an interesting topic for most enthusiasts, and hence people all over the United States report such sightings online at the National UFO Reporting Center (NUFORC). Some of these reports are hoaxes, and among those that seem legitimate, our task is not to establish that these events are indeed related to flying objects from aliens in outer space. Rather, we intend to identify whether a report is a hoax, as labeled by the UFO database team with their existing curation criteria. The database, however, provides a wealth of information that can be exploited to produce various analyses and insights, such as social reporting, identification of real-time spatial events and much more. We perform analysis to localize these time-series geospatial events and correlate them with known real-time events. This paper does not confirm any legitimacy of alien activity but rather attempts to gather information from likely legitimate reports of UFOs by studying the online reports. These events happen in geospatial clusters and are also time-based. We look at cluster density and data visualization to search the space of various cluster realizations and decide on the most probable clusters, which provide information about the proximity of such activity. A random forest classifier is also presented that is used to separate true events from hoax events, using the best available features such as region, week, time period and duration. Lastly, we show the performance of the scheme on various days and correlate it with real-time events, where one of the UFO reports strongly correlates with a missile test conducted in the United States.
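
A minimal sketch of the hoax classifier described above is given below: a random forest over the named features (region, week, time period, duration). The records, encodings and labels are fabricated placeholders purely to show the shape of the pipeline, not the NUFORC data.

```python
# Sketch: random forest hoax classifier over region, week, time period and duration.
# The dataframe below is a fabricated placeholder, not the actual report data.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

reports = pd.DataFrame({
    "region": ["west", "midwest", "south", "west", "northeast", "south"] * 50,
    "week": list(range(1, 301)),
    "time_period": ["night", "evening", "night", "day", "night", "evening"] * 50,
    "duration_min": [5, 1, 30, 2, 10, 3] * 50,
    "hoax": [0, 1, 0, 1, 0, 1] * 50,
})

X = pd.get_dummies(reports.drop(columns="hoax"), columns=["region", "time_period"])
y = reports["hoax"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```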

Keywords: time-series clustering, feature extraction, hoax prediction, geospatial events

Procedia PDF Downloads 360
852 Micro-Rest: Extremely Short Breaks in Post-Learning Interference Support Memory Retention over the Long Term

Authors: R. Marhenke, M. Martini

Abstract:

The distraction of attentional resources after learning hinders long-term memory consolidation compared to several minutes of post-encoding inactivity in the form of wakeful resting. We tested whether an 8-minute period of wakeful resting, compared to performing an adapted version of the d2 test of attention after learning, supports memory retention. Participants encoded and immediately recalled a word list, followed by either an 8-minute period of wakeful resting (eyes closed, relaxed) or an adapted version of the d2 test of attention (scanning and selecting specific characters while ignoring others). At the end of the experimental session (after 12-24 min) and again after 7 days, participants were required to complete a surprise free recall test of both word lists. Our results showed no significant difference in memory retention between the experimental conditions. However, we found that participants who completed the first lines of the d2 test in less than the given time limit of 20 seconds, and thus had short unfilled intervals before switching to the next test line, remembered more words over the 12-24 minute and the 7-day retention intervals than participants who did not complete the first lines. This interaction occurred only for the first test lines, which had the highest temporal proximity to the encoding task, and not for later test lines. Differences in retention scores between the groups (completed the first line vs. did not complete it) seem to be largely independent of general performance on the d2 test. Implications and limitations of these exploratory findings are discussed.

Keywords: long-term memory, retroactive interference, attention, forgetting

Procedia PDF Downloads 113
851 Integrating System-Level Infrastructure Resilience and Sustainability Based on Fractal: Perspectives and Review

Authors: Qiyao Han, Xianhai Meng

Abstract:

Urban infrastructures refer to the fundamental facilities and systems that serve cities. Due to global climate change and human activities in recent years, many urban areas around the world are facing enormous challenges from natural and man-made disasters, such as floods, earthquakes and terrorist attacks. For this reason, urban resilience to disasters has attracted increasing attention from researchers and practitioners. Given the complexity of infrastructure systems and the uncertainty of disasters, this paper suggests that studies of resilience could focus on urban functional sustainability (in its social, economic and environmental dimensions) supported by infrastructure systems under disturbance. It is supposed that urban infrastructure systems with high resilience should be able to reconfigure themselves without significant declines in critical functions (services), such as primary productivity, hydrological cycles, social relations and economic prosperity. Although some methods have been developed to integrate the resilience and sustainability of individual infrastructure components, more work is needed to enable system-level integration. This research presents a conceptual analysis framework for integrating resilience and sustainability based on fractal theory. It is believed that the ability of an ecological system to maintain structure and function in the face of disturbance and to reorganize following disturbance-driven change is largely dependent on its self-similar and hierarchical fractal structure, in which cross-scale resilience is produced by the replication of ecosystem processes dominating at different levels. Urban infrastructure systems are analogous to ecological systems because they are interconnected, complex and adaptive, are comprised of interconnected components, and exhibit characteristic scaling properties. Therefore, analyzing the resilience of ecological systems provides a better understanding of the dynamics and interactions of infrastructure systems. This paper discusses the fractal characteristics of ecosystem resilience, reviews literature related to system-level infrastructure resilience, identifies resilience criteria associated with sustainability dimensions, and develops a conceptual analysis framework. Exploration of the relevance of the identified criteria to fractal characteristics reveals great potential for analyzing infrastructure systems based on fractal theory. In the conceptual analysis framework, it is proposed that in order to be resilient, an urban infrastructure system needs to be capable of “maintaining” and “reorganizing” multi-scale critical functions under disasters. Finally, the paper identifies areas where further research efforts are needed.

Keywords: fractal, urban infrastructure, sustainability, system-level resilience

Procedia PDF Downloads 255
850 NANCY: Combining Adversarial Networks with Cycle-Consistency for Robust Multi-Modal Image Registration

Authors: Mirjana Ruppel, Rajendra Persad, Amit Bahl, Sanja Dogramadzi, Chris Melhuish, Lyndon Smith

Abstract:

Multimodal image registration is a profoundly complex task, which is why deep learning has been used widely to address it in recent years. However, two main challenges remain: first, the lack of ground truth data calls for an unsupervised learning approach, which leads to the second challenge of defining a feasible loss function that can compare two images of different modalities to judge their level of alignment. To avoid this issue altogether, we implement a generative adversarial network consisting of two registration networks G_AB and G_BA and two discrimination networks D_A and D_B connected by spatial transformation layers. G_AB learns to generate a deformation field that registers an image of modality B to an image of modality A. To do that, it uses the feedback of the discriminator D_B, which learns to judge the quality of alignment of the registered image B. G_BA and D_A learn a mapping from modality A to modality B. Additionally, a cycle-consistency loss is implemented. For this, both registration networks are employed twice, resulting in images Â and B̂, which were registered to B̃ and Ã, which in turn were registered to the initial image pair A, B. Thus the resulting and initial images of the same modality can be easily compared. A dataset of liver CT and MRI was used to evaluate the quality of our approach and to compare it against learning- and non-learning-based registration algorithms. Our approach leads to Dice scores of up to 0.80 ± 0.01 and is therefore comparable to, and slightly more successful than, algorithms like SimpleElastix and VoxelMorph.
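
The sketch below illustrates only the cycle-consistency idea: registering A toward B and then back toward A should recover an image comparable to the original. The "registration networks" here are trivial stand-in modules; the discriminators, spatial transformer layers, deformation fields and the full NANCY training loop are omitted, so this is not the authors' architecture.

```python
# Cycle-consistency sketch: register B->A and A->B, then register each result back
# and compare with the originals via an L1 loss. Stand-in networks only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyRegNet(nn.Module):
    """Stand-in for a registration network: takes (moving, fixed), returns a warped moving image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(16, 1, 3, padding=1))

    def forward(self, moving, fixed):
        return self.net(torch.cat([moving, fixed], dim=1))

G_AB, G_BA = TinyRegNet(), TinyRegNet()   # register B->A and A->B respectively
A = torch.rand(4, 1, 64, 64)              # modality-A batch (e.g. MRI), placeholder
B = torch.rand(4, 1, 64, 64)              # modality-B batch (e.g. CT), placeholder

B_to_A = G_AB(B, A)                       # B registered to A
A_to_B = G_BA(A, B)                       # A registered to B
B_cycle = G_BA(B_to_A, B)                 # registered back: should resemble B
A_cycle = G_AB(A_to_B, A)                 # registered back: should resemble A

cycle_loss = F.l1_loss(A_cycle, A) + F.l1_loss(B_cycle, B)
print(float(cycle_loss))
```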

Keywords: cycle consistency, deformable multimodal image registration, deep learning, GAN

Procedia PDF Downloads 109
849 The Construction of the Bridge between Mrs Dalloway and to the Lighthouse: The Combination of Codes and Metaphors in the Structuring of the Plot in the Work of Virginia Woolf

Authors: María Rosa Mucci

Abstract:

Tzvetan Todorov (1971) designs a model of narrative transformation where the plot is constituted by difference and resemblance. This binary opposition is a synthesis of a central figure within narrative discourse: metaphor. Narrative operates as a metaphor since it combines different actions through similarities within a common plot. However, it sounds paradoxical that metonymy and not metaphor should be the key figure within the narrative. It is a metonymy that keeps the movement of actions within the story through syntagmatic relations. By the same token, this articulation of verbs makes it possible for the reader to engage in a dynamic interaction with the text, responding to the plot and mediating meanings with the contradictory external world. As Roland Barthes (1957) points out, there are two codes that are irreversible within the process: the codes of actions and the codes of enigmas. Virginia Woolf constructs her plots through a process of symbolism; a scene is always enduring, not only because it stands for something else but also because it connotes it. The reader is forced to elaborate the meaning at a mythological level beyond the lines. In this research, we follow a qualitative content analysis to code language through the proairetic (actions) and hermeneutic (enigmas) codes in terms of Barthes. There are two novels in particular that engage the reader in this process of construction: Mrs Dalloway (1925) and To the Lighthouse (1927). The bridge from the first to the second brings memories of childhood, allowing for the discovery of these enigmas hidden between the lines. What survives? Who survives? It is the reader's task to unravel these codes and rethink this dialogue between plot and reader to contribute to the predominance of texts and the textuality of narratives.

Keywords: metonymy, code, metaphor, myth, textuality

Procedia PDF Downloads 31
848 A Neurofeedback Learning Model Using Time-Frequency Analysis for Volleyball Performance Enhancement

Authors: Hamed Yousefi, Farnaz Mohammadi, Niloufar Mirian, Navid Amini

Abstract:

Investigating the capacities of visual functions whose adapted mechanisms can enhance the capability of sports trainees is a promising area of research, not only from the cognitive viewpoint but also in terms of unlimited applications in sports training. In this paper, the visual evoked potential (VEP) and event-related potential (ERP) signals of amateur and trained volleyball players in a pilot study were processed. Two groups of amateur and trained subjects were asked to imagine themselves in the state of receiving a ball while they were shown a simulated volleyball field. The proposed method is based on a set of time-frequency features extracted from the VEP signals using algorithms such as the Gabor filter, the continuous wavelet transform, and a multi-stage wavelet decomposition, which can be indicative of whether a subject is amateur or trained. The linear discriminant classifier achieves an accuracy, sensitivity, and specificity of 100% when the average of the signal repetitions corresponding to the task is used. The main purpose of this study is to investigate the feasibility of a fast, robust, and reliable feature/model determination as a neurofeedback parameter to be utilized for improving volleyball players’ performance. The proposed measure has potential applications in brain-computer interface technology, where a real-time biomarker is needed.
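
As a hedged sketch of one of the described feature pipelines, the example below computes continuous-wavelet-transform energies per scale for synthetic epochs and classifies them with linear discriminant analysis. The wavelet choice, scales, sampling rate and signals are illustrative assumptions, not the study's VEP recordings or parameters.

```python
# Sketch: CWT energy per scale as time-frequency features, classified with LDA.
# Epochs are synthetic with a dummy amplitude difference between the two groups.
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
fs, n_samples = 250, 250                          # 1-second epochs at 250 Hz (assumed)

def make_epoch(trained):
    t = np.arange(n_samples) / fs
    amp = 1.5 if trained else 1.0                 # dummy group difference
    return amp * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=n_samples)

epochs = np.array([make_epoch(i % 2 == 1) for i in range(60)])
labels = np.array([i % 2 for i in range(60)])     # 0 = amateur, 1 = trained

scales = np.arange(1, 32)
def cwt_energy(epoch):
    coeffs, _ = pywt.cwt(epoch, scales, "morl")
    return (coeffs ** 2).mean(axis=1)             # mean energy per scale

X = np.array([cwt_energy(e) for e in epochs])
lda = LinearDiscriminantAnalysis().fit(X, labels)
print(f"training accuracy: {lda.score(X, labels):.2f}")
```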

Keywords: visual evoked potential, time-frequency feature extraction, short-time Fourier transform, event-related spectrum potential classification, linear discriminant analysis

Procedia PDF Downloads 119
847 NHS Tayside Plastic Surgery Induction Cheat Sheet and Video

Authors: Paul Holmes, Mike N. G.

Abstract:

Foundation-year doctors face increased stress, pressure and uncertainty when starting new rotations throughout their first years of work. This research questionnaire resulted in an induction cheat sheet and an induction video that enhanced junior doctors' understanding of how to work effectively within the plastic surgery department at NHS Tayside. The objectives were to improve the transition between cohorts of junior doctors in ward 26 at Ninewells Hospital. Before this quality improvement project, the induction pack was 74 pages long and over eight years old. With the support of consultant Mike Ng, a new up-to-date induction was created, involving the development of a questionnaire and a cheat sheet. The questionnaire covered clerking, venipuncture, ward pharmacy, theatres, admissions, specialties on the ward, the cardiac arrest trolley, clinical emergencies, discharges and escalation. This audit completed three cycles between August 2022 and August 2023. The cheat sheet is a concise two-page A4 document designed for doctors to reference easily and to understand the essentials. The document is formatted as a table containing ward layout; specialty; location; physician associate; shift patterns; ward rounds; handover location and time; hours of coverage; senior escalation; nights; daytime duties; meetings/MDTs/board meetings; important bleeps and codes; department guidelines; boarders; referrals and patient stream; pharmacy; absences; rota coordinator; annual leave; and top tips. The induction video is a 10-minute in-depth explanation of all aspects of the ward, exploring the contents of the cheat sheet in more depth; this alternative visual format familiarizes the junior doctor with all aspects of the ward. These resources were provided to all foundation year 1 and 2 doctors on ward 26 at Ninewells Hospital, NHS Tayside, Scotland. This work has since been adopted by the General Surgery Department, which extends to six further wards, and has improved the handover of the junior doctor's role between cohorts. There is potential to further expand the cheat sheet to other departments, as the concise document takes around 30 minutes for a doctor currently on that ward to complete. The time spent filling out the form provides vital information to incoming junior doctors, which has significant potential to improve patient care.

Keywords: induction, junior doctor, handover, plastic surgery

Procedia PDF Downloads 68
846 Identification of High-Rise Buildings Using Object Based Classification and Shadow Extraction Techniques

Authors: Subham Kharel, Sudha Ravindranath, A. Vidya, B. Chandrasekaran, K. Ganesha Raj, T. Shesadri

Abstract:

Digitization of urban features is a tedious and time-consuming process when done manually. In addition, Indian cities have complex habitat patterns and convoluted clustering patterns, which make it even more difficult to map features. This paper attempts to classify urban objects in satellite imagery using object-oriented classification techniques, in which various classes such as vegetation, water bodies, buildings, and shadows adjacent to the buildings were mapped semi-automatically. The building layer obtained from object-oriented classification was used along with already available building layers. The main focus, however, lay in the extraction of high-rise buildings using spatial technology, digital image processing, and modeling, which would otherwise be a very difficult task to carry out manually. Results indicated a considerable rise in the total number of buildings in the city. High-rise buildings were successfully mapped using satellite imagery and spatial technology, along with logical reasoning and mathematical considerations. The results clearly demonstrate the ability of remote sensing and GIS to solve complex problems in urban scenarios, such as studying urban sprawl and identifying more complex features in an urban area, like high-rise buildings and multi-dwelling units. The object-oriented technique proved effective and yielded an overall efficiency of 80 percent in the classification of high-rise buildings.

Keywords: object oriented classification, shadow extraction, high-rise buildings, satellite imagery, spatial technology

Procedia PDF Downloads 131