Search results for: incidental information processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13665

11325 Neurofeedback for Anorexia: RelaxNeuron, Aimed at Dissolving the Root Neuronal Cause

Authors: Kana Matsuyanagi

Abstract:

Anorexia Nervosa (AN) is a psychiatric disorder characterized by a relentless pursuit of thinness and strict restriction of food. Current therapeutic approaches for AN predominantly revolve around outpatient psychotherapies, which create significant financial barriers for the majority of affected patients, hindering their access to treatment. Nonetheless, AN exhibits one of the highest mortality and relapse rates among psychological disorders, underscoring the urgent need to provide patients with an affordable self-treatment tool, enabling those unable to access conventional medical intervention to address their condition autonomously. To this end, a neurofeedback software application, termed RelaxNeuron, was developed with the objective of providing an economical and portable means to aid individuals in self-managing AN. Electroencephalography (EEG) was chosen as the preferred modality for RelaxNeuron, as it aligns with the study's goal of supplying a cost-effective and convenient solution for addressing AN. The primary aim of the software is to ameliorate the negative emotional responses towards food stimuli and the accompanying aberrant eye-tracking patterns observed in AN patients, ultimately alleviating the profound fear of food, an elemental symptom and, conceivably, the fundamental etiology of AN. The core functionality of RelaxNeuron hinges on the acquisition and analysis of EEG signals, alongside an electrocardiogram (ECG) signal, to infer the user's emotional state while viewing dynamic food-related imagery on the screen. Moreover, the software quantifies the user's performance in accurately tracking the moving food image. These two parameters are then processed further by the subsequent algorithm, informing the delivery of either negative or positive feedback to the user.
Preliminary test results have exhibited promising outcomes, suggesting the potential advantages of employing RelaxNeuron in the treatment of AN, as evidenced by its capacity to enhance emotional regulation and attentional processing through repetitive and persistent therapeutic interventions.
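The two-parameter feedback loop described in the abstract can be sketched as follows. This is an illustrative sketch only, not the RelaxNeuron implementation: the chosen signal features (frontal alpha asymmetry as an emotion proxy, gaze-to-target distance as tracking accuracy) and all thresholds are assumptions for demonstration.

```python
import numpy as np

def emotion_index(eeg_left, eeg_right):
    """Frontal alpha asymmetry as a crude proxy for emotional state.

    Here, relatively higher right-channel alpha power is read as a calmer
    response to the food stimulus (an illustrative assumption).
    """
    power_l = np.mean(eeg_left ** 2)
    power_r = np.mean(eeg_right ** 2)
    return np.log(power_r) - np.log(power_l)

def tracking_accuracy(gaze_xy, target_xy):
    """Mean gaze-to-target distance mapped onto a 0..1 accuracy score."""
    err = np.linalg.norm(gaze_xy - target_xy, axis=1).mean()
    return 1.0 / (1.0 + err)

def feedback(eeg_left, eeg_right, gaze_xy, target_xy,
             emotion_thresh=0.0, tracking_thresh=0.5):
    """Deliver positive feedback only when the user is both calm and on target."""
    calm = emotion_index(eeg_left, eeg_right) > emotion_thresh
    on_target = tracking_accuracy(gaze_xy, target_xy) > tracking_thresh
    return "positive" if (calm and on_target) else "negative"
```

In a real session, such a decision would be recomputed over short sliding windows of the EEG/ECG streams rather than over whole recordings.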

Keywords: Anorexia Nervosa, fear conditioning, neurofeedback, BCI

Procedia PDF Downloads 46
11324 Comparison of Artificial Neural Networks and Statistical Classifiers in Olive Sorting Using Near-Infrared Spectroscopy

Authors: İsmail Kavdır, M. Burak Büyükcan, Ferhat Kurtulmuş

Abstract:

Table olives are a valuable product, especially in Mediterranean countries. They are usually consumed after a fermentation process. Defects that occur naturally, or as a result of an impact while olives are still fresh, may become more distinct after the processing period. Defective olives are not desired in either the table olive or olive oil industry, as they affect final product quality and reduce market prices considerably. It is therefore critical to sort table olives before, or even after, processing according to their quality and surface defects. However, manual sorting has many drawbacks, such as high expense, subjectivity, tediousness and inconsistency. The quality criteria for green olives were accepted as color and freedom from mechanical defects, wrinkling, surface blemishes and rotting. In this study, the aim was to classify fresh table olives using different classifiers applied to NIR spectroscopy readings and to compare those classifiers. For this purpose, green (Ayvalik variety) olives were classified based on their surface properties as defect-free, with bruise defect, or with fly defect, using FT-NIR spectroscopy and classification algorithms such as artificial neural networks, ident and cluster. A Bruker multi-purpose analyzer (MPA) FT-NIR spectrometer (Bruker Optik GmbH, Ettlingen, Germany) was used for spectral measurements. The spectrometer was equipped with InGaAs detectors (TE-InGaAs internal for reflectance and RT-InGaAs external for transmittance) and a 20-watt high-intensity tungsten-halogen NIR light source. Reflectance measurements were performed with a fiber optic probe (type IN 261) covering the wavelengths between 780 and 2500 nm, while transmittance measurements were performed between 800 and 1725 nm. Thirty-two scans were acquired for each reflectance spectrum in about 15.32 s, while 128 scans were obtained for transmittance in about 62 s. Resolution was 8 cm⁻¹ for both spectral measurement modes.
Instrument control was done using OPUS software (Bruker Optik GmbH, Ettlingen, Germany). Classification was performed using three classifiers: backpropagation neural networks, and the ident and cluster classification algorithms. For these applications, the Neural Network toolbox in Matlab and the ident and cluster modules in the OPUS software were used. Classifications considered different scenarios: two quality conditions at once (good vs. bruised, good vs. fly defect) and three quality conditions at once (good, bruised and fly defect). Two spectrometer readings were used in the classification applications: reflectance and transmittance. Using the artificial neural network algorithm, success rates in discriminating good olives from bruised olives, from olives with fly defect, and from the combined group of bruised and fly-defected olives ranged between 97 and 99%, between 61 and 94%, and between 58.67 and 92%, respectively. On the other hand, classification results obtained for discriminating good olives from bruised ones, and good olives from fly-defected olives, using the ident method ranged between 75 and 97.5% and between 32.5 and 57.5%, respectively; results for the same classification applications using the cluster method ranged between 52.5 and 97.5% and between 22.5 and 57.5%.
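The ident method is, in essence, a spectral library comparison. A minimal numpy sketch of a correlation-based identity test in that spirit is shown below, applied to synthetic spectra; the wavelength grid, peak positions, and noise level are invented for illustration and do not reproduce the OPUS implementation.

```python
import numpy as np

def ident_classify(spectrum, references):
    """Assign a spectrum to the reference class it correlates with best,
    loosely analogous to a spectral-library identity test.

    references: dict mapping class name -> mean reference spectrum.
    """
    best_class, best_r = None, -np.inf
    for name, ref in references.items():
        r = np.corrcoef(spectrum, ref)[0, 1]
        if r > best_r:
            best_class, best_r = name, r
    return best_class

# Synthetic example: three olive classes with shifted absorption peaks.
wavelengths = np.linspace(780, 2500, 200)

def peak(center, width=80.0):
    return np.exp(-((wavelengths - center) ** 2) / (2 * width ** 2))

refs = {"good": peak(1450), "bruised": peak(1550), "fly_defect": peak(1650)}
noisy_sample = refs["bruised"] + np.random.default_rng(0).normal(0, 0.02, 200)
```

A real workflow would of course build the reference spectra from averaged, preprocessed (e.g. baseline-corrected) measurements rather than analytic peaks.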

Keywords: artificial neural networks, statistical classifiers, NIR spectroscopy, reflectance, transmittance

Procedia PDF Downloads 246
11323 Code Embedding for Software Vulnerability Discovery Based on Semantic Information

Authors: Joseph Gear, Yue Xu, Ernest Foo, Praveen Gauravaran, Zahra Jadidi, Leonie Simpson

Abstract:

Deep learning methods have seen increasing application to the long-standing security research goal of automatic vulnerability detection in source code. Attention, however, must still be paid to the task of producing vector representations of source code (code embeddings) as input for these deep learning models. Graphical representations of code, most predominantly Abstract Syntax Trees and Code Property Graphs, have received some use in this task of late; however, for very large graphs representing very large code snippets, learning becomes prohibitively computationally expensive. This expense may be reduced by intelligently pruning the input to only vulnerability-relevant information; however, little research in this area has been performed. Additionally, most existing work comprehends code based solely on the structure of the graph, at the expense of the information contained in the graph's nodes. This paper proposes Semantic-enhanced Code Embedding for Vulnerability Discovery (SCEVD), a deep learning model which uses semantic-based feature selection for its vulnerability classification model. It uses information from the nodes as well as the structure of the code graph in order to select the features which are most indicative of the presence or absence of vulnerabilities. The model is implemented and experimentally tested using the SARD Juliet vulnerability test suite to determine its efficacy. It improves on existing code graph feature selection methods, as demonstrated by its improved ability to discover vulnerabilities.
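The idea of pruning a code graph to vulnerability-relevant information can be illustrated with a toy sketch. This is not the SCEVD algorithm; the token list and the neighbourhood-retention rule are assumptions chosen only to show the principle of combining node semantics with graph structure.

```python
# Hypothetical list of node labels treated as vulnerability-relevant.
RELEVANT_TOKENS = {"strcpy", "malloc", "free", "memcpy", "gets", "sprintf"}

def prune_graph(nodes, edges, relevant=RELEVANT_TOKENS):
    """Prune a code graph to semantically relevant nodes plus their
    direct neighbours, so local structure around candidate
    vulnerabilities survives.

    nodes: {node_id: code_token}; edges: iterable of (src, dst) pairs.
    """
    seeds = {n for n, tok in nodes.items() if tok in relevant}
    keep = set(seeds)
    for s, d in edges:
        if s in seeds or d in seeds:
            keep.update((s, d))
    kept_edges = [(s, d) for s, d in edges if s in keep and d in keep]
    return {n: nodes[n] for n in keep}, kept_edges
```

On a large Code Property Graph, this kind of semantic pruning shrinks the input fed to the embedding model while retaining the subgraphs most likely to carry vulnerability signal.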

Keywords: code representation, deep learning, source code semantics, vulnerability discovery

Procedia PDF Downloads 159
11322 An Approach to Maximize the Influence Spread in Social Networks

Authors: Gaye Ibrahima, Mendy Gervais, Seck Diaraf, Ouya Samuel

Abstract:

In this paper, we consider influence maximization in social networks, giving particular importance to the initial diffusers, called seeds. The goal is to efficiently find a subset of k elements in the social network that will begin and maximize the information diffusion process. A new approach is proposed that preprocesses the social network before determining the seeds. This preprocessing eliminates information feedback toward an element considered as a seed by extracting an acyclic spanning social network. First, we propose two versions (v1 and v2) of an algorithm called the SCG-algorithm (Spanning Connected Graph algorithm). This algorithm takes as input a connected social network, directed or not. Finally, a generalization of the SCG-algorithm is proposed, called the SG-algorithm (Spanning Graph algorithm), which takes any graph as input. Both algorithms are effective, and each has polynomial complexity. To show the pertinence of our approach, two seed sets are determined, and the one given by our approach yields better results. The performance of this approach is clearly demonstrated through simulations carried out with the R software and the igraph package.
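The feedback-elimination step can be illustrated with a BFS spanning tree, which is one simple way to obtain an acyclic spanning subgraph rooted at a candidate seed. The SCG-algorithm itself is not specified in the abstract, so this is only an analogous sketch, not the authors' method.

```python
from collections import deque

def spanning_dag(adj, root):
    """Extract a BFS spanning tree (hence acyclic) from a social graph,
    dropping edges that would feed information back toward members
    already reached from the root.

    adj: {node: [neighbours]}; returns the list of retained edges.
    """
    parent = {root: None}
    order = []
    queue = deque([root])
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in adj.get(u, []):
            if v not in parent:  # skip edges creating feedback/cycles
                parent[v] = u
                queue.append(v)
    return [(parent[v], v) for v in order if parent[v] is not None]
```

Running diffusion simulations on such a spanning subgraph avoids counting influence that merely cycles back to the seed.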

Keywords: acyclic spanning graph, centrality measures, information feedback, influence maximization, social network

Procedia PDF Downloads 249
11321 Ethnobotanical Survey of Medicinal Plants from Bechar Region, South-West of Algeria

Authors: Naima Fatehi

Abstract:

The paper reports on 107 medicinal plants traditionally used in the South-West of Algeria (Bechar region). The information was documented by interviewing traditional herbalists and various elderly men and women, following different ethnobotanical methods. The ethnobotanical data were arranged alphabetically by botanical name, followed by family name, vernacular name, and part used. The paper presents significant ethnobotanical information on medicinal plants used extensively in the Bechar region for treating various diseases and provides baseline data for future pharmacological and phytochemical studies.

Keywords: medicinal plants, ethnobotanical survey, South-West Algeria, Bechar region

Procedia PDF Downloads 521
11320 Enhancing the Flotation of Fine and Ultrafine Pyrite Particles Using Electrolytically Generated Bubbles

Authors: Bogale Tadesse, Krutik Parikh, Ndagha Mkandawire, Boris Albijanic, Nimal Subasinghe

Abstract:

It is well established that the floatability and selectivity of mineral particles are highly dependent on particle size. Generally, a particle size of 10 microns is considered the critical size below which both flotation selectivity and recovery decline sharply. It is widely accepted that the majority of ultrafine particles, including highly liberated valuable minerals, will be lost in tailings during a conventional flotation process. This is highly undesirable, particularly in the processing of finely disseminated complex and refractory ores, where fine grinding is required in order to liberate the valuable minerals. In addition, the continuing decline in ore grade worldwide necessitates intensive processing of low-grade mineral deposits. Recent advances in comminution allow the economic grinding of particles down to 10 micron sizes to enhance the probability of liberating locked minerals from low-grade ores. Thus, it is timely that the flotation of fine and ultrafine particles be improved in order to reduce the amount of valuable minerals lost as slimes. It is believed that the use of fine bubbles in flotation increases the bubble-particle collision efficiency and hence the flotation performance. In electroflotation, bubbles are generated by the electrolytic breakdown of water into oxygen and hydrogen gases, leading to the formation of extremely finely dispersed gas bubbles with dimensions varying from 5 to 95 microns. The bubbles generated by this method are significantly smaller than those found in conventional flotation (> 600 microns). In this study, microbubbles generated by electrolysis of water were injected into a bench-top flotation cell to assess the performance of electroflotation in enhancing the flotation of fine and ultrafine pyrite particles of sizes ranging from 5 to 53 microns.
The design of the cell and the results from the optimization of process variables such as current density, pH, percent solids and particle size will be presented at this conference.

Keywords: electroflotation, fine bubbles, pyrite, ultrafine particles

Procedia PDF Downloads 336
11319 Multi-Sensor Image Fusion for Visible and Infrared Thermal Images

Authors: Amit Kumar Happy

Abstract:

This paper is motivated by the importance of multi-sensor image fusion, with a specific focus on infrared (IR) and visual image (VI) fusion for various applications, including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. These images can come from different modalities, such as a visible camera and an IR thermal imager. While visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (infrared) radiation that may be reflected or self-emitted. A digital color camera captures the visible source image, and a thermal infrared camera acquires the thermal source image. In this paper, image fusion algorithms based upon a multi-scale transform (MST) and a region-based selection rule with consistency verification are proposed and presented. This research includes the implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of levels for the MST and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are used to assess the suggested method's validity. Experiments show that the proposed approach is capable of producing good fusion results. While the high computational cost and complex processing steps of many image fusion algorithms provide accurate fused results, they also make those algorithms hard to deploy in systems and applications that require real-time operation, high flexibility, and low computational capability. The methods presented in this paper therefore offer good results with minimal time complexity.
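A minimal single-level version of an MST-style fusion rule can be sketched in Python: average the low-pass (base) bands of the two sources and, for the detail band, select the coefficient with the larger magnitude per pixel. The box filter, window size, and selection rule here are illustrative assumptions, not the exact MATLAB implementation used in the paper.

```python
import numpy as np

def box_blur(img, k=5):
    """Simple separable box filter used as the low-pass stage."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    kernel = np.ones(k) / k
    tmp = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, "valid"), 1, padded)
    return np.apply_along_axis(
        lambda c: np.convolve(c, kernel, "valid"), 0, tmp)

def fuse(vis, ir, k=5):
    """Single-level multi-scale fusion of two registered grayscale images:
    average the base bands, pick the larger-magnitude detail coefficient."""
    base_v, base_i = box_blur(vis, k), box_blur(ir, k)
    det_v, det_i = vis - base_v, ir - base_i
    base = 0.5 * (base_v + base_i)
    detail = np.where(np.abs(det_v) >= np.abs(det_i), det_v, det_i)
    return base + detail
```

Practical MST fusion replaces the box filter with a multi-level pyramid or wavelet decomposition and adds the region-based consistency verification discussed in the abstract.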

Keywords: image fusion, IR thermal imager, multi-sensor, multi-scale transform

Procedia PDF Downloads 115
11318 Audit on Compliance with Ottawa Ankle Rules in Ankle Radiograph Requests

Authors: Daud Muhammad

Abstract:

Introduction: Ankle radiographs are frequently requested in Emergency Departments (ED) for patients presenting with traumatic ankle pain. The Ottawa Ankle Rules (OAR) serve as a clinical guideline to determine the necessity of these radiographs, aiming to reduce unnecessary imaging. This audit was conducted to evaluate the adequacy of clinical information provided in radiograph requests in relation to the OAR. Methods: A retrospective analysis was performed on 50 consecutive ankle radiograph requests under ED clinicians' names for patients aged above 5 years, specifically excluding follow-up radiographs for known fractures. The study assessed whether the provided clinical information met the criteria outlined by the OAR. Results: The audit revealed that none of the 50 radiograph requests contained sufficient information to satisfy the Ottawa Ankle Rules. Furthermore, 10 out of the 50 radiographs (20%) identified fractures. Discussion: The findings indicate a significant lack of adherence to the OAR, suggesting potential overuse of radiography and unnecessary patient exposure to radiation. This non-compliance may also contribute to increased healthcare costs and resource utilization, as well as possible delays in diagnosis and treatment. Recommendations: To address these issues, the following recommendations are proposed: (1) Education and Training: Enhance awareness and training among ED clinicians regarding the OAR. (2) Standardised Request Forms: Implement changes to imaging request forms to mandate relevant information according to the OAR. (3) Scan Vetting: Promote awareness among radiographers to discuss the appropriateness of scan requests with clinicians. (4) Regular re-audits should be conducted to monitor improvements in compliance.
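For reference, the ankle component of the OAR can be encoded as a simple checklist function. This simplified sketch omits the foot/midfoot component and the exact anatomical sites (posterior edge or tip of each malleolus), so it is illustrative rather than clinically authoritative.

```python
def ankle_xray_indicated(malleolar_pain,
                         tender_lateral_malleolus,
                         tender_medial_malleolus,
                         can_walk_four_steps):
    """Simplified encoding of the ankle component of the Ottawa Ankle
    Rules: a radiograph is indicated only with malleolar-zone pain plus
    either malleolar bone tenderness or inability to bear weight for
    four steps (both immediately and in the ED)."""
    if not malleolar_pain:
        return False
    return (tender_lateral_malleolus
            or tender_medial_malleolus
            or not can_walk_four_steps)
```

A request form that mandates these four fields would directly address the audit's second recommendation.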

Keywords: Ottawa ankle rules, ankle radiographs, emergency department, traumatic pain

Procedia PDF Downloads 46
11317 A Case Report on Cognitive-Communication Intervention in Traumatic Brain Injury

Authors: Nikitha Francis, Anjana Hoode, Vinitha George, Jayashree S. Bhat

Abstract:

The interaction between cognition and language, referred to as cognitive-communication, is very intricate, involving several mental processes such as perception, memory, attention, lexical retrieval, decision making, motor planning, self-monitoring and knowledge. Cognitive-communication disorders are difficulties in communicative competencies that result from underlying cognitive impairments of attention, memory, organization, information processing, problem solving, and executive functions. Traumatic brain injury (TBI) is an acquired, non-progressive condition resulting in distinct deficits of cognitive-communication abilities such as naming, word-finding, self-monitoring, auditory recognition, attention, perception and memory. Cognitive-communication intervention in TBI is individualized in order to enhance the person's ability to process and interpret information for better functioning in family and community life. The present case report illustrates the cognitive-communicative behaviors and the intervention outcomes of an adult with TBI, who was brought to the Department of Audiology and Speech Language Pathology with cognitive and communicative disturbances consequent to a road traffic accident. On detailed assessment, she showed naming deficits along with perseverations and had severe difficulty in recalling the details of the accident, her house address, places she had visited earlier, names of people known to her, as well as the activities she did each day, leading to severe breakdowns in her communicative abilities. She had difficulty in initiating, maintaining and following a conversation. She also lacked orientation to time and place. On administration of the Manipal Manual of Cognitive Linguistic Abilities (MMCLA), she exhibited poor performance on tasks related to visual and auditory perception, short-term memory, working memory and executive functions.
She attended 20 sessions of cognitive-communication intervention, which followed a domain-general, adaptive training paradigm with tasks relevant to everyday cognitive-communication skills. Compensatory strategies, such as maintaining a diary with reminders of her daily routine, names of people, date, time and place, were also recommended. The MMCLA was re-administered, and her performance on the tasks showed significant improvement. The occurrence of perseverations and word-retrieval difficulties reduced. She developed interest in initiating her day-to-day activities at home independently, as well as in involving herself in conversations with her family members. Though she lacked awareness of her deficits, she actively involved herself in all the therapy activities. Rehabilitation of patients with moderate to severe head injury can be carried out effectively through holistic cognitive retraining with a focus on different cognitive-linguistic domains. The selection of goals and activities should be relevant to the functional needs of each individual with TBI, as highlighted in the present case report.

Keywords: cognitive-communication, executive functions, memory, traumatic brain injury

Procedia PDF Downloads 347
11316 Some Generalized Multivariate Estimators for Population Mean under Multi Phase Stratified Systematic Sampling

Authors: Muqaddas Javed, Muhammad Hanif

Abstract:

Generalized multivariate ratio and regression type estimators for the population mean are suggested under multi-phase stratified systematic sampling (MPSSS) using multiple auxiliary variables. The estimators are developed under two different situations of availability of auxiliary information. Expressions for the bias and mean square error (MSE) are derived. Special cases of the suggested estimators are also discussed, and a simulation study is conducted to observe the performance of the estimators.
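The classical single-phase ratio estimator underlying such generalizations replaces the plain sample mean of y with ȳ·(X̄/x̄), exploiting a known auxiliary mean X̄. A small simulation sketch of this base case is shown below; the population model and its parameters are invented for illustration and are much simpler than the multi-phase stratified setting of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
N, n = 10_000, 200

# Finite population: auxiliary variable x with known mean, correlated y.
x = rng.gamma(4.0, 2.0, N)           # auxiliary variable
y = 3.0 * x + rng.normal(0, 2.0, N)  # study variable, strongly related to x
X_bar = x.mean()                      # population mean of x, assumed known

# Simple random sample without replacement.
idx = rng.choice(N, n, replace=False)
y_bar, x_bar = y[idx].mean(), x[idx].mean()

# Classical ratio estimator of the population mean of y.
ratio_est = y_bar * (X_bar / x_bar)
```

Because y is nearly proportional to x, the ratio estimator corrects the sample mean for how unrepresentative the sample happened to be in x, which is the same mechanism the multivariate, multi-phase estimators exploit with several auxiliary variables.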

Keywords: generalized estimators, multi-phase sampling, stratified random sampling, systematic sampling

Procedia PDF Downloads 730
11315 Characteristics of Himalayan Glaciers with Lakes, Kosi Sub-Basin, Ganga Basin: Based on Remote Sensing and GIS Techniques

Authors: Ram Moorat Singh, Arun Kumar Sharma, Ravi Chaurey

Abstract:

An assessment of the characteristics of Himalayan glaciers with and without glacier lakes was carried out for 1,937 glaciers of the Kosi sub-basin, Ganga basin, using remote sensing and GIS techniques. Analysis of IRS-P6 AWiFS data from the 2004-07 period, the SRTM DEM and MODIS Land Surface Temperature (LST) data (15-year mean) using image processing and GIS tools provided significant information on various glacier parameters. Glacier area, length, width, ice-exposed area, debris-covered area, glacier slope, orientation, elevation and temperature data were analysed. The 119 supra-glacial lakes and 62 moraine-dammed/peri-glacial lakes (area > 0.02 km²) in the study area were examined to discern the glacier conditions favourable for glacier lake formation. The analysis shows that glacial lakes preferentially form in association with glaciers of large dimensions (area, length and width), glaciers with a higher percentage of ice-exposed area, a lower percentage of debris-covered area and, in general, a mean elevation greater than 5300 m amsl. Analysis of lake type shows that moraine-dammed lakes form in association with glaciers located at relatively higher altitudes than glaciers with supra-glacial lakes. Analysis of the frequency of occurrence of lakes vis-a-vis glacier orientation shows that more glacier lakes form in association with glaciers oriented toward the south, south-east, south-west, east and west. Supra-glacial lakes form in association with glaciers having a higher mean temperature than those with moraine-dammed lakes, as verified using LST data for 15 years (2000-2014).
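The kind of grouped comparison used to discern lake-favourable glacier conditions can be sketched as follows. The records here are invented toy values, not data from the study; they simply mirror the reported pattern that lakes associate with larger, higher glaciers.

```python
import numpy as np

# Hypothetical glacier records: (area_km2, mean_elevation_m, has_lake)
glaciers = np.array([
    (5.2, 5400, 1), (0.8, 5100, 0), (7.9, 5450, 1),
    (1.1, 5050, 0), (3.4, 5350, 1), (0.6, 4980, 0),
])

with_lake = glaciers[glaciers[:, 2] == 1]
without_lake = glaciers[glaciers[:, 2] == 0]

# Compare group means of each parameter between the two glacier classes.
mean_area_lake = with_lake[:, 0].mean()
mean_elev_lake = with_lake[:, 1].mean()
```

With the full attribute table, the same grouped-mean comparison extends directly to length, width, ice-exposed fraction, debris cover and LST.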

Keywords: remote sensing, supra glacial lake, Himalaya, Kosi sub-basin, glaciers, moraine-dammed lake

Procedia PDF Downloads 380
11314 Use of Alternative and Complementary Therapies in Patients with Chronic Pain in a Medical Institution in Medellin, Colombia, 2014

Authors: Lina María Martínez Sánchez, Juliana Molina Valencia, Esteban Vallejo Agudelo, Daniel Gallego González, María Isabel Pérez Palacio, Juan Ricardo Gaviria García, María De Los Ángeles Rodríguez Gázquez, Gloria Inés Martínez Domínguez

Abstract:

Alternative and complementary therapies constitute a vast and complex combination of interventions, philosophies, approaches, and therapies that take a holistic healthcare point of view, becoming an alternative for the treatment of patients with chronic pain. Objective: to determine the characteristics of the use of alternative and complementary therapies in patients with chronic pain who consulted at a medical institution. Methodology: cross-sectional, descriptive study of the patients who attended the outpatient consultation and met the eligibility criteria. No sampling was conducted. A form was used to collect demographic and clinical variables, and the validated Holistic Complementary and Alternative Medicine Questionnaire (HCAMQ) was applied. The analysis and processing of the information were carried out using the SPSS program, version 19. Results: 220 people with chronic pain were included. The average age was 54.7±16.2 years, 78.2% were women, and 75.5% belonged to socioeconomic strata 1 to 3. Musculoskeletal pain (77.7%), migraine (15%) and neuralgia (9.1%) were the most frequent types of chronic pain. 33.6% of participants had used some kind of alternative and complementary therapy; the most frequent were homeopathy (14.5%), phytotherapy (12.7%), and acupuncture (11.4%). The total average HCAMQ score for the study group was 30.2±7.0 points, which shows a moderate attitude toward the use of complementary and alternative medicine. The highest scores according to the type of pain were: neuralgia (32.4±5.8), musculoskeletal pain (30.5±6.7), fibromyalgia (29.6±7.3) and migraine (28.5±8.8). The reliability of the HCAMQ was acceptable (Cronbach's α: 0.6). Conclusion: the types of chronic pain and the clinical and therapeutic management of the patients correspond to the data available in the current literature.
Despite the moderate attitude toward the use of these alternative and complementary therapies, one of every three patients uses them.

Keywords: chronic pain, complementary therapies, homeopathy, acupuncture analgesia

Procedia PDF Downloads 515
11313 Green Computing: Awareness and Practice in a University Information Technology Department

Authors: Samson Temitope Obafemi

Abstract:

The pervasiveness of ICTs in today's society paradoxically also calls for green computing. Green computing generally encompasses the study and practice of using Information and Communication Technology (ICT) resources effectively and efficiently without negatively affecting the environment. Since the emergence of this concern, manufacturers and governmental programmes and bodies, such as Energy Star and the government of the United States of America, have invested many resources in ensuring green design, manufacture, and disposal of ICTs. However, the level of adherence to the green use of ICTs among users has been less accounted for, especially in developing, ICT-consuming nations. This paper therefore focuses on examining the awareness and practice of green computing among academics and students of the Information Technology Department of Durban University of Technology, Durban, South Africa, in the context of the green use of ICTs. This was achieved through a survey questionnaire with four sections: (a) demography of respondents, (b) awareness of green computing, (c) practices of green computing, and (d) attitude towards greener computing. One hundred and fifty (150) questionnaires were distributed; one hundred and twenty-five (125) were completed and collected for data analysis. Of the one hundred and twenty-five (125) respondents, twenty-five percent (25%) were academics while the remaining seventy-five percent (75%) were students. The results showed a higher level of awareness of green computing among academics than among students. Green computing practices were also shown to be highly adhered to among academics only. Interestingly, however, the students were found to be more enthusiastic about greener computing in the future.
The study therefore suggests that awareness of green computing should be further strengthened among students from the curriculum point of view, in order to improve the green use of ICTs in universities, especially in developing countries.

Keywords: awareness, green computing, green use, information technology

Procedia PDF Downloads 195
11312 Critical Success Factors for Implementation of E-Supply Chain Management

Authors: Mehrnoosh Askarizadeh

Abstract:

Globalization of the economy, e-business, and the introduction of new technologies pose new challenges to all organizations. In recent decades, globalization, outsourcing, and information technology have enabled many organizations to successfully operate collaborative supply networks in which each specialized business partner focuses on only a few key strategic activities. For these industries, the supply network can be acknowledged as a new form of organization. We study the critical success factors (CSFs) for the implementation of SCM in companies. It is shown that, in different circumstances, e-supply chain management has a higher impact on performance.

Keywords: supply chain management, logistics management, critical success factors, information technology, top management support, human resource

Procedia PDF Downloads 409
11311 School Curriculum Incorporating Rights to Live in Clean and Healthy Environment: Assessing Its Effectiveness

Authors: Sitaram Dahal

Abstract:

Among the many strategic and practical needs for overcoming the threats and challenges facing the global environment, a constitutional provision for the right to live in a clean and healthy environment is one, and so is a school curriculum incorporating information on such rights. The Government of Nepal has introduced information on the right to live in a clean and healthy environment, as provisioned in its interim constitution of 2007, into the secondary-level curriculum of formal education. As the predetermined specific objective of this curriculum is to prepare students who are conscious of citizens' rights and responsibilities and are able to adopt the functions, duties and rights of rights holders and duty bearers, the study was designed to assess the effectiveness of the curriculum. The study was conducted in one private school and one community school. It shows that the curriculum has been able to make students responsible duty bearers, as they were aware of their habits towards the environment, whereas only very few students were sufficiently aware of being rights holders. Students of the community school were aware rights holders, in that they complain if they are not satisfied with the environment of the school itself; the private school is far behind in this respect. It can be said that a curriculum containing only a small portion of information on such rights might not be capable of meeting its objective.

Keywords: curriculum, environmental rights, constitution, effectiveness

Procedia PDF Downloads 327
11310 3D Microscopy, Image Processing, and Analysis of Lymphangiogenesis in Biological Models

Authors: Thomas Louis, Irina Primac, Florent Morfoisse, Tania Durre, Silvia Blacher, Agnes Noel

Abstract:

In vitro and in vivo lymphangiogenesis assays are essential for the identification of potential lymphangiogenic agents and the screening of pharmacological inhibitors. In the present study, we analyse three biological models: in vitro lymphatic endothelial cell spheroids, the in vivo ear sponge assay, and in vivo lymph node colonisation by tumour cells. These assays provide suitable 3D models to test pro- and anti-lymphangiogenic factors or drugs. 3D images were acquired by confocal laser scanning and light sheet fluorescence microscopy. Virtual scan microscopy followed by 3D reconstruction using image-aligning methods was also used to obtain 3D images of whole large sponge and ganglion samples. 3D reconstruction, image segmentation, skeletonisation, and other image processing algorithms are described. Fixed and time-lapse imaging techniques are used to analyse the behaviour of lymphatic endothelial cell spheroids. The study of cell spatial distribution in spheroid models makes it possible to detect interactions between cells and to identify invasion hierarchy and guidance patterns. Global measurements such as the volume, length, and density of lymphatic vessels are taken in both in vivo models. Branching density and tortuosity evaluation are also proposed to determine structure complexity. These properties, combined with vessel spatial distribution, are evaluated in order to determine the extent of lymphangiogenesis. Lymphatic endothelial cell invasion and lymphangiogenesis were evaluated under various experimental conditions. The comparison of these conditions makes it possible to identify lymphangiogenic agents and to better comprehend their roles in the lymphangiogenesis process. The proposed methodology is validated by its application to the three presented models.
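Skeleton-based measures such as branching density can be approximated with simple neighbourhood counts once a binary skeleton is available. This numpy sketch assumes the skeleton mask has already been computed by a dedicated skeletonisation routine; it naively counts skeleton pixels with three or more skeleton neighbours, which can flag a small cluster of pixels around a single true junction, so it is a rough proxy rather than a production measure.

```python
import numpy as np

def branch_points(skel):
    """Count candidate branch points in a 2-D binary skeleton:
    skeleton pixels with >= 3 skeleton neighbours (8-connectivity)."""
    s = np.pad(skel.astype(int), 1)  # zero border; wrap-around hits zeros
    nbrs = sum(np.roll(np.roll(s, dy, 0), dx, 1)
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)
               if (dy, dx) != (0, 0))
    return int(np.sum((s == 1) & (nbrs >= 3)))

def vessel_density(mask):
    """Fraction of the image occupied by vessel pixels/voxels."""
    return float(mask.mean())
```

Dividing the branch-point count by skeleton length (total skeleton pixels times pixel size) gives a branching-density figure comparable across sponge or node samples.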

Keywords: 3D image segmentation, 3D image skeletonisation, cell invasion, confocal microscopy, ear sponges, light sheet microscopy, lymph nodes, lymphangiogenesis, spheroids

Procedia PDF Downloads 379
11309 Effectiveness of ATMS (Advanced Transport Management Systems) in Asuncion, Paraguay

Authors: Sung Ho Oh

Abstract:

The advanced traffic lights, the system of traffic information collection and provision, the CCTVs for traffic control, and the traffic information center were installed in Asuncion, the capital of Paraguay. A pre-post comparison of the installation revealed significant changes: even though traffic volumes increased, travel speeds were higher, so travel time from origin to destination decreased. The estimated savings in travel time, fuel cost, and environmental cost amount to about 47 million US dollars per year. Satisfaction survey results for the installation are presented together with a statistical significance analysis.

Keywords: advanced transport management systems, effectiveness, Paraguay, traffic lights

Procedia PDF Downloads 354
11308 Lean Production to Increase Reproducibility and Work Safety in the Laser Beam Melting Process Chain

Authors: C. Bay, A. Mahr, H. Groneberg, F. Döpper

Abstract:

Additive Manufacturing processes are becoming increasingly established in the industry for the economic production of complex prototypes and functional components. Laser beam melting (LBM), the most frequently used Additive Manufacturing technology for metal parts, has been gaining in industrial importance for several years. The LBM process chain – from material storage to machine set-up and component post-processing – requires many manual operations. These steps often depend on the manufactured component and are therefore not standardized; instead of following a standardized procedure, they rely on the experience of the machine operator, e.g., when levelling the build plate and adjusting the first powder layer in the LBM machine. This lack of standardization limits the reproducibility of the component quality. When processing metal powders with inhalable and alveolar particle fractions, the machine operator is at high risk due to the high reactivity and the toxic (e.g., carcinogenic) effect of the various metal powders. Faulty execution of an operation or unintentional omission of safety-relevant steps can impair the health of the machine operator. In this paper, all the steps of the LBM process chain are first analysed in terms of their influence on the two aforementioned challenges: reproducibility and work safety. Standardization to avoid errors increases the reproducibility of component quality as well as the adherence to and correct execution of safety-relevant operations. The corresponding lean method 5S is therefore applied in order to develop approaches, in the form of recommended actions, that standardize the work processes. These approaches are then evaluated in terms of ease of implementation and their potential for improving reproducibility and work safety. The analysis and evaluation showed that sorting tools and spare parts as well as standardizing the workflow are likely to increase reproducibility.
Organizing the operational steps and production environment decreases the hazards of material handling and consequently improves work safety.

Keywords: additive manufacturing, lean production, reproducibility, work safety

Procedia PDF Downloads 184
11307 Biomass Waste-To-Energy Technical Feasibility Analysis: A Case Study for Processing of Wood Waste in Malta

Authors: G. A. Asciak, C. Camilleri, A. Rizzo

Abstract:

The waste management in Malta is a national challenge. Coupled with Malta’s recent economic boom, which has seen massive growth in several sectors, especially the construction industry, drastic actions need to be taken. Wood waste, currently being dumped in landfills, is one type of waste which has increased astronomically. This research study aims to carry out a thorough examination of the possibility of using this waste as a biomass resource and adopting a waste-to-energy technology in order to generate electrical energy. This study is composed of three distinct yet interdependent phases, namely, data collection from local SMEs, thermal analysis using a bomb calorimeter, and generation of energy from wood waste using a micro biomass plant. Data collection from SMEs specializing in woodwork was carried out to obtain information regarding the available types of wood waste and the annual weight of imported wood, and to analyse the manner in which wood shavings are used after wood is manufactured. From this analysis, it emerged that the five most common types of wood available in Malta which would be suitable for generating energy are Oak (hardwood), Beech (hardwood), Red Beech (softwood), African Walnut (softwood), and Iroko (hardwood). Subsequently, based on the information collected, a thermal analysis using a 6200 Isoperibol calorimeter was performed on the five most common types of wood. This analysis was done so as to give a clear indication of the burning potential, which is valuable when testing the wood in the biomass plant. The experiments carried out in this phase showed that African Walnut generated the highest gross calorific value, meaning that this type of wood released the highest amount of heat during combustion in the calorimeter. This is due to the high presence of extractives and lignin, which accounts for a slightly higher gross calorific value. It is followed by Red Beech and Oak.
Moreover, based on the findings of the first phase, both African Walnut and Red Beech are highly imported into the Maltese Islands for various purposes. Oak, which has the third highest gross calorific value, is the most imported and commonly used wood. From the five types of wood, three were chosen for use in the power plant on the basis of their popularity and heating values. The PP20 biomass plant was used to burn the three types of shavings in order to compare results related to the estimated feedstock consumed by the plant, the highest temperatures generated, the time taken by the plant to reach gasification temperatures, and the projected electrical power attributed to each wood type. From the experiments, it emerged that all three types reached the required gasification temperature and are thus feasible for electrical energy generation; African Walnut was deemed the most suitable fast-burning fuel, followed by Red Beech and Oak, which required a longer period of time to reach the required gasification temperatures. The results obtained provide a clear indication that wood waste need not be dumped in landfills but can instead be treated and coupled with waste-to-energy technology.
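The link between gross calorific value and projected electrical output can be illustrated with a back-of-the-envelope calculation. The sketch below is not the paper's analysis: the calorific values and the plant's electrical efficiency are assumed, illustrative numbers, used only to show how a wood type's heating value translates into an energy estimate.

```python
# Illustrative estimate of electrical energy from wood shavings.
# Gross calorific values (MJ/kg) and the overall electrical
# efficiency are assumed for demonstration, not measured data.
GCV_MJ_PER_KG = {
    "African Walnut": 19.8,   # assumed: highest GCV in the study
    "Red Beech": 18.9,        # assumed
    "Oak": 18.2,              # assumed
}
EFFICIENCY = 0.20             # assumed gasification-plant electrical efficiency
MJ_TO_KWH = 1 / 3.6           # 1 kWh = 3.6 MJ

def electrical_energy_kwh(wood, mass_kg):
    """Electrical energy (kWh) projected from burning mass_kg of the wood."""
    return GCV_MJ_PER_KG[wood] * mass_kg * EFFICIENCY * MJ_TO_KWH

for wood in GCV_MJ_PER_KG:
    print(wood, round(electrical_energy_kwh(wood, 100.0), 1), "kWh per 100 kg")
```

With these assumptions, the ordering of projected output simply follows the ordering of gross calorific values reported in the abstract.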

Keywords: biomass, isoperibol calorimeter, waste-to-energy technology, wood

Procedia PDF Downloads 243
11306 Attribute Index and Classification Method of Earthquake Damage Photographs of Engineering Structure

Authors: Ming Lu, Xiaojun Li, Bodi Lu, Juehui Xing

Abstract:

The damage phenomena of each large earthquake provide a comprehensive and profound real-world test of the dynamic performance and failure mechanisms of different engineering structures. Insights into the characteristics of engineering structures gained from seismic damage phenomena are often far superior to those from expensive shaking table experiments. After an earthquake, people record a variety of different types of engineering damage photographs. However, a large number of earthquake damage photographs lack sufficient background information, which reduces their value. To improve the research value and the use efficiency of engineering seismic damage photographs, this paper aims to explore and present the seismic damage background information, which includes the earthquake magnitude, earthquake intensity, and the characteristics of the damaged structure. Based on the research requirements of the earthquake engineering field, the authors use photographs of the 2008 Wenchuan M8.0 earthquake in China and provide four kinds of attribute indexes and classifications: seismic information, structure type, damaged part, and disaster causation factor. The final objective is to set up an engineering structural seismic damage database based on these four attribute indexes and classifications, and eventually build a website providing the seismic damage photographs.
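A database record organised by the four attribute indexes could be sketched as follows. This is a minimal illustration of the indexing idea only; the field names and the example vocabulary ("RC frame", "ground shaking", etc.) are assumptions, not the authors' actual schema.

```python
from dataclasses import dataclass

# Sketch of one record in the proposed seismic damage photo database,
# organised by the four attribute indexes named in the abstract:
# seismic information, structure type, damaged part, causation factor.
@dataclass
class DamagePhoto:
    photo_id: str
    seismic_info: dict        # e.g. event name, magnitude, intensity
    structure_type: str       # e.g. "RC frame", "masonry", "bridge"
    damaged_part: str         # e.g. "column", "beam-column joint"
    causation_factor: str     # e.g. "ground shaking", "landslide"

photo = DamagePhoto(
    photo_id="WC2008-0001",
    seismic_info={"event": "2008 Wenchuan", "magnitude": 8.0, "intensity": "XI"},
    structure_type="RC frame",
    damaged_part="column",
    causation_factor="ground shaking",
)
print(photo.structure_type)  # RC frame
```

Queries on such records (all photos of column damage in RC frames, say) are what give the indexed photographs their research value over an unlabelled archive.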

Keywords: attribute index, classification method, earthquake damage picture, engineering structure

Procedia PDF Downloads 765
11305 ArcGIS as a Tool for Infrastructure Documentation and Asset Management: Establishing a GIS for Computer Network Documentation

Authors: John Segars

Abstract:

Built out of a real-world need for better, more detailed asset and infrastructure documentation, this project lays out the case for using the database functionality of ArcGIS as a tool to track and maintain infrastructure location, status, maintenance, and serviceability. Workflows and processes are presented and detailed which may be applied to an organization's infrastructure needs and might allow it to make use of the robust tools which surround the ArcGIS platform. The end result is a value-added information system framework with a geographic component (e.g., the spatial location of various I.T. assets): a detailed set of records which not only documents location but also captures the maintenance history of assets, along with photographs and documentation of these assets as attachments to the numerous feature class items. In addition to the asset location and documentation benefits, staff will be able to log into the devices and pull SNMP (Simple Network Management Protocol) query information from within the user interface. The entire collection of information may be displayed in ArcGIS, via a JavaScript-based web application, or via queries to the back-end database. The project is applicable to all organizations which maintain an IT infrastructure but specifically targets post-secondary educational institutions, where access to ESRI resources is generally already available in house.
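The asset-plus-maintenance-history data model described above can be sketched as two relational tables. The fragment below uses SQLite as a stand-in for the PostgreSQL back end, and every table and column name is an illustrative assumption rather than the project's actual schema.

```python
import sqlite3

# Sketch of the asset and maintenance-history tables, with a geographic
# component (lat/lon) per asset. SQLite stands in for PostgreSQL here.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE asset (
    asset_id INTEGER PRIMARY KEY,
    name     TEXT,
    lat      REAL,          -- spatial location of the I.T. asset
    lon      REAL,
    status   TEXT
);
CREATE TABLE maintenance (
    id        INTEGER PRIMARY KEY,
    asset_id  INTEGER REFERENCES asset(asset_id),
    performed TEXT,         -- ISO date of the maintenance event
    notes     TEXT
);
""")
con.execute("INSERT INTO asset VALUES (1, 'core-switch-01', 33.21, -87.54, 'in service')")
con.execute("INSERT INTO maintenance VALUES (1, 1, '2024-05-01', 'firmware upgrade')")

# Maintenance history for one asset, as the web application might query it
rows = con.execute("""
    SELECT a.name, m.performed, m.notes
    FROM asset a JOIN maintenance m ON m.asset_id = a.asset_id
    WHERE a.asset_id = 1
""").fetchall()
print(rows)  # [('core-switch-01', '2024-05-01', 'firmware upgrade')]
```

In the ArcGIS deployment the asset table would be a feature class with true geometry, and photographs and documents would be stored as attachments rather than columns.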

Keywords: ESRI, GIS, infrastructure, network documentation, PostgreSQL

Procedia PDF Downloads 181
11304 The Role of Nutrition and Food Engineering in Promoting Sustainable Food Systems

Authors: Sara Khan Mohammadi

Abstract:

The world is facing a major challenge of feeding a growing population while ensuring the sustainability of food systems. The United Nations estimates that the global population will reach 9.7 billion by 2050, which means that food production needs to increase by 70% to meet the demand. However, this increase in food production should not come at the cost of environmental degradation, loss of biodiversity, and climate change. Therefore, there is a need for sustainable food systems that can provide healthy and nutritious food while minimizing their impact on the environment. Nutrition and food engineering play a crucial role in promoting sustainable food systems. Nutrition is concerned with the study of nutrients in foods, their absorption, metabolism, and their effects on health. Food engineering involves the application of engineering principles to design, develop, and optimize food processing operations. Together, nutrition and food engineering can help to create sustainable food systems by: 1. Developing Nutritious Foods: Nutritionists and food engineers can work together to develop foods that are rich in nutrients such as vitamins, minerals, fiber, and protein. These foods can be designed to meet the nutritional needs of different populations while minimizing waste. 2. Reducing Food Waste: Food waste is a major problem globally, as it contributes to greenhouse gas emissions and wastes resources such as water and land. Nutritionists and food engineers can work together to develop technologies that reduce waste during processing, storage, transportation, and consumption. 3. Improving Food Safety: Unsafe foods can cause illnesses such as diarrhea, cholera, and typhoid fever, which are major public health concerns globally. Nutritionists and food engineers can work together to develop technologies that improve the safety of foods from farm to fork. 4. Enhancing Sustainability: Sustainable agriculture practices such as conservation agriculture can help reduce soil erosion while improving soil fertility. Nutritionists and food engineers can work together to develop technologies that promote sustainable agriculture practices.

Keywords: sustainable food, developing food, reducing food waste, food safety

Procedia PDF Downloads 87
11303 Material Concepts and Processing Methods for Electrical Insulation

Authors: R. Sekula

Abstract:

Epoxy composites are broadly used as electrical insulation for high voltage applications, since only such materials can fulfil the particular mechanical, thermal, and dielectric requirements. However, the properties of the final product are strongly dependent on a proper manufacturing process with minimized material failures, such as excessive shrinkage, voids, and cracks. Therefore, the application of proper materials (epoxy, hardener, and filler) and process parameters (mold temperature, filling time, filling velocity, initial temperature of internal parts, gelation time), as well as design and geometric parameters, are essential for the final quality of the produced components. In this paper, an approach for three-dimensional modeling of all molding stages, namely filling, curing, and post-curing, is presented. The reactive molding simulation tool is based on a commercial CFD package and includes dedicated models describing viscosity and reaction kinetics that have been successfully implemented to simulate the reactive nature of the system with its exothermic effect. A dedicated simulation procedure for stress and shrinkage calculations, as well as simulation results, are also presented in the paper. The second part of the paper is dedicated to recent developments in formulations of functional composites for electrical insulation applications, focusing on thermally conductive materials. Concepts based on filler modifications for epoxy electrical composites are presented, including the resulting properties. Finally, with tough environmental regulations in mind, in addition to current process and design aspects, an approach for product re-design is presented, focusing on the replacement of the epoxy material with a thermoplastic one. Such a “design-for-recycling” method is one of the new directions associated with the development of new material and processing concepts for electrical products, and brings many additional research challenges. One successful product is presented to illustrate this methodology.
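The reaction-kinetics model at the heart of such a reactive molding simulation can be sketched in a few lines. The Kamal autocatalytic model is a common choice for epoxy cure kinetics, but note that the paper does not disclose its specific model or parameters; everything below (model choice, rate constants, exponents) is an illustrative assumption.

```python
# Minimal sketch of epoxy cure kinetics via the Kamal autocatalytic
# model: d(alpha)/dt = (k1 + k2 * alpha^m) * (1 - alpha)^n,
# where alpha is the degree of cure. Parameter values are assumed
# for illustration, not the paper's calibrated constants.
def kamal_rate(alpha, k1, k2, m, n):
    """Cure rate d(alpha)/dt for degree of cure alpha in [0, 1)."""
    return (k1 + k2 * alpha**m) * (1.0 - alpha)**n

def integrate_cure(t_end, dt=0.01, k1=1e-3, k2=0.05, m=1.0, n=1.5):
    """Explicit Euler integration of the degree of cure over time."""
    alpha = 0.0
    for _ in range(int(t_end / dt)):
        alpha += dt * kamal_rate(alpha, k1, k2, m, n)
    return alpha

# Cure advances monotonically towards alpha = 1 (full cure)
print(round(integrate_cure(60.0), 3))
print(round(integrate_cure(600.0), 3))
```

In the full simulation this rate equation is coupled, per CFD cell, to the energy balance (the exothermic heat release) and to a cure- and temperature-dependent viscosity model.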

Keywords: curing, epoxy insulation, numerical simulations, recycling

Procedia PDF Downloads 279
11302 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman electricity Transmission Company

Authors: Rahma Saleh Hussein Al Balushi

Abstract:

Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indications of achieving this vision is obtaining Asset Management ISO 55001 certification, which requires setting out documented Standard Operating Procedures (SOP). Hence, a documented SOP for the Geographical Information System (GIS) data process has been established. Also, to effectively manage and improve OETC power transmission, asset data and information need to be governed as such by the Asset Information & GIS department. This paper describes in detail the current GIS data submission process and the journey of developing it. The methodology used to develop the process is based on three main pillars: system and end-user requirements, risk evaluation, and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and up-to-date data. Furthermore, owing to this process, GIS is ready to be integrated with other systems and to serve as the source of data for all OETC users. Some decisions related to issuing No Objection Certificates (NOC) for excavation permits and scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) have consequently been made upon GIS data availability. In addition, defining agreed and documented procedures for data collection, data system updates, data release/reporting, and data alterations has also contributed to reducing missing attributes and enhancing the data quality index of GIS transmission data. A considerable difference in Geodatabase (GDB) completeness percentage was observed between 2017 and 2022. Overall, it is concluded that through governance, the Asset Information & GIS department can control the GIS data process: collect, properly record, and manage asset data and information within the OETC network.
This control extends to other applications and systems integrated with/related to GIS systems.

Keywords: asset management ISO55001, standard procedures process, governance, CMMS

Procedia PDF Downloads 125
11301 Assessment of Environmental Mercury Contamination from an Old Mercury Processing Plant 'Thor Chemicals' in Cato Ridge, KwaZulu-Natal, South Africa

Authors: Yohana Fessehazion

Abstract:

Mercury is a prominent example of a heavy metal contaminant in the environment, and it has been extensively investigated for its potential health risk to humans and other organisms. In South Africa, massive mercury contamination occurred in the 1980s when an England-based mercury reclamation processing plant relocated to Cato Ridge, KwaZulu-Natal Province, and discharged mercury waste into the Mngceweni River. This discharge resulted in mercury concentrations that exceeded acceptable levels in the Mngceweni River, the Umgeni River, and the hair of nearby villagers. This environmental issue raised the alarm, and over the years several environmental assessments reported the dire environmental crisis resulting from Thor Chemicals (now known as Metallica Chemicals) and urged the immediate removal of the roughly 3,000 tons of mercury waste stored in the factory storage facility for over two decades. Recently, the theft of containers of the toxic substance from the Thor Chemicals warehouse, and the subsequent fire that ravaged the facility, further put the factory in the spotlight and escalated the urgency of removing the deadly mercury waste left behind. This project aims to investigate the mercury contamination leaking from the old Thor Chemicals mercury processing plant. The focus will be on sediments, water, terrestrial plants, and aquatic weeds such as the prominent water hyacinth in the nearby water systems of the Mngceweni River, Umgeni River, and Inanda Dam, as bio-indicators and phytoremediators of mercury pollution. Samples will be collected in spring, around October, when conditions favour microbial methylation of mercury incorporated in sediments and when some aquatic weeds, particularly water hyacinth, are blooming. Samples of soil, sediment, water, terrestrial plants, and aquatic weeds will be collected per sample site from the point of source (Thor Chemicals), the Mngceweni River, the Umgeni River, and the Inanda Dam.
One-way analysis of variance (ANOVA) tests will be conducted to determine any significant differences in Hg concentration among the sampling sites, followed by a Least Significant Difference post hoc test to determine whether mercury contamination varies with distance from the source point of pollution. Flow injection atomic spectrometry (FIAS) analysis will also be used to compare mercury sequestration between different plant tissues (roots and stems). Principal component analysis is also envisaged to determine the relationship between the source of mercury pollution and each of the sampling points (the Umgeni and Mngceweni Rivers and the Inanda Dam). All Hg values will be expressed in µg/L or µg/g in order to compare the results with previous studies and regulatory standards. Sediments are expected to have relatively higher levels of Hg than soils, and aquatic macrophytes such as water hyacinth are expected to accumulate higher concentrations of mercury than terrestrial plants and crops.
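The planned one-way ANOVA across sampling sites can be sketched directly from its definition. The fragment below implements the F statistic with numpy; the Hg concentrations are invented for illustration only, since the study has not yet collected data.

```python
import numpy as np

# One-way ANOVA F statistic for k independent groups, applied to
# made-up Hg concentrations (µg/g) at three sampling sites.
def one_way_anova(groups):
    """Return the F statistic: between-group over within-group variance."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = np.mean(np.concatenate(groups))
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g) - np.mean(g)) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

sites = [
    [4.1, 3.8, 4.4, 4.0],   # point of source (illustrative values)
    [2.2, 2.0, 2.5, 2.3],   # downstream river site (illustrative)
    [1.1, 0.9, 1.2, 1.0],   # furthest site (illustrative)
]
print(round(one_way_anova(sites), 1))  # large F: site means clearly differ
```

A large F relative to the F distribution's critical value (here with 2 and 9 degrees of freedom) would justify the follow-up Least Significant Difference comparisons between site pairs.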

Keywords: mercury, phytoremediation, Thor chemicals, water hyacinth

Procedia PDF Downloads 223
11300 Infrastructural Barriers to Engaged Learning in the South Pacific: A Mixed-Methods Study of Cook Islands Nurses' Attitudes towards Health Information Technology

Authors: Jonathan Frank, Michelle Salmona

Abstract:

We conducted quantitative and qualitative analyses of nurses’ perceived ease of use of electronic medical records and telemedicine in the Cook Islands. We examined antecedents of perceived ease of use through the lens of social construction of learning, and cultural diffusion. Our findings confirmed expected linkages between PEOU, attitudes and intentions. Interviews with nurses suggested infrastructural barriers to engaged learning. We discussed managerial implications of our findings, and areas of interest for future research.

Keywords: health information technology, ICT4D, TAM, developing countries

Procedia PDF Downloads 289
11299 The Implementation of Level of Service for Development of Kuala Lumpur Transit Information System using GIS

Authors: Mokhtar Azizi

Abstract:

Due to heavy traffic and congested roads, it is crucial that the most popular public transport services in Kuala Lumpur, i.e., Putra LRT, Star LRT, KTM Commuter, KL Monorail, and Rapid Bus, be continuously monitored, improved to fulfil riders' requirements, and kept updated by the transit agencies. The current status of the services has been evaluated by calculating the transit supportive area (TSA) and level of service (LOS) for each transit station. This research study has carried out the TSA and LOS mapping based on GIS techniques. The detailed census data of the region along the lines of service were collected from the Department of Statistics Malaysia for this purpose. The service coverage was defined by a 400-metre buffer zone for bus stations and an 800-metre buffer zone for rail stations and railways when measuring the quality of service along the lines. All the required information was calculated using customized GIS software called the Kuala Lumpur Transit Information System (KLTIS). The transit supportive area was defined by an employment density of at least 10 jobs/hectare or a household density of at least 7.5 units/hectare; the total area covered by the transit supportive area is 22,516 hectares, and the total area not supported by transit is 1,718 hectares in Kuala Lumpur. The level of service is calculated as the percentage of the transit supportive area served by transit for each station. Overall, the percentage of the transit supportive area served by transit was less than 50% for all stations, which falls into a very low level of service category. This research has proven its benefit by providing the current transit service operators with vital information for the improvement of existing public transport services.
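The TSA criterion and the LOS percentage can be sketched on a toy raster. The fragment below applies the paper's thresholds (at least 10 jobs/hectare or 7.5 households/hectare) to a made-up 3x3 grid of 1-hectare cells; the density values and the precomputed station-buffer mask are illustrative assumptions standing in for the census data and the 400 m/800 m GIS buffers.

```python
import numpy as np

# Toy grid of 1-hectare cells with assumed densities
jobs = np.array([[12, 4, 11], [9, 2, 15], [3, 8, 10]])        # jobs/ha
households = np.array([[5, 8, 2], [7, 9, 1], [8, 2, 3]])      # units/ha

# TSA criterion from the paper: >= 10 jobs/ha OR >= 7.5 households/ha
tsa = (jobs >= 10) | (households >= 7.5)

# Cells inside a station's service buffer (400 m bus / 800 m rail),
# precomputed here as a boolean mask for illustration
served = np.array([[1, 1, 0], [1, 1, 0], [0, 0, 0]], dtype=bool)

tsa_area_ha = int(tsa.sum())                      # 1 ha per cell
los = 100.0 * (tsa & served).sum() / tsa.sum()    # % of TSA served
print(tsa_area_ha, "ha of TSA,", round(los, 1), "% served by transit")
```

In KLTIS the same computation runs over census polygons and real station buffers; a station whose served share of the TSA falls below 50%, as here, lands in the very low LOS category.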

Keywords: service coverage, transit supportive area, level of service, transit system

Procedia PDF Downloads 376
11298 Foundation Settlement Determination: A Simplified Approach

Authors: Adewoyin O. Olusegun, Emmanuel O. Joshua, Marvel L. Akinyemi

Abstract:

The heterogeneous nature of the subsurface requires the use of factual information rather than assumptions or generalized equations. Therefore, there is a need to determine the actual rate of settlement possible in the soil before structures are built on it. This information will help in determining the type of foundation design and the kind of reinforcement that will be necessary in construction. This paper presents a simplified and faster approach for determining foundation settlement in any type of soil, using real field data acquired from seismic refraction techniques and cone penetration tests. This approach was also able to determine the depth of settlement of each stratum of soil. The results obtained revealed the different settlement times and depths of settlement possible.
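One way seismic refraction data can feed a settlement estimate is through stratum stiffness: shear-wave velocity gives the small-strain shear modulus, hence a Young's modulus for the classical elastic settlement formula. The sketch below is not the paper's method, only a plausible illustration of the idea; all soil parameters, the influence factor, and the loading are assumed values, and small-strain moduli from seismic data typically overestimate stiffness at foundation strain levels.

```python
# Per-stratum elastic settlement sketch from seismic refraction data:
# G = rho * Vs^2 (small-strain shear modulus), E = 2 * G * (1 + nu),
# then the classical flexible-footing solution
# S = q * B * (1 - nu^2) * I / E. All numbers are illustrative.
def youngs_modulus(rho_kg_m3, vs_m_s, nu=0.3):
    """Young's modulus (Pa) from density and shear-wave velocity."""
    g = rho_kg_m3 * vs_m_s ** 2   # shear modulus, Pa
    return 2.0 * g * (1.0 + nu)

def elastic_settlement(q_pa, b_m, e_pa, nu=0.3, influence=0.88):
    """Immediate settlement (m) of a flexible footing on one stratum."""
    return q_pa * b_m * (1.0 - nu ** 2) * influence / e_pa

E = youngs_modulus(rho_kg_m3=1800.0, vs_m_s=250.0)   # assumed stratum
s = elastic_settlement(q_pa=150e3, b_m=2.0, e_pa=E)  # assumed 150 kPa, 2 m footing
print(f"E = {E / 1e6:.0f} MPa, settlement = {s * 1000:.2f} mm")
```

Repeating the calculation stratum by stratum, with cone penetration test data constraining the layer boundaries, mirrors the per-stratum settlement depths discussed in the abstract.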

Keywords: heterogeneous, settlement, foundation, seismic, technique

Procedia PDF Downloads 445
11297 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering  

Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi

Abstract:

In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA (cfDNA). Accordingly, identifying the tissue origin of these DNA fragments from plasma can result in more accurate and faster disease diagnosis and precise treatment protocols. Open chromatin regions (OCRs) are important epigenetic features of DNA that reflect the cell types of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. There have been several studies in the area of cancer liquid biopsy that integrate distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which leads to considerable cost and time. To overcome these limitations, the idea of predicting OCRs from whole genome sequencing (WGS) data is of particular importance. In this regard, we propose a computational approach to predict open chromatin regions, as an important epigenetic feature, from cell-free DNA whole genome sequencing data. To fulfill this objective, local sequencing depth is fed to our proposed algorithm, which predicts the most probable open chromatin regions from whole genome sequencing data. Our method integrates a signal processing approach with sequencing depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph cut optimization by linear programming, and clustering.
To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions from human blood samples in the ATAC-DB database. The overlap between the predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates meaningful prediction. As expected, OCRs are mostly located at the transcription start sites (TSS) of genes. In this regard, we compared the concordance between the predicted OCRs and human gene TSS regions obtained from refTSS, finding agreement of around 52.04% with all genes and ~78% with housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to the existence of several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been an attempt to apply it, leading to a tool named OCRDetector, with some restrictions such as the need for high-depth cfDNA WGS data, prior information about the OCR distribution, and the consideration of multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised learning manner, which resulted in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate the genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy-related analysis.
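The signal-processing core of the pipeline can be illustrated on a synthetic depth track. The toy sketch below normalises coverage, smooths it with a low-pass Discrete Fourier Transform filter, and labels low-coverage windows as candidate OCRs (open chromatin is nucleosome-depleted, so cfDNA coverage dips there). The graph-cut/linear-programming step is replaced by a simple threshold, and the depth track is simulated; both are illustrative simplifications of the actual method.

```python
import numpy as np

# Simulated local sequencing depth over 256 genomic windows, with a
# coverage dip at windows 80-120 standing in for an open chromatin region
rng = np.random.default_rng(0)
n = 256
depth = 100.0 + 10.0 * rng.standard_normal(n)
depth[80:120] -= 40.0

# Count normalisation to zero mean, unit variance
z = (depth - depth.mean()) / depth.std()

# Low-pass filter: keep only the lowest-frequency DFT components,
# suppressing window-to-window sequencing noise
spec = np.fft.rfft(z)
spec[10:] = 0.0
smooth = np.fft.irfft(spec, n)

# Two-way labelling (OCR+ / OCR-) by thresholding the smoothed signal;
# the real method does this partition via graph cut optimisation
labels = smooth < -0.5
print("candidate OCR+ windows:", int(labels.sum()))
```

The threshold step is where correlation clustering on the depth-similarity graph takes over in the full method, turning the pairwise window correlations into an OCR+/OCR- partition without a hand-picked cutoff.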

Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering

Procedia PDF Downloads 150
11296 Risk Based Building Information Modeling (BIM) for Urban Infrastructure Transportation Project

Authors: Debasis Sarkar

Abstract:

Building Information Modeling (BIM) is a holistic documentation process for operational visualization, design coordination, estimation, and project scheduling. BIM software defines objects parametrically and is a tool for virtual reality. The primary advantage of implementing BIM is the visual coordination of the building structure and systems such as Mechanical, Electrical and Plumbing (MEP); it also identifies possible conflicts between the building systems. This paper is an attempt to develop a risk-based BIM model which would highlight the primary advantages of applying BIM to urban infrastructure transportation projects. It has been observed that about 40% of the Architecture, Engineering and Construction (AEC) companies use BIM, but primarily for their outsourced projects. Also, 65% of the respondents agree that BIM will be used quite strongly for future construction projects in India. The 3D models developed with Revit 2015 software would reduce coordination problems amongst the architects, structural engineers, contractors, and building service providers (MEP). Integration of risk management with BIM would provide enhanced coordination, collaboration, and a high probability of successful completion of complex infrastructure transportation projects within the stipulated time and cost frame.

Keywords: building information modeling (BIM), infrastructure transportation, project risk management, underground metro rail

Procedia PDF Downloads 310