Search results for: field data processing
32034 Limbic Involvement in Visual Processing
Authors: Deborah Zelinsky
Abstract:
The retina filters millions of incoming signals into a smaller number of exiting optic nerve fibers that travel to different portions of the brain. Most of the signals are for eyesight (called "image-forming" signals). However, there are other, faster signals that travel "elsewhere" and are not directly involved with eyesight (called "non-image-forming" signals). This article centers on the neurons of the optic nerve connecting to parts of the limbic system. Eye care providers currently look at parvocellular and magnocellular processing pathways without realizing that those are part of an enormous "galaxy" of all the body systems. Lenses modify both non-image-forming and image-forming pathways, taking A.M. Skeffington's seminal work one step further. Almost 100 years ago, he described the "Where am I" (orientation), "Where is It" (localization), and "What is It" (identification) pathways. Now, among others, there are a "How am I" (animation) and a "Who am I" (inclination, motivation, imagination) pathway. Classic eye testing considers pupils and often assesses posture and motion awareness, but classical prescriptions often overlook limbic involvement in visual processing. The limbic system is composed of the hippocampus, amygdala, hypothalamus, and anterior nuclei of the thalamus. The optic nerve's limbic connections arise from the intrinsically photosensitive retinal ganglion cells (ipRGC) through the retinohypothalamic tract (RHT). There are two main hypothalamic nuclei with direct photic inputs: the suprachiasmatic nucleus and the paraventricular nucleus. Other hypothalamic nuclei connected with retinal function, including mood regulation, appetite, and glucose regulation, are the supraoptic nucleus and the arcuate nucleus. The retinohypothalamic tract is often overlooked when we prescribe eyeglasses. Each person is different, but the lenses we choose influence this fast processing, which affects each patient's aiming and focusing abilities.
These signals arise from the ipRGC cells, which were discovered only about 20 years ago; this discussion does not address the campana retinal interneurons, discovered only two years ago. As eyecare providers, we are unknowingly altering such factors as lymph flow, glucose metabolism, appetite, and sleep cycles in our patients. It is important to know what we are prescribing as visual processing evaluations expand past 20/20 central eyesight.
Keywords: neuromodulation, retinal processing, retinohypothalamic tract, limbic system, visual processing
Procedia PDF Downloads 90
32033 Data Integrity: Challenges in Health Information Systems in South Africa
Authors: T. Thulare, M. Herselman, A. Botha
Abstract:
Poor system use, including inappropriate design of health information systems, causes difficulties in communication with patients and increases the time healthcare professionals spend recording the necessary health information for medical records. System features like pop-up reminders, complex menus, and poor user interfaces can make medical records far more time-consuming than paper cards and can affect decision-making processes. Although errors associated with health information, and their real and likely effects on the quality of care and patient safety, have been documented for many years, more research is needed to measure the occurrence of these errors and determine their causes so that solutions can be implemented. The purpose of this paper is therefore to identify data integrity challenges in hospital information systems through a scoping review and, based on the results, to provide recommendations on how to manage them. Only 34 papers out of the 297 publications initially identified in the field were found suitable. The results indicated that human and computerized systems are the most common sources of data integrity challenges, and that factors such as policy, environment, health workforce, and lack of awareness contribute to these challenges; if appropriate measures are taken, however, the challenges can be managed.
Keywords: data integrity, data integrity challenges, hospital information systems, South Africa
Procedia PDF Downloads 181
32032 Large-Capacity Image Information Reduction Based on Single-Cue Saliency Map for Retinal Prosthesis System
Authors: Yili Chen, Xiaokun Liang, Zhicheng Zhang, Yaoqin Xie
Abstract:
In an effort to restore visual perception in retinal diseases, an electronic retinal prosthesis with thousands of electrodes has been developed. The image processing strategies of a retinal prosthesis system convert the original images from the camera into a stimulus pattern that can be interpreted by the brain. In practice, the original images have a much higher resolution (256x256) than the stimulus pattern (such as 25x25), which poses the technical challenge of large-capacity image information reduction. In this paper, we focus on developing an efficient stimulus pattern extraction algorithm that uses a single-cue saliency map to extract salient objects in the image with an optimal trimming threshold. Experimental results showed that the proposed stimulus pattern extraction algorithm performs well for different scenes. In the algorithm performance experiment, our proposed SCSPE algorithm achieved almost five times the score of Boyle's algorithm. The experiments suggest that when there are salient objects in the scene (such as when a blind user meets or talks with people), the trimming threshold should be set around 0.4 of the saliency maximum; in other situations, trimming threshold values between 0.2 and 0.4 of the maximum give a satisfactory stimulus pattern.
Keywords: retinal prosthesis, image processing, region of interest, saliency map, trimming threshold selection
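The trimming-threshold idea can be sketched in a few lines. The snippet below is a hypothetical illustration, not the authors' SCSPE code: it uses absolute deviation from the mean intensity as a stand-in single-cue saliency map, keeps only pixels above `trim_ratio` times the saliency maximum, and block-averages the result down to a 25x25 electrode grid.

```python
import numpy as np

def stimulus_pattern(image, out_size=25, trim_ratio=0.4):
    """Reduce a high-resolution grayscale image to a low-resolution
    binary stimulus pattern using a saliency-style trimming threshold."""
    # Stand-in single-cue saliency: absolute deviation from mean intensity.
    saliency = np.abs(image - image.mean())
    # Trim: keep only pixels whose saliency exceeds trim_ratio * max.
    mask = saliency >= trim_ratio * saliency.max()
    salient = np.where(mask, image, 0.0)
    # Block-average down to an out_size x out_size electrode grid.
    h, w = salient.shape
    bh, bw = h // out_size, w // out_size
    blocks = salient[:bh * out_size, :bw * out_size]
    blocks = blocks.reshape(out_size, bh, out_size, bw)
    pattern = blocks.mean(axis=(1, 3))
    return (pattern > 0).astype(np.uint8)

img = np.zeros((256, 256))
img[100:150, 100:150] = 1.0       # a single bright "salient object"
p = stimulus_pattern(img)
print(p.shape)                    # (25, 25)
```

With this synthetic input, only the 5x5 block of electrodes covering the bright square is activated; a real pipeline would replace the stand-in saliency with a proper single-cue saliency map.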
Procedia PDF Downloads 248
32031 Presenting a Model in the Analysis of Supply Chain Management Components by Using Statistical Distribution Functions
Authors: Ramin Rostamkhani, Thurasamy Ramayah
Abstract:
One of the most important topics for today's industrial organizations is the challenging issue of supply chain management. In this field, scientists and researchers have published numerous practical articles and models, especially in the last decade. This research considers the modeling of supply chain management component data using well-known statistical distribution functions. Describing the behavior of supply chain data through the characteristics of statistical distribution functions is, to the best of our knowledge, an approach that has not been published before. In an analytical process, describing different aspects of the functions, including the probability density, cumulative distribution, reliability, and failure functions, can identify the suitable statistical distribution function for each component of supply chain management, which can then be applied to predict the future behavior of the relevant component. Providing a model that fits the best statistical distribution function to supply chain management components can substantially change how the behavior of supply chain elements is analyzed in today's industrial organizations. As a final step, the results of the proposed model are demonstrated by introducing process capability indices before and after its implementation, and the approach is verified through the relevant assessment. The introduced approach can save the time and cost required to achieve organizational goals and can increase added value in the organization.
Keywords: analyzing, process capability indices, statistical distribution functions, supply chain management components
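The core step of matching a supply chain variable to a statistical distribution function can be sketched as follows. This is a minimal illustration under assumed conditions, not the paper's procedure: the lead-time data are synthetic, the candidate distributions (Weibull, lognormal, gamma) are illustrative, and the Kolmogorov-Smirnov statistic is used as a simple goodness-of-fit criterion.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical supply chain component data: supplier lead times in days,
# drawn here from a Weibull distribution so the right answer is known.
lead_times = rng.weibull(2.0, 500) * 10

candidates = {"weibull": stats.weibull_min,
              "lognorm": stats.lognorm,
              "gamma": stats.gamma}
results = {}
for name, dist in candidates.items():
    params = dist.fit(lead_times, floc=0)           # fix location at zero
    ks = stats.kstest(lead_times, dist.cdf, args=params)
    results[name] = ks.statistic                    # smaller = better fit
best = min(results, key=results.get)
print(best, round(results[best], 3))
```

The fitted parameters of the winning distribution can then feed the reliability and failure functions and, through them, the process capability indices mentioned above.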
Procedia PDF Downloads 87
32030 Fostering Students' Engagement with Historical Issues Surrounding the Field of Graphic Design
Authors: Sara Corvino
Abstract:
The aim of this study is to explore the potential of inclusive learning and assessment strategies to foster students' engagement with historical debates surrounding the field of graphic design. The goal is to respond to the diversity of L4 Graphic Design students at Nottingham Trent University in a way that, instead of 'lowering standards', can benefit everyone. This research tests, measures, and evaluates the impact of a specific intervention, an assessment task, designed to develop students' critical visual analysis skills and stimulate a deeper engagement with the subject matter. Within an action research approach, this work followed a case study research method to understand students' views and perceptions of a specific project. The primary methods of data collection were an anonymous electronic questionnaire and a paper-based anonymous critical incident questionnaire. The NTU College of Business, Law and Social Sciences Research Ethics Committee granted ethical approval for this research in November 2019. Other methods used to evaluate the impact of the assessment task were EvaSys reports and students' performance. In line with the constructivist paradigm, this study embraces an interpretative and contextualized analysis of the collected data within a triangulation analytical framework. The evaluation of both qualitative and quantitative data demonstrates that active learning strategies and the disruption of thinking patterns can foster greater student engagement and lead to meaningful learning.
Keywords: active learning, assessment for learning, graphic design, higher education, student engagement
Procedia PDF Downloads 181
32029 System Dietadhoc® - A Fusion of Human-Centred Design and Agile Development for the Explainability of AI Techniques Based on Nutritional and Clinical Data
Authors: Michelangelo Sofo, Giuseppe Labianca
Abstract:
In recent years, the scientific community's interest in the exploratory analysis of biomedical data has increased exponentially. For nutritional biologists, the curative process based on the analysis of clinical data is a very delicate operation, because there are multiple solutions for the management of pathologies in the food sector (for example, intolerances and allergies, cholesterol metabolism management, diabetic pathologies, arterial hypertension, and even obesity and breathing and sleep problems). In this research work, a system was created that is capable of evaluating various dietary regimes for specific patient pathologies. The system is founded on a mathematical-numerical model and has been tailored to the real working needs of an expert in human nutrition using human-centered design (ISO 9241-210); it therefore keeps step with continuous scientific progress in the field and evolves through the experience of managed clinical cases (a machine learning process). DietAdhoc® is a decision support system for nutrition specialists treating patients of both sexes (from 18 years of age), developed with an agile methodology. Its task is to draw up the biomedical and clinical profile of the specific patient by applying two algorithmic optimization approaches to the nutritional data, plus a symbolic solution obtained by transforming the relational database underlying the system into a deductive database. For all three solution approaches, particular emphasis has been given to the explainability of the suggested clinical decisions through flexible and customizable user interfaces.
Furthermore, the system has multiple software modules, based on time series and visual analytics techniques, that allow the complete picture of the situation and the evolution of the diet assigned for specific pathologies to be evaluated.
Keywords: medical decision support, physiological data extraction, data driven diagnosis, human centered AI, symbiotic AI paradigm
Procedia PDF Downloads 26
32028 X-Ray Diffraction, Microstructure, and Mössbauer Studies of Nanostructured Materials Obtained by High-Energy Ball Milling
Authors: N. Boudinar, A. Djekoun, A. Otmani, B. Bouzabata, J. M. Greneche
Abstract:
High-energy ball milling is a solid-state powder processing technique that allows synthesizing a variety of equilibrium and non-equilibrium alloy phases starting from elemental powders. The advantage of this process technology is that the powder can be produced in large quantities and the processing parameters can be easily controlled; it is thus a suitable method for commercial applications. It can also be used to produce amorphous and nanocrystalline materials in commercially relevant amounts and is amenable to a variety of alloy compositions. Mechanical alloying (high-energy ball milling) achieves an inter-dispersion of elements through repeated cold welding and fracture of free powder particles; the grain size decreases to the nanometric scale and the elements mix together. Progressively, the concentration gradients disappear and eventually the elements are mixed at the atomic scale. The end products depend on many parameters, such as the milling conditions and the thermodynamic properties of the milled system. Here, the mechanical alloying technique has been used to prepare nanocrystalline Fe-50 and Fe-64 wt.% Ni alloys from powder mixtures. Scanning electron microscopy (SEM) with energy-dispersive X-ray analysis and Mössbauer spectroscopy were used to study the mixing at the nanometric scale. Mössbauer spectroscopy confirmed the ferromagnetic ordering and was used to calculate the hyperfine field distribution. The Mössbauer spectrum of both alloys shows the existence of a ferromagnetic phase attributed to the γ-Fe-Ni solid solution.
Keywords: nanocrystalline, mechanical alloying, X-ray diffraction, Mössbauer spectroscopy, phase transformations
Procedia PDF Downloads 437
32027 Increasing Efficiency of Own Used Fuel Gas by “LOTION” Method in Generating Systems PT. Pertamina EP Cepu Donggi Matindok Field in Central Sulawesi Province, Indonesia
Authors: Ridwan Kiay Demak, Firmansyahrullah, Muchammad Sibro Mulis, Eko Tri Wasisto, Nixon Poltak Frederic, Agung Putu Andika, Lapo Ajis Kamamu, Muhammad Sobirin, Kornelius Eppang
Abstract:
PC Prove LSM successfully improved the efficiency of own-used fuel gas with the "LOTION" (load priority selection) method in the PT Pertamina EP Cepu Donggi Matindok generating system. The method provides a priority qualification of main and non-main equipment so that gas processing keeps running even with only one GTG in operation. The GTG operating system has been integrated, controlled, and monitored through PC programs and web-based access, answering Industry 4.0 requirements. These improvements helped Donggi Matindok Field production reach 98.77 MMSCFD and made the field a PROPER EMAS candidate for 2022-2023. Additional revenue from the increased efficiency of own-used gas amounts to USD 5.06 million per year, and reduced operational costs from maintenance efficiency (ABO), due to saved GTG running hours, amount to USD 3.26 million per year. Continuity of fuel gas availability for the GTG generating system, at 3.833333 MMSCFD, maintains the operational reliability of the plant, and gas emissions to the environment are reduced by 33,810 tons of CO2 eq per year.
Keywords: LOTION method, load priority selection, fuel gas efficiency, gas turbine generator, reduce emissions
Procedia PDF Downloads 62
32026 Advances in Mathematical Sciences: Unveiling the Power of Data Analytics
Authors: Zahid Ullah, Atlas Khan
Abstract:
The rapid advancements in data collection, storage, and processing capabilities have led to an explosion of data in various domains. In this era of big data, the mathematical sciences play a crucial role in uncovering valuable insights and driving informed decision-making through data analytics. The purpose of this abstract is to present the latest advances in the mathematical sciences and their application in harnessing the power of data analytics. It highlights the interdisciplinary nature of data analytics, showcasing how mathematics intersects with statistics, computer science, and related fields to develop cutting-edge methodologies, and it explores key mathematical techniques, such as optimization, mathematical modeling, network analysis, and computational algorithms, that underpin effective data analysis and interpretation. It then turns to the role of the mathematical sciences in addressing real-world challenges across different sectors, including finance, healthcare, engineering, and the social sciences, showing how mathematical models and statistical methods extract meaningful insights from complex datasets, facilitating evidence-based decision-making and driving innovation. Furthermore, it stresses the importance of collaboration and knowledge exchange among researchers, practitioners, and industry professionals, recognizing the value of interdisciplinary collaborations and the need to bridge the gap between academia and industry to ensure the practical application of mathematical advancements in data analytics. Finally, it underlines the significance of ongoing research in the mathematical sciences and the need for continued exploration and innovation in mathematical methodologies to tackle emerging challenges in the era of big data and digital transformation.
In summary, this abstract sheds light on the advances in the mathematical sciences and their pivotal role in unveiling the power of data analytics. It calls for interdisciplinary collaboration, knowledge exchange, and ongoing research to further unlock the potential of mathematical methodologies in addressing complex problems and driving data-driven decision-making in various domains.
Keywords: mathematical sciences, data analytics, advances, unveiling
Procedia PDF Downloads 94
32025 Algorithm for Path Recognition in-between Tree Rows for Agricultural Wheeled-Mobile Robots
Authors: Anderson Rocha, Pedro Miguel de Figueiredo Dinis Oliveira Gaspar
Abstract:
Machine vision has been widely used in agriculture in recent years as a tool to promote the automation of processes and increase levels of productivity. The aim of this work is the development of a path recognition algorithm based on image processing to guide a terrestrial robot in-between tree rows. The proposed algorithm was developed in MATLAB, and it uses several image processing operations, such as threshold detection, morphological erosion, histogram equalization, and the Hough transform, to find edge lines along tree rows in an image and to create a path to be followed by a mobile robot. To develop the algorithm, a set of images of different types of orchards was used, which made it possible to construct a method capable of identifying paths between trees of different heights and aspects. The algorithm was evaluated on several images with different quality characteristics, and the results showed that the proposed method can successfully detect a path in different types of environments.
Keywords: agricultural mobile robot, image processing, path recognition, Hough transform
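The Hough transform step at the heart of this pipeline can be illustrated with a minimal NumPy-only sketch (the paper's implementation is in MATLAB; the edge image here is synthetic, and the thresholding, erosion, and equalization stages are omitted). Each edge pixel votes for all lines passing through it in (rho, theta) space, and the accumulator maximum identifies the dominant tree-row line.

```python
import numpy as np

def hough_strongest_line(edges, n_theta=180):
    """Minimal Hough transform: return (rho, theta) of the strongest line."""
    ys, xs = np.nonzero(edges)
    thetas = np.deg2rad(np.arange(n_theta))      # 0..179 degrees
    diag = int(np.ceil(np.hypot(*edges.shape)))  # max possible |rho|
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    for x, y in zip(xs, ys):
        # rho = x*cos(theta) + y*sin(theta), shifted so indices are >= 0
        rhos = (x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1       # one vote per theta
    r, t = np.unravel_index(acc.argmax(), acc.shape)
    return r - diag, thetas[t]

# Synthetic edge image: a vertical "tree row" at x = 30.
edges = np.zeros((100, 100), dtype=bool)
edges[:, 30] = True
rho, theta = hough_strongest_line(edges)
print(rho, np.rad2deg(theta))   # 30 0.0  (a vertical line at x = 30)
```

In the full algorithm, the two strongest near-vertical lines on either side of the image would be averaged to produce the navigation path between the rows.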
Procedia PDF Downloads 147
32024 Injury Pattern of Field Hockey Players at Different Field Position during Game and Practice
Authors: Sujay Bisht
Abstract:
The purpose of the study was to assess and examine the pattern of injury among field hockey players at different field positions during practice and games. It was hypothesized that the backfield might have the highest rate of injury, followed by the midfield. Methods: University-level and national-level male field hockey players (N=60) were selected as subjects and asked to respond to an anonymous questionnaire. Personal characteristics of each player were collected (age, height, weight), along with professional field hockey information (level of play, years of experience, playing surface) and injury history (site, type, cause, etc.). The rates of injury per athlete per year were also calculated. Results: Around half of the injuries (49%) were to the lower limbs, followed by the head and face (30%), upper limbs (19%), and torso (2%). Injuries included concussions, wounds, broken noses, ligament sprains, dislocations, fractures, muscle strains, and knee injuries, with ligament sprains the most common type (40%). Backfield players had the highest injury risk (1.10 injuries/athlete-year), followed by midfield players (0.70 injuries/athlete-year), forwards (0.45 injuries/athlete-year), and goalkeepers (0.37 injuries/athlete-year). Conclusion: The pattern and rate of injury differed across field positions. The lower limbs had the highest rate of injury, followed by the head and face, upper limbs, and torso, respectively. The study also revealed differences not only in the rate of injury between playing positions but also in the types of injury sustained at different positions.
Keywords: trauma, sprain, strain, astroturf, acute injury
Procedia PDF Downloads 225
32023 Close-Range Remote Sensing Techniques for Analyzing Rock Discontinuity Properties
Authors: Sina Fatolahzadeh, Sergio A. Sepúlveda
Abstract:
This paper presents advanced developments in close-range, terrestrial remote sensing techniques to enhance the characterization of rock masses. The study integrates two state-of-the-art laser-scanning technologies, the HandySCAN and GeoSLAM laser scanners, to extract high-resolution geospatial data for rock mass analysis. These instruments offer high accuracy, precision, low acquisition time, and high efficiency in capturing intricate geological features in small to medium-sized outcrops and slope cuts. The HandySCAN and GeoSLAM laser scanners facilitate real-time, three-dimensional mapping of rock surfaces, enabling comprehensive assessments of rock mass characteristics. The collected data provide valuable insights into structural complexities, surface roughness, and discontinuity patterns, which are essential for geological and geotechnical analyses. The synergy of these advanced remote sensing technologies contributes to a more precise and straightforward understanding of rock mass behavior. In this case, the main parameters of RQD, joint spacing, persistence, aperture, roughness, infill, weathering, water condition, and joint orientation in a slope cut along the Sea-to-Sky Highway, BC, were remotely analyzed to calculate and evaluate the Rock Mass Rating (RMR) and Geological Strength Index (GSI) classification systems. Automatic and manual analyses of the acquired data were then compared with field measurements. The results show the usefulness of the proposed remote sensing methods and their good conformity with the actual field data.
Keywords: remote sensing, rock mechanics, rock engineering, slope stability, discontinuity properties
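Once the discontinuity parameters are extracted from the scans, the RMR value itself is a simple sum of parameter ratings plus an orientation adjustment. The sketch below uses hypothetical ratings, not values from the Sea-to-Sky slope cut; the GSI relation shown is the common empirical correlation GSI = RMR89 - 5, valid for RMR > 23 (Hoek, 1995).

```python
# Hypothetical RMR89 parameter ratings for one slope cut; in practice these
# come from the scanner-derived measurements described above.
ratings = {
    "intact_rock_strength": 12,   # UCS rating, 0-15
    "rqd": 17,                    # 3-20
    "joint_spacing": 10,          # 5-20
    "joint_condition": 20,        # roughness/infill/weathering etc., 0-30
    "groundwater": 10,            # 0-15
}
orientation_adjustment = -5       # unfavourable joint orientation for a slope

rmr = sum(ratings.values()) + orientation_adjustment
print(rmr)                        # 64

# Empirical link when GSI is not assessed directly from the GSI chart:
gsi = rmr - 5                     # applicable for RMR > 23
print(gsi)                        # 59
```

Automating this sum over many scan-derived joint sets is what makes the remote workflow faster than manual scanline mapping.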
Procedia PDF Downloads 67
32022 Analysis of Big Data
Authors: Sandeep Sharma, Sarabjit Singh
Abstract:
Given user demand and the growth trends of large volumes of free data, storage solutions now face greater challenges in protecting, storing, and retrieving data. The day is not far off when storage companies and organizations will start saying 'no' to storing our valuable data, or will start charging huge amounts for its storage and protection. On the other hand, environmental conditions make it challenging to maintain and establish new data warehouses and data centers in the face of global warming threats. The challenge of small data is over; the challenge now is how to manage the exponential growth of data. In this paper, we analyze the growth trend of big data and its future implications. We also focus on the impact of unstructured data on various concerns and suggest some possible remedies to streamline big data.
Keywords: big data, unstructured data, volume, variety, velocity
Procedia PDF Downloads 548
32021 Results of the Field-and-Scientific Study in the Water Area of the Estuaries of the Major Rivers of the Black Sea and Sea Ports on the Territory of Georgia
Authors: Ana Gavardashvili
Abstract:
Field-and-scientific studies to evaluate the modern ecological state of the water areas at the estuaries of the major water-abundant rivers of the Black Sea coastal line (Chorokhi, Kintrishi, Natanebi, Supsa, Khobistskali, Rioni, and Enguri), at the sea ports (Batumi, Poti), and at the sea terminals of the oil pipelines (Baku-Tbilisi-Supsa, Kulevi) were carried out in June and July of 2015. GPS coordinates and GIS programs were used to fix the areas of the estuaries of the above-listed rivers on a digital map, with their values varying between 0.861 and 20.390 km². Water samples from the Black Sea were taken from the river estuaries and sea ports during the field work, giving a statistical series of 125 points. The temperatures of the air (t2) and of the Black Sea water (t1) were measured locally, and their ratio is (t1/t2) = 0.69-0.92. The 125 water samples taken along the study coastal line were subjected to laboratory analysis, and it was established that the acidity (pH) of the Black Sea varies between 7.71 and 8.22 in the river estuaries and between 8.42 and 8.65 in the port water areas and at the oil terminals. The sea water salinity index (TDS) varies between 6.15 and 12.67 in the river estuaries and between 11.80 and 13.67 in the port water areas and at the oil terminals. At the next stage, taking the obtained data and climatic changes into account and using the theories of reliability and risk, the nature of the changes in the ecological parameters of the Black Sea will be established.
Keywords: acidity, estuary, salinity, sea
Procedia PDF Downloads 288
32020 Field Trips inside Digital Game Environments
Authors: Amani Alsaqqaf, Frederick W. B. Li
Abstract:
Field trips are essential methods of learning in different subjects, yet in recent times there has been a reduction in the number of field trips (FTs) across all learning levels around the world. Virtual field trips (VFTs) in game environments provide the FT experience based on experiential learning theory (ELT). A conceptual framework for designing virtual field trip games (VFTGs) is developed with the aim of supporting game designers and educators in producing an effective FT experience in which technology enhances education. The conceptual framework quantifies ELT as an internal economy to link learning elements to game mechanics, such as feedback loops, which facilitates the design and implementation of VFTGs. This study assesses the conceptual framework by investigating the possibility of applying immersive VFTGs in a secondary classroom and comparing them with traditional learning that uses video clips and PowerPoint slides, from the viewpoint of students' perceived motivation, presence, and learning. The assessment evaluates the learning performance and learner experience of a prototype VFT game, Island of Volcanoes. A quasi-experiment was conducted with 60 secondary school students. The findings are that the VFTG enhanced learning performance beyond that of the traditional way of learning and, in addition, provided motivation and a general feeling of presence in the VFTG environment.
Keywords: conceptual framework, game-based learning, game design, virtual field trip game
Procedia PDF Downloads 236
32019 A Study of Cavity Quantum States Induced by Cavity-Matter Coupling Using Negativity in the Wigner Distribution
Authors: Anneswa Paul, Upendra Harbola
Abstract:
Interaction between light and matter is the primary tool to probe matter at the microscopic level. In recent years, light-matter interaction in optical cavities has found interesting applications in manipulating chemical reactions and material properties by modifying matter states in the cavity. However, not much attention has been given to modifications of the cavity-field states, which are the focus of this work. The classical-to-non-classical transition in the field state due to interaction with the matter inside the cavity is discussed. The effect of the initial state of the matter on the cavity states, as well as the role of photon fluctuations, is explored by considering different initial states of the matter and the field. The results demonstrate that the initial states of the field and the matter play a significant role in generating non-classicality in the cavity-field state, as quantified in terms of negativity in the (Wigner) phase-space distribution of the cavity. It is found that the coherences induced between different photon-number states due to the interaction always contribute to enhancing the non-classicality, while populations may suppress or enhance it depending on the relative weight of the vacuum state over the other states: an increased weight of the vacuum state diminishes the non-classicality. It is shown that energy exchange takes place between different photon-number states in the cavity field, while the matter acts as the facilitating agent.
Keywords: cavity QED, light-matter interaction, phase space methods, quantum optics
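The negativity measure used here can be made concrete for photon-number states, whose Wigner functions have a standard closed form, W_n(x, p) = ((-1)^n / pi) * exp(-(x^2 + p^2)) * L_n(2(x^2 + p^2)) in hbar = 1 convention. The sketch below (an illustration of the measure, not the paper's cavity dynamics) shows that the vacuum is everywhere positive while the one-photon state is negative at the origin, and numerically evaluates the negativity volume, the integral of |W| minus 1.

```python
import numpy as np
from scipy.special import eval_laguerre

def wigner_fock(n, x, p):
    """Wigner function of the photon-number (Fock) state |n>, hbar = 1."""
    r2 = x**2 + p**2
    return (-1)**n / np.pi * np.exp(-r2) * eval_laguerre(n, 2 * r2)

# Vacuum (n = 0) is a positive Gaussian; |1> is negative at the origin.
print(wigner_fock(0, 0.0, 0.0))   # 1/pi, positive
print(wigner_fock(1, 0.0, 0.0))   # -1/pi, negative

# Negativity volume of |1>: integral of |W| over phase space, minus 1.
x = p = np.linspace(-6, 6, 401)
X, P = np.meshgrid(x, p)
dx = x[1] - x[0]
W = wigner_fock(1, X, P)
negativity = np.abs(W).sum() * dx * dx - 1.0
print(round(negativity, 3))       # analytic value is 4/sqrt(e) - 1, about 0.426
```

A coherent or thermal cavity state would give zero negativity on this measure; a growing vacuum-state weight pulls the distribution toward the positive Gaussian and diminishes it, consistent with the result quoted above.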
Procedia PDF Downloads 4
32018 Quantification of E-Waste: A Case Study in Federal University of Espírito Santo, Brazil
Authors: Andressa S. T. Gomes, Luiza A. Souza, Luciana H. Yamane, Renato R. Siman
Abstract:
The segregation of waste electrical and electronic equipment (WEEE) at the generating source, its quali-quantitative characterization, and the identification of its origin, besides being integral parts of classification reports, are crucial steps for the success of its integrated management. The aim of this paper was to quantify the WEEE generated at the Federal University of Espírito Santo (UFES), Brazil, as well as to identify sources, temporary storage sites, main transportation routes and destinations, the most generated types of WEEE, and their recycling potential. Quantification of the WEEE generated at the University between 2010 and 2015 was performed using data provided by UFES's sector of assets management, and information on EEE and WEEE flows on the campuses was obtained through questionnaires applied to University workers. A total of 6,028 units of WEEE from data processing equipment were recorded as disposed of by the University between 2010 and 2015. Among this waste, the most generated items were CRT screens, desktops, keyboards, and printers. Furthermore, it was observed that these WEEE are temporarily stored in inappropriate places on the University campuses. In general, the WEEE units are donated to NGOs of the city or sold through auctions (2010 and 2013). As for recycling potential, the primary processing and further sale of printed circuit boards (PCBs) from the computers could raise as much as U$ 27,839.23. The results highlight the importance of a WEEE management policy at the University.
Keywords: solid waste, waste of electrical and electronic equipment, waste management, institutional solid waste generation
Procedia PDF Downloads 260
32017 Comparison between Simulation and Experimentally Observed Interactions between Two Different Sized Magnetic Beads in a Fluidic System
Authors: Olayinka Oduwole, Steve Sheard
Abstract:
The magnetic separation of biological cells using superparamagnetic beads has been used widely for various bioassays. These bioassays can further be integrated with other laboratory components to form a biosensor, which can be used for cell sorting, mixing, purification, transport, manipulation, etc. These biosensing applications have also been facilitated by the wide availability of magnetic beads, which range in size and magnetic properties across different manufacturers. To improve the efficiency and separation capabilities of these biosensors, it is important to determine the magnetically induced velocities and the interaction of beads within the magnetic field; this will help biosensor users choose the desired magnetic bead for their specific application. This study presents, for the first time, the interaction between a pair of different-sized superparamagnetic beads suspended in a static fluid moving within a uniform magnetic field, simulated using a modified finite-time finite-difference scheme. A captured video was used to record the trajectory pattern, and good agreement was obtained between the simulated trajectories and the video data. The model is therefore a good approximation for predicting the velocities as well as the interaction between magnetic particles that differ in size and magnetic properties, for biosensing applications requiring a low concentration of magnetic beads.
Keywords: biosensor, magnetic field, magnetic separation, super-paramagnetic bead
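The basic physics behind such trajectory simulations can be sketched with a much simpler model than the authors' modified finite-time finite-difference scheme: a single bead in a viscous fluid at low Reynolds number, where inertia is negligible and the bead velocity is simply the magnetic force divided by the Stokes drag coefficient. The force law, bead size, and time step below are all illustrative assumptions.

```python
import numpy as np

# Assumed parameters (illustrative, not from the paper).
mu = 1.0e-3          # fluid viscosity, Pa*s (water)
radius = 1.4e-6      # bead radius, m
gamma = 6 * np.pi * mu * radius       # Stokes drag coefficient

def magnetic_force(x, F0=2e-12, L=1e-4):
    """Assumed 1-D force law: magnitude F0 toward a magnet at the
    origin, decaying exponentially over a length scale L."""
    return -F0 * np.exp(-abs(x) / L) * np.sign(x)

# Overdamped motion: v = F / gamma, integrated with forward Euler.
dt, steps = 1e-3, 2000
x = 2e-4                              # start 200 um from the magnet
for _ in range(steps):
    x += dt * magnetic_force(x) / gamma
print(x)   # bead has drifted toward the magnet (x has decreased)
```

A pair of beads would add the dipole-dipole interaction force between them to each bead's force balance, which is where chaining and the size-dependent interaction effects studied above come from.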
Procedia PDF Downloads 473
32016 Selection of Pichia kudriavzevii Strain for the Production of Single-Cell Protein from Cassava Processing Waste
Authors: Phakamas Rachamontree, Theerawut Phusantisampan, Natthakorn Woravutthikul, Peerapong Pornwongthong, Malinee Sriariyanun
Abstract:
A total of 115 yeast strains isolated from local cassava processing wastes were measured for crude protein content. Among these strains, strain MSY-2 possessed the highest protein concentration (>3.5 mg protein/mL). Using molecular identification tools, it was identified as a strain of Pichia kudriavzevii based on the similarity of the D1/D2 domain of its 26S rDNA region. In this study, Response Surface Methodology (RSM) was applied to optimize protein production by the MSY-2 strain. The tested parameters were carbon content, nitrogen content, and incubation time. The coefficient of determination of the regression model (R²) was 0.7194, which is high enough to support the significance of the model. Under the optimal conditions, the protein content reached 3.77 g per L of culture, and the MSY-2 strain contained 66.8 g protein per 100 g of cell dry weight. These results reveal the plausibility of applying this novel yeast strain to single-cell protein production. Keywords: single cell protein, response surface methodology, yeast, cassava processing waste
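The response-surface step can be illustrated in miniature: fit a second-order model of yield against one factor and locate its stationary point. A real RSM design covers all three factors (carbon, nitrogen, incubation time) with a full quadratic including interactions; the single factor and all data values below are invented:

```python
import numpy as np

# Illustrative RSM sketch on synthetic data: a quadratic response of protein
# yield vs. incubation time, with the optimum at the vertex of the parabola.
# All numbers are hypothetical, not the paper's measurements.
time_h = np.array([24, 48, 72, 96, 120], dtype=float)
protein = np.array([1.9, 3.1, 3.8, 3.5, 2.6])     # hypothetical g/L

# Second-order response surface in one factor: y = a*t^2 + b*t + c
a, b, c = np.polyfit(time_h, protein, deg=2)
t_opt = -b / (2 * a)     # stationary point (maximum, since a < 0)
print(round(t_opt, 1))
```

The same idea extends to three factors by fitting a full quadratic (linear, squared, and cross terms) and solving for the stationary point of the fitted surface.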
Procedia PDF Downloads 406
32015 Ensuring Safe Operation by Providing an End-To-End Field Monitoring and Incident Management Approach for Autonomous Vehicles Based on an ML/DL SW Stack
Authors: Lucas Bublitz, Michael Herdrich
Abstract:
By achieving the first commercialization approval in San Francisco, the Autonomous Driving (AD) industry has proven the technological maturity of SAE L4 AD systems and the corresponding software and hardware stack. This milestone marks the upcoming phase in the industry, where the focus is now on scaling and supervising larger autonomous vehicle (AV) fleets in different operation areas. This requires an operation framework that organizes and assigns responsibilities to the relevant AV technology and operation stakeholders: the AV system provider, the remote intervention operator, the MaaS provider, and the regulatory and approval authorities. This holistic operation framework consists of technological, processual, and organizational activities to ensure the safe operation of fully automated vehicles. In supervising large autonomous vehicle fleets, a major focus is continuous field monitoring. The field monitoring approach must reflect the safety and security criticality of incidents in the field during driving operation. This includes an automatic containment approach, with the overall goal of avoiding safety-critical incidents and reducing downtime caused by malfunctions of the AD software stack. An end-to-end (E2E) field monitoring approach detects critical faults in the field, uses a knowledge-based approach to evaluate their safety criticality, and supports the automatic containment of these E/E faults. Applying such an approach will ensure the scalability of AV fleets, which is determined by the handling of incidents in the field and the continuous regulatory compliance of the technology after enhancing the Operational Design Domain (ODD) or extending the function scope through Functions on Demand (FoD) over the entire digital product lifecycle. Keywords: field monitoring, incident management, multicompliance management for AI in AD, root cause analysis, database approach
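The knowledge-based criticality evaluation mentioned above can be caricatured as a rule lookup. The abstract gives no concrete rules, so every fault class, severity level, and containment action below is invented purely for illustration:

```python
# Hypothetical sketch of a knowledge-based criticality evaluation. The fault
# classes, severities, and containment actions are NOT from the paper; they
# are placeholders showing the lookup-plus-conservative-default pattern.
FAULT_RULES = {
    # fault class              -> (severity, containment action)
    "perception_degraded":      ("critical", "minimal_risk_maneuver"),
    "planner_watchdog_timeout": ("critical", "minimal_risk_maneuver"),
    "logging_buffer_overflow":  ("minor",    "flag_for_maintenance"),
    "map_version_mismatch":     ("major",    "restrict_odd"),
}

def evaluate_fault(fault_class: str) -> tuple:
    """Look up severity and containment action; unknown faults are treated
    conservatively as critical, triggering containment."""
    return FAULT_RULES.get(fault_class, ("critical", "minimal_risk_maneuver"))

severity, action = evaluate_fault("map_version_mismatch")
print(severity, action)
```

A production system would derive such rules from hazard analyses and keep them under the same configuration control as the AD stack itself, so that the monitoring remains auditable for regulatory compliance.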
Procedia PDF Downloads 77
32014 Machine Learning-Based Workflow for the Analysis of Project Portfolio
Authors: Jean Marie Tshimula, Atsushi Togashi
Abstract:
We develop a data-science approach that provides interactive visualizations and predictive models to find insights in the projects' historical data, so that stakeholders can understand unseen opportunities in the African market that might otherwise escape them behind the online project portfolio of the African Development Bank. This machine learning-based web application identifies the market trends of the fastest growing economies across the continent, as well as skyrocketing sectors that have a significant impact on the future of business in Africa. Consequently, the approach is tailored to predict where investment is most needed. Moreover, we create a corpus that includes the descriptions of more than 1,200 projects covering approximately 14 sectors across 53 African countries. We then sift through this large amount of semi-structured data to extract the small details likely to contain directions worth following. In light of the foregoing, we apply a combination of Latent Dirichlet Allocation and Random Forests in the analysis module of our methodology to highlight the most relevant topics that investors may focus on when investing in Africa. Keywords: machine learning, topic modeling, natural language processing, big data
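The LDA-plus-Random-Forest combination can be sketched as a two-stage pipeline. The toy project descriptions, sector labels, and topic count below are invented (the real 1,200+ project corpus is not reproduced here), and scikit-learn is assumed as the implementation library:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.ensemble import RandomForestClassifier

# Illustrative sketch of the LDA + Random Forest combination on toy data.
docs = [
    "rural road transport corridor rehabilitation",
    "solar power grid electricity access expansion",
    "irrigation water agriculture smallholder support",
    "highway bridge transport infrastructure upgrade",
    "wind energy power generation plant",
    "crop yield agriculture food security program",
]
labels = ["transport", "energy", "agriculture"] * 2  # hypothetical sectors

vectorizer = CountVectorizer()
counts = vectorizer.fit_transform(docs)

# Stage 1: LDA reduces each description to a mixture over latent topics.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
topic_features = lda.fit_transform(counts)

# Stage 2: a Random Forest classifies projects from their topic mixtures.
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(topic_features, labels)
print(topic_features.shape)  # (6, 3): one topic mixture per project
```

On a real corpus the topic count would be tuned (e.g. against the ~14 known sectors) and the topic mixtures inspected directly to surface the trends the abstract describes.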
Procedia PDF Downloads 168
32013 Hardware Implementation for the Contact Force Reconstruction in Tactile Sensor Arrays
Authors: María-Luisa Pinto-Salamanca, Wilson-Javier Pérez-Holguín
Abstract:
Reconstruction of contact forces is a fundamental technique for analyzing the properties of a touched object and is essential for regulating grip force in slip control loops. It is based on processing the distribution, intensity, and direction of the forces captured by the sensors. Currently, efficient hardware alternatives are being used more frequently in different fields of application, allowing the implementation of computationally complex algorithms, as is the case with tactile signal processing. The use of hardware for smart tactile sensing systems is a research area that promises to improve the processing time and portability of applications such as artificial skin and robotics, among others. The literature review shows that hardware implementations are present today in almost all stages of smart tactile detection systems except the force reconstruction process, the stage to which they have been least applied. This work presents a hardware implementation of a model-driven approach reported in the literature for contact force reconstruction in flat, rigid tactile sensor arrays from normal stress data. Based on the analysis of a software implementation of the model, this work proposes the parallelization of tasks that facilitate the execution of matrix operations, together with a two-dimensional optimization function to obtain a force vector for each taxel in the array. It seeks to take advantage of the parallel hardware characteristics of Field Programmable Gate Arrays (FPGAs) and the possibility of applying appropriate algorithm parallelization techniques, guided by the rules of generalization, efficiency, and scalability in the tactile decoding process and considering low latency, low power consumption, and real-time execution as the main design parameters.
The results show a maximum estimation error of 32% in the tangential forces and 22% in the normal forces with respect to Finite Element Modeling (FEM) simulations of Hertzian and non-Hertzian contact events, over sensor arrays of 10×10 taxels of different sizes. The hardware implementation was carried out on a Xilinx® MPSoC XCZU9EG-2FFVB1156 platform, which allows the reconstruction of force vectors in a scalable fashion from information captured by tactile sensor arrays composed of up to 48×48 taxels using various transduction technologies. The proposed implementation demonstrates a reduction in estimation time by a factor of about 180 compared with software implementations. Despite the relatively high estimation errors, the information this implementation provides on tangential and normal tractions, and the triaxial reconstruction of forces, makes it possible to adequately reconstruct the tactile properties of the touched object, which are similar to those obtained in the software implementation and in the two reference FEM simulations. Although the errors could be reduced, the proposed implementation is useful for decoding contact forces in portable tactile sensing systems, thus helping to expand electronic-skin applications in robotic and biomedical contexts. Keywords: contact forces reconstruction, forces estimation, tactile sensor array, hardware implementation
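The per-taxel estimation problem can be sketched, in a drastically simplified form, as a small linear inversion: given a model mapping a force vector to observed stress features, recover the force by least squares. The 3×3 matrix below is hypothetical and does not reproduce the paper's model-driven formulation:

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): per-taxel force estimation
# posed as a least-squares problem. A is a hypothetical linear map from a
# force vector (Fx, Fy, Fz) to the stress features observed at one taxel.
rng = np.random.default_rng(0)
A = np.array([[1.0, 0.1, 0.3],
              [0.1, 1.0, 0.3],
              [0.0, 0.0, 1.0]])

true_force = np.array([0.2, -0.1, 1.5])            # tangential x, y; normal z (N)
stress = A @ true_force + rng.normal(0, 1e-3, 3)   # simulated noisy readings

# Least-squares inversion recovers the triaxial force for this taxel; on an
# FPGA the same matrix operations run for all taxels in parallel.
est_force, *_ = np.linalg.lstsq(A, stress, rcond=None)
print(np.round(est_force, 2))
```

The parallelism the paper exploits comes precisely from the fact that each taxel's small solve is independent of the others.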
Procedia PDF Downloads 196
32012 A Neural Network Modelling Approach for Predicting Permeability from Well Logs Data
Authors: Chico Horacio Jose Sambo
Abstract:
Recently, neural networks have gained popularity for solving complex nonlinear problems. Permeability is a fundamental reservoir characteristic that is anisotropically distributed and behaves in a non-linear manner. For this reason, permeability prediction from well log data is well suited to neural networks and other computer-based techniques. The main goal of this paper is to predict reservoir permeability from well log data using a neural network approach. A multi-layer perceptron trained by the back-propagation algorithm was used to build the predictive model. The performance of the model was measured by the correlation coefficient, evaluated on the training, testing, and validation data sets as well as on all data combined. The results show that the neural network was capable of reproducing permeability accurately in all cases: the calculated correlation coefficients for training, testing, and validation were 0.96273, 0.89991, and 0.87858, respectively. Generalization of the results to other fields can be attempted after examining new data, and a regional study might make it possible to characterize reservoir properties with cheap and very quickly constructed models. Keywords: neural network, permeability, multilayer perceptron, well log
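The workflow can be sketched on synthetic data: real well logs (gamma ray, porosity, resistivity, etc.) and core permeability are not available here, so an invented nonlinear relationship stands in for them, and scikit-learn's MLP is assumed as the back-propagation-trained perceptron:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Illustrative sketch: a multi-layer perceptron regressing permeability from
# three synthetic "log" curves related to the target by an invented
# nonlinear function (a stand-in for real well-log physics).
rng = np.random.default_rng(0)
logs = rng.uniform(0, 1, size=(400, 3))
perm = np.exp(2 * logs[:, 0] - logs[:, 1]) + 0.5 * logs[:, 2]

X_train, X_test, y_train, y_test = train_test_split(
    logs, perm, test_size=0.25, random_state=0)

# MLP trained with a back-propagation-based optimizer (adam by default).
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000,
                     random_state=0).fit(X_train, y_train)

# Performance measured, as in the paper, by the correlation coefficient
# between predicted and true permeability on held-out data.
r = np.corrcoef(model.predict(X_test), y_test)[0, 1]
print(round(r, 3))
```

As in the abstract, separate correlation coefficients would be reported for the training, testing, and validation splits to check for overfitting.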
Procedia PDF Downloads 405
32011 Neural Correlates of Diminished Humor Comprehension in Schizophrenia: A Functional Magnetic Resonance Imaging Study
Authors: Przemysław Adamczyk, Mirosław Wyczesany, Aleksandra Domagalik, Artur Daren, Kamil Cepuch, Piotr Błądziński, Tadeusz Marek, Andrzej Cechnicki
Abstract:
The present study aimed to evaluate the neural correlates of the humor comprehension impairments observed in schizophrenia. To investigate the nature of this deficit and to localize the cortical areas involved in humor processing, we used functional magnetic resonance imaging (fMRI). The study included chronic schizophrenia outpatients (SCH; n=20) and sex-, age-, and education-matched healthy controls (n=20). The task consisted of 60 stories (setups), of which 20 had funny, 20 nonsensical, and 20 neutral (not funny) punchlines. After the punchlines were presented, the participants were asked to indicate whether the story was comprehensible (yes/no) and how funny it was (1-9 Likert-type scale). fMRI was performed on a 3T scanner (Magnetom Skyra, Siemens) using a 32-channel head coil. Three contrasts, corresponding to the three stages of humor processing, were analyzed in both groups: nonsensical vs. neutral stories (incongruity detection); funny vs. nonsensical (incongruity resolution); and funny vs. neutral (elaboration). Additionally, a parametric modulation analysis was performed using the two subjective ratings separately in order to further differentiate the areas involved in incongruity resolution. Statistical analysis of the behavioral data used the Mann-Whitney U test with Bonferroni correction; fMRI data analysis utilized whole-brain voxel-wise t-tests with a 10-voxel extent threshold, either with Family-Wise Error (FWE) correction at alpha = 0.05 or uncorrected at alpha = 0.001. Between-group comparisons revealed that the SCH subjects had attenuated activation in: the right superior temporal gyrus during irresolvable incongruity processing of nonsensical puns (nonsensical > neutral); the left medial frontal gyrus during incongruity resolution of funny puns (funny > nonsensical); and the interhemispheric ACC during elaboration of funny puns (funny > neutral).
Additionally, the SCH group showed weaker activation during funniness ratings in the left ventro-medial prefrontal cortex, the medial frontal gyrus, the angular and supramarginal gyri, and the right temporal pole. During comprehension ratings, the SCH group showed suppressed activity in the left superior and medial frontal gyri. Interestingly, these differences were accompanied by longer response times for both types of rating in the SCH group, a lower level of comprehension of funny punchlines, and higher funniness ratings for absurd punchlines. The presented results indicate that, in comparison with healthy controls, schizophrenia is characterized by difficulties in humor processing, revealed by longer reaction times, impaired understanding of jokes, and finding nonsensical punchlines funnier. This is accompanied by attenuated brain activations, especially in the left fronto-parietal and right temporal cortices. Humor processing appears to be impaired at all three stages of comprehension, from incongruity detection through its resolution to elaboration, with diminished neural activity in the schizophrenia group compared with controls. The study was supported by the National Science Centre, Poland (grant no 2014/13/B/HS6/03091). Keywords: communication skills, functional magnetic resonance imaging, humor, schizophrenia
Procedia PDF Downloads 214
32010 Evaluation of Bucket Utility Truck In-Use Driving Performance and Electrified Power Take-Off Operation
Authors: Robert Prohaska, Arnaud Konan, Kenneth Kelly, Adam Ragatz, Adam Duran
Abstract:
In an effort to evaluate electrified power take-off (PTO) operation on bucket utility trucks under real-world conditions, data from 20 medium- and heavy-duty vehicles operating in California, USA were collected, compiled, and analyzed by the National Renewable Energy Laboratory's (NREL) Fleet Test and Evaluation team. In this paper, duty-cycle statistical analyses of class 5 medium-duty quick-response trucks and class 8 heavy-duty material handler trucks are performed to examine and characterize vehicle dynamics trends and relationships based on the collected in-use field data. With more than 100,000 kilometers of driving data collected over 880+ operating days, the researchers have developed a robust methodology for identifying PTO operation from in-field vehicle data. They apply this methodology to evaluate the performance and utilization of the conventional and electric PTO systems. The researchers also created custom representative drive cycles for each vehicle configuration and performed modeling and simulation to evaluate the potential fuel and emissions savings from hybridization of the tractive driveline on these vehicles. The results of these analyses statistically and objectively define the vehicle dynamic and kinematic requirements of each vehicle configuration and show the potential for further system optimization through driveline hybridization. Results are presented in both graphical and tabular formats, illustrating a number of key relationships among parameters observed within the data set that relate specifically to medium- and heavy-duty utility vehicles operating under real-world conditions. Keywords: drive cycle, heavy-duty (HD), hybrid, medium-duty (MD), PTO, utility
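One simple way to flag PTO operation in field data can be sketched as follows. The abstract does not describe NREL's actual methodology, so this heuristic (engine running while the vehicle is stationary for a sustained interval) and all thresholds are assumptions for illustration only:

```python
import numpy as np

# Hypothetical PTO-detection heuristic: flag sustained periods where the
# engine is running but the vehicle is stationary. Thresholds are invented.
def flag_pto(vehicle_speed_kph, engine_rpm, min_samples=30,
             speed_thresh=1.0, rpm_thresh=500.0):
    """Return a boolean mask marking sustained stationary, engine-on periods."""
    candidate = (np.asarray(vehicle_speed_kph) < speed_thresh) & \
                (np.asarray(engine_rpm) > rpm_thresh)
    mask = np.zeros_like(candidate)          # bool array, all False
    run_start = None
    for i, c in enumerate(candidate):
        if c and run_start is None:
            run_start = i
        elif not c and run_start is not None:
            if i - run_start >= min_samples:  # keep only sustained runs
                mask[run_start:i] = True
            run_start = None
    if run_start is not None and len(candidate) - run_start >= min_samples:
        mask[run_start:] = True
    return mask

# 60 s stationary with the engine on, then 60 s of driving (1 Hz samples).
speed = np.concatenate([np.zeros(60), np.full(60, 40.0)])
rpm = np.full(120, 900.0)
mask = flag_pto(speed, rpm)
print(mask.sum())  # 60 samples flagged as PTO operation
```

A real pipeline would additionally cross-check against PTO-engagement signals from the vehicle CAN bus where available, rather than relying on speed and rpm alone.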
Procedia PDF Downloads 399
32009 Agents and Causers in the Experiencer-Verb Lexicon
Authors: Margaret Ryan, Linda Cupples, Lyndsey Nickels, Paul Sowman
Abstract:
The current investigation explored the thematic roles of the nouns specified in the lexical entries of experiencer verbs. While prior experimental research assumes experiencer and theme roles for both subject-experiencer (SE) and object-experiencer (OE) verbs, syntactic theorists have posited additional agent and causer roles. Experiment 1 provided evidence for an agent as participants assigned a high degree of intentionality to the logical subject of a subset of SE and OE actives and passives. Experiment 2 provided evidence for a causer as participants assigned high levels of causality to the logical subjects of experiencer sentences generally. However, the presence of an agent, but not a causer, coincided with processing ease. Causality may be an aspect rather than a thematic role. The varying thematic roles amongst experiencer-verb sentences have important implications for stimulus selection because we cannot presume processing is similar across differing sentence subtypes.Keywords: sentence comprehension, lexicon, canonicity, processing, thematic roles, syntax
Procedia PDF Downloads 124
32008 Classification of Myoelectric Signals Using Multilayer Perceptron Neural Network with Back-Propagation Algorithm in a Wireless Surface Myoelectric Prosthesis of the Upper-Limb
Authors: Kevin D. Manalo, Jumelyn L. Torres, Noel B. Linsangan
Abstract:
This paper focuses on a wireless myoelectric prosthesis of the upper limb that uses a multilayer perceptron neural network with back-propagation, an algorithm widely used in pattern recognition. The network can be trained on sample signals and then perform a function on its own based on new inputs. The paper uses the neural network to classify the electromyography (EMG) signal produced by the muscles at the amputee's skin surface. The gathered data are passed to the classification stage wirelessly via Zigbee technology. The signal is classified and the network trained to drive the arm positions of the prosthesis. Through programming in Verilog on a Field Programmable Gate Array (FPGA) with Zigbee, the EMG signals are acquired and used for classification. The classified signal produces the corresponding hand movements (open, pick, hold, and grip) through the Zigbee controller. The data are then processed through the MLP neural network in MATLAB, which is then used for the surface myoelectric prosthesis. A Z-test is used to evaluate the output acquired from the neural network. Keywords: field programmable gate array, multilayer perceptron neural network, verilog, zigbee
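The classification stage can be sketched with a back-propagation-trained MLP on synthetic EMG features. The paper trains its network in MATLAB; Python with scikit-learn is used here only for illustration, and the feature values and cluster layout are invented:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Illustrative sketch: an MLP with back-propagation classifying four hand
# movements from two synthetic EMG features (e.g. mean absolute value and
# zero-crossing rate). All data are invented placeholders.
rng = np.random.default_rng(0)
MOVES = ["open", "pick", "hold", "grip"]

# Four well-separated synthetic feature clusters, one per movement.
centers = np.array([[0, 0], [3, 0], [0, 3], [3, 3]], dtype=float)
X = np.vstack([c + rng.normal(0, 0.3, (50, 2)) for c in centers])
y = np.repeat(MOVES, 50)

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                    random_state=0).fit(X, y)
print(clf.predict([[3.0, 3.1]]))
```

In the prosthesis, the predicted class would be mapped to the corresponding actuator command and sent over the Zigbee link.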
Procedia PDF Downloads 389
32007 Futuristic Black Box Design Considerations and Global Networking for Real Time Monitoring of Flight Performance Parameters
Authors: K. Parandhama Gowd
Abstract:
The aim of this research paper is to conceptualize, discuss, analyze, and propose alternative design methodologies for a futuristic black box for flight safety. The proposal also includes global networking concepts for real-time surveillance and monitoring of flight performance parameters, including GPS parameters. It is expected that this proposal will serve as a failsafe real-time diagnostic tool for accident investigation and for locating debris in real time. In this paper, an attempt is made to improve existing flight data recording techniques and the design considerations for a futuristic FDR, to overcome the trauma of not being able to locate the black box. Since modern communications and information technologies with large bandwidth are available, coupled with faster computer processing techniques, the failsafe recording technique developed in this paper is feasible. Furthermore, data fusion and data warehousing technologies are available for exploitation. Keywords: flight data recorder (FDR), black box, diagnostic tool, global networking, cockpit voice and data recorder (CVDR), air traffic control (ATC), air traffic, telemetry, tracking and control centers (ATTTCC)
Procedia PDF Downloads 573
32006 Human Intraocular Thermal Field in Action with Different Boundary Conditions Considering Aqueous Humor and Vitreous Humor Fluid Flow
Authors: Dara Singh, Keikhosrow Firouzbakhsh, Mohammad Taghi Ahmadian
Abstract:
In this study, a validated 3D finite-volume model of the human eye is developed to study fluid flow and heat transfer in the human eye under steady-state conditions. For this purpose, the discretized bio-heat transfer equation coupled with the Boussinesq equation is analyzed under different anatomical, environmental, and physiological conditions. It is demonstrated that fluid circulation forms as a result of thermal gradients in various regions of the eye. It is also shown that the posterior region of the human eye is less affected by ambient conditions than the anterior segment, which is sensitive to the ambient conditions and also to how the gravitational field is oriented relative to the geometry of the eye, making the circulations and the thermal field complicated in transient states. The effect of variations in material and boundary conditions leads to the conclusion that the thermal fields of healthy and non-healthy eyes can be distinguished via computer simulations. Keywords: bio-heat, boussinesq, conduction, convection, eye
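The abstract names the bio-heat and Boussinesq equations without stating their form; a standard steady-state formulation (assuming the Pennes bio-heat model, which is the usual choice in ocular heat-transfer studies) would read:

```latex
% Pennes bio-heat equation (steady state) coupled with incompressible flow
% under the Boussinesq approximation -- assumed standard forms, since the
% abstract does not give the exact equations used.
\begin{align}
  \rho c\,(\mathbf{u}\cdot\nabla T) &= k\,\nabla^2 T
      + \omega_b \rho_b c_b\,(T_b - T) + q_m \\
  \rho\,(\mathbf{u}\cdot\nabla)\mathbf{u} &= -\nabla p + \mu\,\nabla^2\mathbf{u}
      - \rho\,\beta\,(T - T_0)\,\mathbf{g}
\end{align}
```

Here $\omega_b$ is the blood perfusion rate, $q_m$ the metabolic heat generation, and the buoyancy term $-\rho\beta(T-T_0)\mathbf{g}$ couples the thermal field to the flow, driving the natural convection of the aqueous and vitreous humor that the study reports.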
Procedia PDF Downloads 345
32005 Adaptive Swarm Balancing Algorithms for Rare-Event Prediction in Imbalanced Healthcare Data
Authors: Jinyan Li, Simon Fong, Raymond Wong, Mohammed Sabah, Fiaidhi Jinan
Abstract:
Clinical data analysis and forecasting have made great contributions to disease control, prevention, and detection. However, such data usually suffer from highly imbalanced class distributions. In this paper, we target binary imbalanced datasets, in which the positive samples make up only a minority. We investigate two different meta-heuristic algorithms, particle swarm optimization and the bat-inspired algorithm, and combine each of them with the synthetic minority over-sampling technique (SMOTE) for processing the datasets. One approach processes the full dataset as a whole; the other splits the dataset and adaptively processes it one segment at a time. The experimental results reveal that while the performance improvements obtained by the former method do not scale to larger datasets, the latter, which we call Adaptive Swarm Balancing Algorithms, leads to significant efficiency and effectiveness improvements on large datasets and is more consistent with the practice of typical large imbalanced medical datasets. We further use the meta-heuristic algorithms to optimize two key parameters of SMOTE, leading to more credible classifier performance and shorter running times compared with the brute-force method. Keywords: imbalanced dataset, meta-heuristic algorithm, SMOTE, big data
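SMOTE's core idea, which the swarm algorithms above tune, can be sketched in a few lines: each synthetic minority sample is an interpolation between a minority point and one of its k nearest minority neighbors. This is a minimal illustration, not the imbalanced-learn library and not the paper's swarm-tuned variant:

```python
import numpy as np

# Minimal SMOTE sketch: synthesize minority samples by interpolating between
# a minority point and one of its k nearest minority neighbors. The two key
# parameters the paper optimizes with meta-heuristics correspond to choices
# like k and the amount of oversampling.
def smote(minority, n_synthetic, k=3, rng=None):
    rng = np.random.default_rng(0) if rng is None else rng
    minority = np.asarray(minority, dtype=float)
    out = []
    for _ in range(n_synthetic):
        i = rng.integers(len(minority))
        # k nearest minority neighbors of sample i (excluding itself)
        d = np.linalg.norm(minority - minority[i], axis=1)
        neighbors = np.argsort(d)[1:k + 1]
        j = rng.choice(neighbors)
        gap = rng.random()          # interpolation factor in [0, 1)
        out.append(minority[i] + gap * (minority[j] - minority[i]))
    return np.array(out)

minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
synthetic = smote(minority, n_synthetic=10)
print(synthetic.shape)  # (10, 2): new points inside the minority region
```

The adaptive segmented variant described in the abstract would apply this oversampling (with swarm-optimized parameters) per data segment rather than to the whole dataset at once.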
Procedia PDF Downloads 443