Search results for: air data system
35038 Development and Characterization of Multiphase Hydrogel Systems for Wound Healing
Authors: Rajendra Jangde, Deependra Singh
Abstract:
The present work aimed to release antimicrobial and debriding agents in a sustained manner at the wound surface. In order to provide long-lasting antimicrobial action and a moist environment in the wound space, a biocompatible moist system was developed for complete healing. In the present study, a biocompatible moist system of PVA-gelatin hydrogel was developed, capable of carrying multiple drugs, quercetin and Carbopol, in a controlled manner for effective and complete wound healing. Carbopol and quercetin were prepared by the thin film hydration technique, and the optimized system was incorporated in a PVA-gelatin slurry. PVA-gelatin hydrogels were prepared by the freeze-thaw method. The prepared dispersion was cast into films to prepare the multiphase hydrogel system, which was characterized by in vitro and in vivo studies. Results revealed uniform dispersion of the microspheres in the three-dimensional matrix of the PVA-gelatin hydrogel, observed at different magnifications. The in vitro release data showed a typical biphasic release pattern, i.e., a burst release followed by a slower sustained release for 5 days. The prepared system was found to be stable under both normal and accelerated conditions. The histopathological study showed a significant (p<0.05) increase in fibroblast cells, collagen fibres and blood vessel formation. All parameters, such as wound contraction, tensile strength, and histopathological and biochemical parameters (hydroxyproline content, protein level, etc.), were significant (p<0.05) in comparison to the control group. The present results suggest accelerated re-epithelialization under a moist wound environment with delivery of multiple drugs effective at different stages of the wound healing cascade with minimum disturbance of the wound bed.
Keywords: multiphase hydrogel, optimization, quercetin, wound healing
Procedia PDF Downloads 240

35037 Magnetic Cellulase/Halloysite Nanotubes as Biocatalytic System for Converting Agro-Waste into Value-Added Product
Authors: Devendra Sillu, Shekhar Agnihotri
Abstract:
The 'nano-biocatalyst' utilizes an ordered assembly of enzymes onto nanomaterial carriers to catalyze desirable biochemical kinetics and substrate selectivity. The current study describes an inter-disciplinary approach for converting an agricultural waste, sugarcane bagasse, into D-glucose, exploiting halloysite nanotubes (HNTs) decorated with cellulase enzyme as a nano-biocatalytic system. Cellulase was successfully immobilized on HNTs employing polydopamine as an eco-friendly crosslinker, while iron oxide nanoparticles were attached to facilitate magnetic recovery of the material. The characterization studies (UV-Vis, TEM, SEM, and XRD) displayed the characteristic features of both cellulase and magnetic HNTs in the resulting nanocomposite. Various factors (i.e., working pH, temperature, crosslinker concentration, enzyme concentration) which may influence the activity of the biocatalytic system were investigated. The experimental design was performed using Response Surface Methodology (RSM) for process optimization. The analysis data demonstrated that the nano-biocatalyst retained 80.30% activity even at elevated temperature (55°C) and excellent storage stability after 10 days. Repeated usage of the system revealed a remarkably consistent relative activity over several cycles. The immobilized cellulase was employed to decompose the agro-waste, and a maximum decomposition rate of 67.2% was achieved. Conclusively, magnetic HNTs can serve as a potential support for enzyme immobilization with long-term usage, good efficacy, reusability and easy recovery from solution.
Keywords: halloysite nanotubes, enzyme immobilization, cellulase, response surface methodology, magnetic recovery
Procedia PDF Downloads 133

35036 IntelliCane: A Cane System for Individuals with Lower-Limb Mobility and Functional Impairments
Authors: Adrian Bostan, Nicolae Tapus, Adriana Tapus
Abstract:
The purpose of this research paper is to study and develop a system that is able to help identify problems and improve human rehabilitation after traumatic injuries. Traumatic injuries to the human lower limbs can occur over a lifetime and can have serious side effects if they are not treated correctly. In this paper, we developed an intelligent cane (IntelliCane) to help individuals in their rehabilitation process and provide feedback to the users. The first stage of the paper involves an analysis of the existing systems on the market and what can be improved. The second stage presents the design of the system. The third part, which is still under development, is the validation of the system in real-world setups with people in need. This paper presents mainly stages one and two.
Keywords: IntelliCane, 3D printing, microprocessor, weight measurement, rehabilitation tool
Procedia PDF Downloads 244

35035 Application of BIM Model Data to Estimate ROI for Robots and Automation in Construction Projects
Authors: Brian Romansky
Abstract:
There are many practical, commercially available robots and semi-autonomous systems that are currently available for use in a wide variety of construction tasks. Adoption of these technologies has the potential to reduce the time and cost to deliver a project, reduce variability and risk in delivery time, increase quality, and improve safety on the job site. These benefits come with a cost for equipment rental or contract fees, access to specialists to configure the system, and time needed for set-up and support of the machines while in use. Calculation of the net ROI (Return on Investment) requires detailed information about the geometry of the site, the volume of work to be done, the overall project schedule, as well as data on the capabilities and past performance of available robotic systems. Assembling the required data and comparing the ROI for several options is complex and tedious. Many project managers will only consider the use of a robot in targeted applications where the benefits are obvious, resulting in low levels of adoption of automation in the construction industry. This work demonstrates how data already resident in many BIM (Building Information Model) projects can be used to automate ROI estimation for a sample set of commercially available construction robots. Calculations account for set-up and operating time along with scheduling support tasks required while the automated technology is in use. Configuration parameters allow for prioritization of time, cost, or safety as the primary benefit of the technology. A path toward integration and use of automatic ROI calculation with a database of available robots in a BIM platform is described.
Keywords: automation, BIM, robot, ROI
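For illustration only, the sketch below shows the kind of net-ROI comparison such a calculation performs, driven by quantities a BIM model could supply. All rates, costs, and quantities are assumptions for the example, not values from the paper.

```python
# Illustrative sketch: simplified net-ROI estimate for automating one task,
# using quantities taken off a BIM model. All names and figures are assumed.

def net_roi(work_volume, robot_rate, manual_rate, labor_cost_per_hr,
            setup_hr, rental_per_day, support_cost_per_day, hours_per_day=8.0):
    """Return (net_savings, roi) for automating a task of `work_volume` units."""
    manual_hours = work_volume / manual_rate            # e.g. m^2 per hour by a crew
    robot_hours = work_volume / robot_rate + setup_hr   # robot production plus set-up
    robot_days = robot_hours / hours_per_day
    robot_cost = robot_days * (rental_per_day + support_cost_per_day)
    manual_cost = manual_hours * labor_cost_per_hr
    savings = manual_cost - robot_cost
    return savings, savings / robot_cost

if __name__ == "__main__":
    # 2,400 m^2 of layout marking taken off a BIM floor plan (hypothetical).
    savings, roi = net_roi(work_volume=2400, robot_rate=250, manual_rate=40,
                           labor_cost_per_hr=95, setup_hr=4,
                           rental_per_day=1500, support_cost_per_day=400)
    print(f"net savings = ${savings:,.0f}, ROI = {roi:.1%}")
```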
Procedia PDF Downloads 87

35034 A Model for Adaptive Online Quiz: QCitra
Authors: Rosilah Hassan, Karam Dhafer Mayoof, Norngainy Mohd Tawil, Shamshubaridah Ramlee
Abstract:
The application and design of an adaptive online quiz system are presented in this paper. The purpose of the adaptive quiz system is to generate different questions automatically for each student and measure their competence in a definite area of a discipline. This model determines students' competencies in cases like distance learning, which frequently faces such challenges. Questions are specialized to allow clear deductions about student gains; they are able to identify student competencies more effectively. Also, the negative effects on a student's morale and self-confidence of questions requiring knowledge beyond their competency are avoided. The advantage of the system in quiz management is that it requires less total time for measurement and is more flexible. The self-sufficiency of the system in terms of repeating, planning and assessing the measurement process allows it to be used in individual education settings. The adaptive quiz technique prevents distraction and motivation loss in students, which is caused by questions with a difficulty level well below the student's competency.
Keywords: e-learning, adaptive system, security, quiz database
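As a rough illustration of the adaptive selection idea described above, the sketch below picks the unused question closest to a running ability estimate and nudges the estimate after each answer. The question bank, update rule, and simulated answers are assumptions, not the QCitra model.

```python
# Minimal adaptive question-selection loop (illustrative assumptions only).
import random

def run_adaptive_quiz(questions, n_items=5, ability=0.5, step=0.1):
    asked, remaining = [], dict(questions)       # question id -> difficulty in [0, 1]
    for _ in range(min(n_items, len(remaining))):
        qid = min(remaining, key=lambda q: abs(remaining[q] - ability))
        difficulty = remaining.pop(qid)
        correct = random.random() < 0.75         # stand-in for the student's answer
        ability = min(1.0, ability + step) if correct else max(0.0, ability - step)
        asked.append((qid, difficulty, correct, round(ability, 2)))
    return asked

bank = {f"Q{i}": d for i, d in enumerate([0.2, 0.35, 0.5, 0.6, 0.7, 0.85, 0.9])}
for row in run_adaptive_quiz(bank):
    print(row)
```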
Procedia PDF Downloads 450

35033 Evaluated Nuclear Data Based Photon Induced Nuclear Reaction Model of GEANT4
Authors: Jae Won Shin
Abstract:
We develop an evaluated nuclear data based photonuclear reaction model of GEANT4 for a more accurate simulation of photon-induced neutron production. The evaluated photonuclear data libraries from ENDF/B-VII.1 are taken as input. Incident photon energies up to 140 MeV, which is the threshold energy for pion production, are considered. To check the validity of the data-based model, we calculate the photoneutron production cross-sections and yields and compare them with experimental data. The results obtained from the developed model are found to be in good agreement with the experimental data for (γ,xn) reactions.
Keywords: ENDF/B-VII.1, GEANT4, photoneutron, photonuclear reaction
Procedia PDF Downloads 275

35032 Optimizing Communications Overhead in Heterogeneous Distributed Data Streams
Authors: Rashi Bhalla, Russel Pears, M. Asif Naeem
Abstract:
In this 'information explosion era', analyzing data, a critical commodity, and mining knowledge from vertically distributed data streams incur a huge communication cost. However, efforts to decrease communication in the distributed environment have an adverse influence on classification accuracy; therefore, a research challenge lies in maintaining a balance between transmission cost and accuracy. This paper proposes a method based on Bayesian inference to reduce the communication volume in a heterogeneous distributed environment while retaining prediction accuracy. Our experimental evaluation reveals that a significant reduction in communication can be achieved across a diverse range of dataset types.
Keywords: big data, Bayesian inference, distributed data stream mining, heterogeneous distributed data
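A minimal sketch of the general trade-off, not the authors' algorithm: a local site in a vertically partitioned stream keeps a Bayesian class summary and "transmits" it to the coordinator only when it drifts beyond a threshold from the last transmitted version, cutting messages at a small cost in freshness. Thresholds and the data are invented.

```python
# Threshold-triggered transmission of a local Bayesian summary (illustrative).
import numpy as np

class LocalSite:
    def __init__(self, n_classes, threshold=0.05):
        self.counts = np.ones(n_classes)           # Laplace-smoothed class counts
        self.last_sent = self.counts / self.counts.sum()
        self.threshold, self.messages = threshold, 0

    def observe(self, label):
        self.counts[label] += 1
        current = self.counts / self.counts.sum()
        if np.abs(current - self.last_sent).max() > self.threshold:
            self.last_sent = current               # "send" the update to the coordinator
            self.messages += 1
        return self.last_sent                      # what the coordinator currently believes

rng = np.random.default_rng(0)
site = LocalSite(n_classes=3)
for label in rng.choice(3, size=2000, p=[0.5, 0.3, 0.2]):
    belief = site.observe(label)
print("messages sent:", site.messages, "final belief:", np.round(belief, 3))
```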
Procedia PDF Downloads 161

35031 Comparison of Different Reanalysis Products for Predicting Extreme Precipitation in the Southern Coast of the Caspian Sea
Authors: Parvin Ghafarian, Mohammadreza Mohammadpur Panchah, Mehri Fallahi
Abstract:
Synoptic patterns from the surface up to the tropopause are very important for forecasting the weather and atmospheric conditions. There are many tools to prepare and analyze these maps. Reanalysis data, the outputs of numerical weather prediction models, satellite images, meteorological radar, and weather station data are used in world forecasting centers to predict the weather. Forecasting extreme precipitation on the southern coast of the Caspian Sea (CS) is a major issue due to the complex topography; there are also different types of climate in these areas. In this research, we used two reanalysis datasets, the ECMWF Reanalysis 5th Generation (ERA5) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis, for verification of the numerical model. ERA5 is the latest version of the ECMWF reanalysis. The temporal resolution of ERA5 is hourly, and that of NCEP/NCAR is every six hours. Some atmospheric parameters, such as mean sea level pressure, geopotential height, relative humidity, wind speed and direction, and sea surface temperature, were selected and analyzed. Different types of precipitation (rain and snow) were selected. The results showed that NCEP/NCAR has more ability to demonstrate the intensity of the atmospheric systems, while ERA5 is suitable for extracting parameter values at specific points. ERA5 is also appropriate for analyzing snowfall events over the CS (snow cover and snow depth). Sea surface temperature plays the main role in generating instability over the CS, especially when cold air passes over it. The sea surface temperature of the NCEP/NCAR product has low resolution near the coast. However, both datasets were able to detect the meteorological synoptic patterns that led to heavy rainfall over the CS. Due to their time lag, they are not suitable for forecast centers; the application of these two datasets is in research and in the verification of meteorological models. Finally, ERA5 has better resolution with respect to the NCEP/NCAR reanalysis data, but NCEP/NCAR data are available from 1948 and are appropriate for long-term research.
Keywords: synoptic patterns, heavy precipitation, reanalysis data, snow
Procedia PDF Downloads 123

35030 Simulating Elevated Rapid Transit System for Performance Analysis
Authors: Ran Etgar, Yuval Cohen, Erel Avineri
Abstract:
One of the major challenges of transportation in medium-sized inner cities (such as Tel Aviv) is the last-mile solution. Personal rapid transit (PRT) seems like an applicable candidate for this, as it combines the benefits of personal (car) travel with the operational benefits of transit. However, the investment required for a large-area PRT grid is significant, and there is a need to economically justify such an investment by correctly evaluating the grid capacity. The main PRT elements are small automated vehicles (sometimes referred to as podcars) operating on a network of specially built guideways. The research looks at a specific concept of an elevated PRT system. A literature review has revealed the drawbacks of existing PRT modelling and simulation approaches, mainly due to the lack of consideration of technical and operational features of the system (such as headways, acceleration, and safety issues); the detailed design of infrastructure (guideways, stations, and docks); the stochastic and seasonal characteristics of demand; and safety regulations, all of which have a strong effect on system performance. A highly detailed model of the system, developed in this research, applies discrete event simulation combined with an agent-based approach to represent the system elements and the podcars' movement logic. Applying a case study approach, the simulation model is used to study the capacity of the system, the expected throughput, the utilization, and the level of service (journey time, waiting time, etc.).
Keywords: capacity, productivity measurement, PRT, simulation, transportation
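To illustrate the flavour of such a model at toy scale, the sketch below runs a tiny event-driven simulation of podcars serving ride requests and reports mean waiting and journey times. The guideway length, speed, fleet size, and demand rate are invented and the logic is far simpler than the detailed model described above.

```python
# Toy podcar dispatch simulation (illustrative only; all parameters assumed).
import random

random.seed(1)
N_PODS, SPEED, NET_LENGTH = 4, 10.0, 3000.0          # pods, m/s, metres of guideway
pod_free_at = [0.0] * N_PODS                         # next time each pod is idle
waits, journeys, t = [], [], 0.0

for _ in range(200):                                 # 200 randomly arriving requests
    t += random.expovariate(1 / 30)                  # one request every ~30 s on average
    pod = min(range(N_PODS), key=lambda i: pod_free_at[i])
    pickup_dist = random.uniform(0, NET_LENGTH / 2)  # empty run to the passenger
    trip_dist = random.uniform(200, NET_LENGTH)      # passenger trip on the guideway
    start = max(t, pod_free_at[pod]) + pickup_dist / SPEED
    end = start + trip_dist / SPEED
    pod_free_at[pod] = end
    waits.append(start - t)
    journeys.append(end - t)

print(f"mean wait {sum(waits)/len(waits):.0f} s, "
      f"mean journey {sum(journeys)/len(journeys):.0f} s")
```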
Procedia PDF Downloads 166

35029 An Automatic Bayesian Classification System for File Format Selection
Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan
Abstract:
This paper presents an approach for the classification of unstructured format descriptions for the identification of file formats. The main contribution of this work is the employment of data mining techniques to support file format selection with just the unstructured text description that comprises the most important format features for a particular organisation. Subsequently, the file format identification method employs a file format classifier and associated configurations to support digital preservation experts with an estimation of the required file format. Our goal is to make use of a format specification knowledge base aggregated from different Web sources in order to select a file format for a particular institution. Using the naive Bayes method, the decision support system recommends to an expert the file format for their institution. The proposed methods facilitate the selection of file formats and improve the quality of the digital preservation process. The presented approach is meant to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and specifications of file formats. To facilitate decision making, the aggregated information about the file formats is presented as a file format vocabulary that comprises the most common terms characteristic of all researched formats. The goal is to suggest a particular file format based on this vocabulary for analysis by an expert. A sample file format calculation and the calculation results, including probabilities, are presented in the evaluation section.
Keywords: data mining, digital libraries, digital preservation, file format
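A minimal sketch of this kind of naive Bayes recommendation follows: a short free-text description is scored against a small vocabulary learned from labelled format descriptions. The training snippets and format labels are invented stand-ins, not the authors' knowledge base.

```python
# Naive Bayes over bag-of-words format descriptions (illustrative data only).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

docs = [
    "lossless raster image, layers, transparency, web graphics",
    "page layout, fonts embedded, fixed document, print archiving",
    "tabular values, comma separated, plain text, spreadsheet export",
]
labels = ["PNG", "PDF/A", "CSV"]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs)
model = MultinomialNB().fit(X, labels)

query = ["archival document with embedded fonts and fixed layout"]
probs = model.predict_proba(vectorizer.transform(query))[0]
for fmt, p in sorted(zip(model.classes_, probs), key=lambda x: -x[1]):
    print(f"{fmt}: {p:.2f}")    # ranked recommendation with probabilities
```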
Procedia PDF Downloads 499

35028 Design of UV Based Unicycle Robot to Disinfect Germs and Communicate With Multi-Robot System
Authors: Charles Koduru, Parth Patel, M. Hassan Tanveer
Abstract:
In this paper, communication between a team of robots used to sanitize an environment contaminated with germs is proposed. We introduce capabilities from a team of robots (most likely heterogeneous): a wheeled robot named ROSbot 2.0 that carries a mounted LiDAR and a Kinect sensor, and a modified prototype design of a unicycle-drive Roomba robot called the UV robot. The UV robot uses ultrasonic sensors to avoid obstacles and is equipped with an ultraviolet light system to disinfect and kill germs, such as bacteria and viruses. In addition, the UV robot is equipped with disinfectant spray to target hidden objects that ultraviolet light is unable to reach. Using the sensors from the ROSbot 2.0, the robot creates a 3-D model of the environment, which is used to determine how the ultraviolet robot will disinfect the environment. Together, this proposed system is known as the RME assistive robot device, or RME system, which communicates between a navigation robot and a germ-disinfecting robot operated by a user. The RME system includes a human-machine interface that allows the user to control certain features of each robot in the RME assistive robot device. This method allows the cleaning process to be done at a more rapid and efficient pace, as the UV robot disinfects areas simply by moving around in the environment while using the ultraviolet light system to kill germs. The RME system can be used in many applications, including public offices, stores, airports, hospitals, and schools. The RME system will be beneficial even after the COVID-19 pandemic. Kennesaw State University will continue the research in the field of robotics, engineering, and technology and play its role in serving humanity.
Keywords: multi-robot system, assistive robots, COVID-19 pandemic, ultraviolet technology
Procedia PDF Downloads 186

35027 Solid Waste Disposal Site Selection in Thiruvananthapuram Corporation Area by Data Analysis Using GIS and Remote Sensing Tools
Authors: C. Asha Poorna, P. G. Vinod, A. R. R. Menon
Abstract:
The currently increasing population and its activities, such as urbanization and industrialization, are generating a major environmental issue: waste. The central problem in waste management is the selection of an appropriate site for waste disposal. The selection of a suitable site has constraints such as environmental, economic and political considerations. In this paper, we discuss the strategies to be followed while selecting a site for a decentralized system of solid waste disposal, using a Geographic Information System (GIS), the Analytical Hierarchy Process (AHP) and remote sensing methods, for the Thiruvananthapuram corporation area. It is located on the west coast of India near the extreme south of the mainland and lies on the shores of the Killiyar and Karamana rivers. Being on their basin, waste management must be regulated with respect to the water bodies. The different criteria considered for waste disposal site selection, namely lithology, surface water, aquifer, groundwater, land use, contours, aspect, elevation, slope, distance to road and distance from settlement, are examined in relation to landfill site selection. Each criterion was identified and weighted by an AHP score and mapped using GIS techniques, and a suitability map was prepared by overlay analysis.
Keywords: waste disposal, solid waste management, Geographic Information System (GIS), Analytical Hierarchy Process (AHP)
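The sketch below illustrates the generic AHP weighting and weighted-overlay step that such studies rely on: derive criterion weights from a Saaty-style pairwise comparison matrix via its principal eigenvector, then combine reclassified criterion rasters into a suitability score. The criteria, judgements, and raster values are hypothetical, not those of the study.

```python
# AHP weights from a pairwise comparison matrix, then a weighted overlay.
import numpy as np

criteria = ["groundwater", "land use", "slope", "distance to road"]
# Pairwise comparison matrix (row criterion judged against column criterion).
A = np.array([[1.0, 3.0, 5.0, 7.0],
              [1/3, 1.0, 3.0, 5.0],
              [1/5, 1/3, 1.0, 3.0],
              [1/7, 1/5, 1/3, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])   # principal eigenvector
w = w / w.sum()                                         # normalised criterion weights
print(dict(zip(criteria, np.round(w, 3))))

# Reclassified criterion rasters (1 = poor ... 5 = good); tiny 3x3 example grids.
rasters = np.random.default_rng(0).integers(1, 6, size=(len(criteria), 3, 3))
suitability = np.tensordot(w, rasters, axes=1)          # weighted overlay per cell
print(np.round(suitability, 2))
```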
Procedia PDF Downloads 397

35026 Real-Time Recognition of Dynamic Hand Postures on a Neuromorphic System
Authors: Qian Liu, Steve Furber
Abstract:
To explore how the brain may recognize objects in its general, accurate and energy-efficient manner, this paper proposes the use of a neuromorphic hardware system formed from a Dynamic Vision Sensor (DVS) silicon retina in concert with the SpiNNaker real-time Spiking Neural Network (SNN) simulator. As a first step in the exploration of this platform, a recognition system for dynamic hand postures is developed, enabling the study of the methods used in the visual pathways of the brain. Inspired by the behaviour of the primary visual cortex, Convolutional Neural Networks (CNNs) are modelled using both linear perceptrons and spiking Leaky Integrate-and-Fire (LIF) neurons. In this study's largest configuration using these approaches, a network of 74,210 neurons and 15,216,512 synapses is created and operated in real time using 290 SpiNNaker processor cores in parallel, with 93.0% accuracy. A smaller network using only a tenth of the resources is also created, again operating in real time, and it is able to recognize the postures with an accuracy of around 86.4%, only 6.6% lower than the much larger system. The recognition rate of the smaller network developed on this neuromorphic system is sufficient for a successful hand posture recognition system and demonstrates a much-improved cost-to-performance trade-off in its approach.
Keywords: spiking neural network (SNN), convolutional neural network (CNN), posture recognition, neuromorphic system
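For readers unfamiliar with the spiking unit such networks are built from, the sketch below simulates a single leaky integrate-and-fire (LIF) neuron driven by a constant current. The membrane parameters are textbook-style assumptions, not the values used on SpiNNaker.

```python
# Single LIF neuron with a constant input current (illustrative parameters).
dt, T = 0.1e-3, 0.2                     # 0.1 ms time step, 200 ms of simulated time
tau_m, R = 20e-3, 10e6                  # membrane time constant (s), resistance (ohm)
v_rest, v_thresh, v_reset = -65e-3, -50e-3, -65e-3
I = 1.8e-9                              # constant input current (A)

v, spikes = v_rest, []
for step in range(int(T / dt)):
    dv = (-(v - v_rest) + R * I) / tau_m * dt   # leaky integration of the input
    v += dv
    if v >= v_thresh:                            # threshold crossing -> spike and reset
        spikes.append(step * dt)
        v = v_reset

print(f"{len(spikes)} spikes in {T*1000:.0f} ms (rate ~ {len(spikes)/T:.0f} Hz)")
```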
Procedia PDF Downloads 472

35025 Data Privacy: Stakeholders’ Conflicts in Medical Internet of Things
Authors: Benny Sand, Yotam Lurie, Shlomo Mark
Abstract:
Medical Internet of Things (MIoT), AI, and data privacy are linked forever in a Gordian knot. This paper explores the conflicts of interest between stakeholders regarding data privacy in the MIoT arena. When patients are hospitalized at home, MIoT can play a significant role in improving the health of large parts of the population by providing medical teams with tools for collecting data, monitoring patients' health parameters, and even enabling remote treatment. While the amount of data handled by MIoT devices grows exponentially, different stakeholders have conflicting understandings of and concerns regarding this data. The findings of the research indicate that medical teams are not concerned by the violation of the data privacy rights of patients in in-home healthcare, while patients are more troubled and, in many cases, are unaware that their data is being used without their consent. MIoT technology is in its early phases, and hence a mixed qualitative and quantitative research approach will be used, which will include case studies and questionnaires, in order to explore this issue and provide alternative solutions.
Keywords: MIoT, data privacy, stakeholders, home healthcare, information privacy, AI
Procedia PDF Downloads 102

35024 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations
Authors: Deepak Singh, Rail Kuliev
Abstract:
This abstract highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With the industry's complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, “reskilling and upskilling” employees, and establishing robust data management training programs play an essential and integral part in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), artificial intelligence (AI) and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. In the present study, by embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in the ever-evolving industry.
Keywords: master data management, IoT, AI&ML, cloud computing, data optimization
Procedia PDF Downloads 70

35023 Influence of Parameters of Modeling and Data Distribution for Optimal Condition on Locally Weighted Projection Regression Method
Authors: Farhad Asadi, Mohammad Javad Mollakazemi, Aref Ghafouri
Abstract:
Recent research in neural network science and neuroscience on modeling complex time series data and statistical learning has focused mostly on learning from high-dimensional input spaces and signals. Local linear models are a strong choice for modeling local nonlinearity in data series. Locally weighted projection regression is a flexible and powerful algorithm for nonlinear approximation in high-dimensional signal spaces. In this paper, different learning scenarios for one- and two-dimensional data series with different distributions are investigated in simulation; noise is then added to the data distributions to create differently disordered time series and to evaluate the algorithm's local prediction of nonlinearity. The performance of the algorithm is then simulated, and its sensitivity to the data distribution, together with the influence of the algorithm's important local-validity parameter, is explained for cases where the data distribution is broad or where the number of data points is small.
Keywords: local nonlinear estimation, LWPR algorithm, online training method, locally weighted projection regression method
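The sketch below shows the local-model idea in its simplest form: a locally weighted linear fit under a Gaussian receptive field around each query point. This is plain locally weighted regression, not the full incremental LWPR with learned projection directions, and the data and bandwidth are invented.

```python
# Locally weighted linear regression under a Gaussian receptive field (sketch).
import numpy as np

def lwr_predict(x_query, X, y, bandwidth=0.3):
    w = np.exp(-0.5 * ((X - x_query) / bandwidth) ** 2)      # receptive-field weights
    Xd = np.column_stack([np.ones_like(X), X])               # design matrix [1, x]
    W = np.diag(w)
    beta = np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)      # weighted least squares
    return beta[0] + beta[1] * x_query

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, 200))
y = np.sin(2 * X) + 0.1 * rng.standard_normal(200)           # nonlinear series plus noise
for xq in (-2.0, 0.0, 1.5):
    print(xq, round(lwr_predict(xq, X, y), 3), "target:", round(np.sin(2 * xq), 3))
```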
Procedia PDF Downloads 502

35022 Fundamental Natural Frequency of Chromite Composite Floor System
Authors: Farhad Abbas Gandomkar, Mona Danesh
Abstract:
This paper aims to determine the Fundamental Natural Frequency (FNF) of a structural composite floor system known as Chromite. To achieve this purpose, the FNFs of the studied panels are determined by developing Finite Element Models (FEMs) in the ABAQUS program. The American Institute of Steel Construction (AISC) Steel Design Guide Series 11 presents a fundamental formula to calculate the FNF of a steel-framed floor system; this formula has been used to verify the results of the FEMs. The variability of the FNF of the studied system under various parameters, such as the dimensions of the floor, boundary conditions, rigidity of the main and secondary beams around the floor, thickness of the concrete slab, height of the composite joists, distance between composite joists, thickness of the top and bottom flanges of the open-web steel joists, and adding a tie beam perpendicular to the composite joists, is determined. The results show that changes in the dimensions of the system, its boundary conditions, the rigidity of the main beam, and adding a tie beam significantly change the FNF of the system, by up to 452.9%, 50.8%, -52.2%, and 52.6%, respectively. In addition, increasing the thickness of the concrete slab increases the FNF of the system by up to 10.8%. Furthermore, the results demonstrate that variations in the rigidity of the secondary beam, the height of the composite joists, the distance between composite joists, and the thickness of the top and bottom flanges of the open-web steel joists change the FNF of the studied system only insignificantly, by up to -0.02%, -3%, -6.1%, and 0.96%, respectively. Finally, the results of this study help designers predict the occurrence of resonance, comfortableness, and design criteria of the studied system.
Keywords: fundamental natural frequency, Chromite composite floor system, finite element method, low and high frequency floors, comfortableness, resonance
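As a hedged worked example of the kind of hand check the verification relies on, the commonly cited Design Guide 11 relation is fn = 0.18*sqrt(g/Delta), where Delta is the midspan deflection of the floor members under the supported weight. The deflection value below is an assumed illustration, not one of the studied Chromite panels.

```python
# Hand estimate of a floor's fundamental natural frequency (assumed deflection).
import math

g = 9810.0     # gravitational acceleration, mm/s^2
delta = 5.0    # mm, assumed combined beam + girder + slab deflection under weight
fn = 0.18 * math.sqrt(g / delta)
print(f"estimated fundamental natural frequency ~ {fn:.1f} Hz")
```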
Procedia PDF Downloads 457

35021 Life Cycle Cost Evaluation of Structures Retrofitted with Damped Cable System
Authors: Asad Naeem, Mohamed Nour Eldin, Jinkoo Kim
Abstract:
In this study, the seismic performance and life cycle cost (LCC) of a structure retrofitted with the damped cable system (DCS) are evaluated. The DCS is a seismic retrofit system composed of a high-strength steel cable and pressurized viscous dampers. The analysis model of the system is first derived using various link elements in SAP2000, and fragility curves of the structure retrofitted with the DCS and with viscous dampers are obtained using incremental dynamic analyses. The analysis results show that the residual displacements of the structure equipped with the DCS are smaller than those of the structure retrofitted with only conventional viscous dampers, due to the enhanced stiffness/strength and self-centering capability of the damped cable system. The fragility analysis shows that the structure retrofitted with the DCS has the lowest probability of reaching the specific limit states compared to the bare structure and the structure with viscous dampers. It is also observed that the initial cost of the DCS required for the seismic retrofit is smaller than that of the viscous damper retrofit, and that the LCC of the structure equipped with the DCS is smaller than that of the structure with viscous dampers.
Keywords: damped cable system, fragility curve, life cycle cost, seismic retrofit, self-centering
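For orientation, the sketch below shows how fragility curves like those described above are commonly constructed from incremental dynamic analysis output: fit a lognormal CDF to the intensity measures at which each ground-motion record first exceeds a limit state. The intensity values are invented, not results from the paper.

```python
# Lognormal fragility fit from (invented) IDA limit-state exceedance intensities.
import numpy as np
from scipy.stats import norm

im_at_exceedance = np.array([0.42, 0.55, 0.61, 0.48, 0.70, 0.53, 0.66, 0.58])  # Sa (g)
theta = np.exp(np.mean(np.log(im_at_exceedance)))    # median capacity
beta = np.std(np.log(im_at_exceedance), ddof=1)      # lognormal dispersion

def p_exceed(im):
    """Probability of reaching the limit state at intensity measure `im`."""
    return norm.cdf(np.log(im / theta) / beta)

for im in (0.3, 0.5, 0.7, 0.9):
    print(f"Sa = {im:.1f} g -> P(exceed) = {p_exceed(im):.2f}")
```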
Procedia PDF Downloads 551

35020 Lead-Time Estimation Approach Using the Process Capability Index
Authors: Abdel-Aziz M. Mohamed
Abstract:
This research proposes a methodology to estimate the customer order lead time in the supply chain based on the process capability index. The cases when the process output is normally distributed and when it is not are considered. The relationships between the system capability indices in both service and manufacturing applications, delivery system reliability, and the percentage of orders delivered after their promised due dates are presented. The proposed method can be used to examine the current process capability to deliver orders before the promised lead time. If the system is found to be incapable, the method can be used to help revise the current lead time to a proper value according to the service reliability level selected by management. Numerical examples and a case study describing the lead time estimation methodology and testing the system's capability of delivering orders before their promised due dates are presented.
Keywords: lead-time estimation, process capability index, delivery system reliability, statistical analysis, service achievement index, service quality
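The following sketch illustrates the normally distributed case in its standard textbook form: with the promised lead time acting as an upper specification limit, Cpu = (L - mu) / (3*sigma) and the on-time probability is Phi(3*Cpu). The figures are illustrative, not taken from the paper.

```python
# On-time delivery probability from a one-sided capability index, plus a revised
# lead time for a chosen service reliability (illustrative figures).
from scipy.stats import norm

mu, sigma, promised = 12.0, 2.0, 15.0                  # order cycle time in days
cpu = (promised - mu) / (3 * sigma)                    # one-sided capability index
on_time = norm.cdf(3 * cpu)                            # P(delivery <= promised lead time)
print(f"Cpu = {cpu:.2f}, P(on-time delivery) = {on_time:.3f}")

target = 0.99                                          # desired service reliability
revised = mu + norm.ppf(target) * sigma                # lead time to quote instead
print(f"lead time for {target:.0%} reliability ~ {revised:.1f} days")
```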
Procedia PDF Downloads 556

35019 Applying the Eye Tracking Technique for the Evaluation of Oculomotor System in Patients Survived after Cerebellar Tumors
Authors: Marina Shurupova, Victor Anisimov, Alexander Latanov
Abstract:
Background: Cerebellar lesions inevitably provoke oculomotor impairments in patients of different ages. Symptoms of subtentorial tumors, particularly medulloblastomas, include static and dynamic coordination disorders (ataxia, asynergia, imbalance), muscle hypotonia, disruption of the cranial nerves, and, within the oculomotor system, nystagmus (fine or gross). Subtentorial tumors can also affect the areas of the cerebellum that control the oculomotor system. Non-invasive eye-tracking technology allows obtaining multiple oculomotor characteristics, such as the number of fixations and their duration, the amplitude, latency and velocity of saccades, and the trajectory and scan path of gaze during navigation of the visual field. Eye tracking could be very useful in clinical studies, serving as a convenient and effective diagnostic tool. Aim: We studied the dynamics of oculomotor system functioning in patients recovering from cerebellar tumor removal surgeries and following neurocognitive rehabilitation. Methods: 38 children (23 boys, 15 girls, 9-17 years old) who had recovered from cerebellar tumor removal surgery, radiation therapy and chemotherapy and were undergoing a course of neurocognitive rehabilitation participated in the study. Two tests were carried out to evaluate oculomotor performance: a gaze stability test and a counting test. Monocular eye movements were recorded with an Arrington Research eye tracker (60 Hz). Two experimental sessions with both tests were conducted, before and after the rehabilitation courses. Results: In the final session of both tests we observed remarkable improvement in oculomotor performance: 1) in the gaze stability test, the spread of gaze positions declined significantly compared to the first session, and 2) in the counting test, the visual scan path shortened significantly compared to the first session. Thus, neurocognitive rehabilitation improved the functioning of the oculomotor system in patients following cerebellar tumor removal surgery and subsequent therapy. Conclusions: The experimental data support the effectiveness of the eye tracking technique as a diagnostic tool in the field of neuro-oncology.
Keywords: eye tracking, rehabilitation, cerebellar tumors, oculomotor system
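For illustration, the two outcome measures reported above can be computed from raw gaze samples roughly as follows: a dispersion of gaze positions for gaze stability and a total scan path length for the counting test. The coordinates below are synthetic, and the exact metrics used in the study may differ.

```python
# Gaze spread and scan path length from synthetic gaze samples (illustrative).
import numpy as np

rng = np.random.default_rng(3)
gaze = rng.normal(loc=[512, 384], scale=[18, 22], size=(600, 2))   # px, 10 s at 60 Hz

spread = gaze.std(axis=0)                                           # per-axis dispersion
path_length = np.sum(np.linalg.norm(np.diff(gaze, axis=0), axis=1)) # summed step distances

print(f"gaze spread (std, px): x = {spread[0]:.1f}, y = {spread[1]:.1f}")
print(f"scan path length: {path_length:.0f} px")
```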
Procedia PDF Downloads 161

35018 A Study on Approximate Controllability of Impulsive Integrodifferential Systems with Non Local Conditions
Authors: Anandhi Santhosh
Abstract:
In order to describe various real-world problems in the physical and engineering sciences that are subject to abrupt changes at certain instants during the evolution process, impulsive differential equations have been used to describe the system model. In this article, the problem of approximate controllability for nonlinear impulsive integrodifferential equations with state-dependent delay is investigated. We study the approximate controllability of a nonlinear impulsive integrodifferential system under the assumption that the corresponding linear control system is approximately controllable. Using methods of functional analysis and semigroup theory, sufficient conditions are formulated and proved. Finally, an example is provided to illustrate the proposed theory.
Keywords: approximate controllability, impulsive differential system, fixed point theorem, state-dependent delay
Procedia PDF Downloads 383

35017 A Case Study of Al-Shifa: A Healthcare Information System in Oman
Authors: Khamis Al-Gharbi, Said M. Gattoufi, Ali H. Al-Badi, Ali Al-Hashmi
Abstract:
The case study presents the progression of the project management of Al-Shifa, a healthcare information system in Oman. It describes the evolution of the implementation of a healthcare information system tailored to meet the needs of the healthcare units under the supervision of the Ministry of Health (MOH) in Oman. A focus group methodology was used for collecting the relevant information from the main project stakeholders, and reports about the project were made available to the researchers. The case analysis is based on the project management approach developed by the Project Management Institute (PMI). The main finding is that no formal project management approach was adopted by the MOH for the development and implementation of this healthcare information system project. Furthermore, the project suffered scope creep in terms of features, cost and schedule. The authors' recommendations for rescuing the project from its current dilemma consist of technological, administrative and human resources development actions.
Keywords: project management, information system, healthcare, Al-Shifa, Oman
Procedia PDF Downloads 390

35016 Comparative Analysis of Patent Protection between Health System and Enterprises in Shanghai, China
Authors: Na Li, Yunwei Zhang, Yuhong Niu
Abstract:
The study discusses the patent protection of the health system and enterprises in Shanghai. Comparisons of the technical distribution and scope of patent protection between the Shanghai health system and enterprises were made using the methods of IPC classification, co-word analysis and visual social networks. The results reflected a decreasing order within the IPC A61 area, namely A61B, A61K, A61M, and A61F, with A61B requiring further investigation. Within the IPC A61 area, A61B17 was found to have the most authorized patents in A61B. Within A61B17, fracture fixation, ligament reconstruction, cardiac surgery, and biopsy detection were regarded as fields of common concern by the Shanghai health system and enterprises. However, compared with cardiac closure, to which Shanghai enterprises paid attention, the Shanghai health system was more inclined towards blockages and hemostatic tools. The results also revealed that the scope of patent protection of Shanghai enterprises was relatively centralized. Shanghai enterprises had a series of comprehensive strategies for protecting core patents. In contrast, the Shanghai health system was considered to lack strategic patent protection for its core patents.
Keywords: co-words analysis, IPC classification, patent protection, technical distribution
Procedia PDF Downloads 134

35015 Innate Immunity of Insects in Brief
Authors: Ehsan Soleymaninejadian
Abstract:
As the field of immunology grows day by day and its chaotic system amazes more people, the appetite for research in this area is growing; however, dealing with human or mammalian models such as mice makes the research expensive. Although there are some differences between higher animals and insects, the importance of innate immunity during evolution has left it largely conserved, so insects can be good models for understanding innate immunity. They are cheap, their reproduction is fast, and, in the case of genetics, they are less complicated. In this review, we briefly address important factors in insects' innate immunity, such as melanization, encapsulation, and the JAK-STAT, IMD, and Toll pathways. At the end, we explain how hormones and the nervous system can also affect the immune system and make it even more remarkable. In the concluding remarks, the possibility of drawing on the insect immune system to fight diseases such as cancer is considered.
Keywords: insects, innate immunity, melanization, intracellular pathways, hormones
Procedia PDF Downloads 226

35014 Semi-Supervised Learning for Spanish Speech Recognition Using Deep Neural Networks
Authors: B. R. Campomanes-Alvarez, P. Quiros, B. Fernandez
Abstract:
Automatic Speech Recognition (ASR) is a machine-based process of decoding and transcribing oral speech. A typical ASR system receives acoustic input from a speaker or an audio file, analyzes it using algorithms, and produces an output in the form of text. Some speech recognition systems use Hidden Markov Models (HMMs) to deal with the temporal variability of speech and Gaussian Mixture Models (GMMs) to determine how well each state of each HMM fits a short window of frames of coefficients that represents the acoustic input. Another way to evaluate the fit is to use a feed-forward neural network that takes several frames of coefficients as input and produces posterior probabilities over HMM states as output. Deep neural networks (DNNs) that have many hidden layers and are trained using new methods have been shown to outperform GMMs on a variety of speech recognition systems. Acoustic models for state-of-the-art ASR systems are usually trained on massive amounts of data. However, audio files with their corresponding transcriptions can be difficult to obtain, especially in the Spanish language. Hence, in these low-resource scenarios, building an ASR model is considered a complex task due to the lack of labeled data, which results in an under-trained system. Semi-supervised learning approaches arise as necessary given the high cost of transcribing audio data. The main goal of this proposal is to develop a procedure based on acoustic semi-supervised learning for Spanish ASR systems using DNNs. This semi-supervised learning approach consists of: (a) training a seed ASR model with a DNN using a set of audios and their respective transcriptions, where a DNN with one hidden layer was initialized and the number of hidden layers was increased during training to five; a refinement of the weight matrices and bias terms, together with Stochastic Gradient Descent (SGD) training, was also performed, with the cross-entropy criterion as the objective function; (b) decoding/testing a set of unlabeled data with the obtained seed model; and (c) selecting a suitable subset of the validated data to retrain the seed model, thereby improving its performance on the target test set. To choose the most precise transcriptions, three confidence scores or metrics based on the lattice concept (the graph cost, the acoustic cost, and a combination of both) were used as the selection technique. The performance of the ASR system is measured by means of the Word Error Rate (WER). The test dataset was renewed in order to exclude the new transcriptions added to the training dataset. Several experiments were carried out in order to select the best ASR results. A comparison between a GMM-based model without retraining and the proposed DNN system was also made under the same conditions. Results showed that the semi-supervised ASR model based on DNNs outperformed the GMM model, in terms of WER, in all tested cases. The best result obtained an improvement of 6% relative WER. Hence, these promising results suggest that the proposed technique could be suitable for building ASR models in low-resource environments.
Keywords: automatic speech recognition, deep neural networks, machine learning, semi-supervised learning
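A schematic sketch of the selection step (c) follows: keep only decoded utterances whose combined lattice-based confidence clears a threshold, and feed them back into the labelled pool for retraining. The record layout, weighting, threshold, and example hypotheses are placeholders, since the real system computes graph and acoustic costs inside the decoder.

```python
# Confidence-based selection of pseudo-labelled utterances (placeholder data).

def select_for_retraining(decoded, alpha=0.5, threshold=0.8):
    """decoded: list of dicts with 'utt', 'hyp', 'graph_conf', 'acoustic_conf'."""
    selected = []
    for utt in decoded:
        combined = alpha * utt["graph_conf"] + (1 - alpha) * utt["acoustic_conf"]
        if combined >= threshold:
            selected.append((utt["utt"], utt["hyp"], round(combined, 3)))
    return selected

decoded_batch = [
    {"utt": "spk1_001", "hyp": "buenos dias", "graph_conf": 0.92, "acoustic_conf": 0.88},
    {"utt": "spk1_002", "hyp": "hasta luego", "graph_conf": 0.55, "acoustic_conf": 0.61},
    {"utt": "spk2_007", "hyp": "muchas gracias", "graph_conf": 0.90, "acoustic_conf": 0.79},
]
for row in select_for_retraining(decoded_batch):
    print(row)   # these utterances would be appended to the training transcripts
```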
Procedia PDF Downloads 339

35013 Development of Soft-Core System for Heart Rate and Oxygen Saturation
Authors: Caje F. Pinto, Jivan S. Parab, Gourish M. Naik
Abstract:
This paper is about the development of non-invasive measurement of heart rate and oxygen saturation in human blood using an Altera NIOS II soft-core processor system. In today's world, monitoring oxygen saturation and heart rate is very important in hospitals to keep track of low oxygen levels in the blood. We have designed an embedded System on a Programmable Chip (SOPC) reconfigurable system by interfacing two LEDs of different wavelengths (660 nm/940 nm) with a single photo-detector to measure the absorption of hemoglobin species at the different wavelengths. The implementation of the interface with the finger probe and a Liquid Crystal Display (LCD) was carried out using a NIOS II soft-core system running on an Altera DE0-Nano board with a Cyclone IV E target. The designed system is used to monitor oxygen saturation in blood and heart rate for different test subjects. The NIOS II processor-based non-invasive heart rate and oxygen saturation system was verified against another Operon pulse oximeter over 50 measurements on 10 different subjects. It was found that the readings taken were very close to those of the Operon pulse oximeter.
Keywords: heart rate, NIOS II, oxygen saturation, photoplethysmography, soft-core, SOPC
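For context, the sketch below shows the standard two-wavelength pulse oximetry calculation that a system like this implements: form the "ratio of ratios" R from the AC/DC components of the red (660 nm) and infrared (940 nm) photoplethysmograms, then map R to SpO2 with an empirical calibration. The linear calibration SpO2 ~ 110 - 25*R and the signal values are generic assumptions, not the paper's calibration.

```python
# Ratio-of-ratios SpO2 estimate from synthetic PPG signals (illustrative only).
import numpy as np

def spo2_from_ppg(red, ir):
    r_ratio = (np.ptp(red) / np.mean(red)) / (np.ptp(ir) / np.mean(ir))
    return 110.0 - 25.0 * r_ratio, r_ratio          # assumed empirical calibration

t = np.linspace(0, 5, 500)                          # 5 s of samples, 1.2 Hz pulse
red = 2.00 + 0.020 * np.sin(2 * np.pi * 1.2 * t)    # red channel: DC plus pulsatile AC
ir = 1.80 + 0.033 * np.sin(2 * np.pi * 1.2 * t)     # infrared channel
spo2, r = spo2_from_ppg(red, ir)
heart_rate = 1.2 * 60                               # bpm from the pulsatile frequency
print(f"R = {r:.2f}, SpO2 ~ {spo2:.1f}%, heart rate ~ {heart_rate:.0f} bpm")
```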
Procedia PDF Downloads 195

35012 Big Data Strategy for Telco: Network Transformation
Abstract:
Big data has the potential to improve the quality of services; enable infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. The analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big data presents new methods to reverse this trend and improve profitability. The benefits of big data and next-generation networks, however, go well beyond improved customer relationship management. Next-generation networks are in a prime position to monetize rich supplies of customer information, while being mindful of legal and privacy issues. Data assets transformed into new revenue streams will become integral to high performance.
Keywords: big data, next generation networks, network transformation, strategy
Procedia PDF Downloads 360

35011 A Recommender System for Job Seekers to Show up Companies Based on Their Psychometric Preferences and Company Sentiment Scores
Authors: A. Ashraff
Abstract:
The increasing importance of the web as a medium for electronic and business transactions has served as a catalyst, or rather a driving force, for the introduction and implementation of recommender systems. Recommender systems play a major role in processing and analyzing thousands of data rows or reviews and help humans make purchase decisions about a product or service. They can also predict whether a particular user would rate a product or service based on the user's behavioral profile. At present, recommender systems are being used extensively in every domain known to us; they are said to be ubiquitous. However, in the field of recruitment, they are not being utilized extensively. Recent statistics show an increase in staff turnover, which has negatively impacted organizations as well as employees. The reasons include company culture, working flexibility (work-from-home opportunities), lack of learning advancement, and pay scale. Further investigation revealed a lack of guidance or support to help a job seeker find the company that will suit them best; although information about companies is available, job seekers cannot read all the reviews by themselves and reach an analytical decision. In this paper, we propose an approach to study the available review data on IT companies (scoring their reviews based on user review sentiments) and gather information on job seekers, including their psychometric evaluations, and then present the job seeker with useful outputs on which company is most suitable for them. The theoretical approach, the algorithmic approach, and the importance of such a system are discussed in this paper.
Keywords: psychometric tests, recommender systems, sentiment analysis, hybrid recommender systems
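As a rough illustration of the hybrid scoring idea, the sketch below blends a company's review-sentiment score with how closely its culture profile matches a job seeker's psychometric profile and ranks the companies. The profiles, weights, and scores are invented examples, not the proposed system's actual model.

```python
# Hybrid ranking: psychometric fit blended with review sentiment (invented data).
import numpy as np

def match_score(seeker, company_profile, company_sentiment, w_fit=0.6):
    fit = 1.0 - np.mean(np.abs(seeker - company_profile))   # profiles scaled to [0, 1]
    return w_fit * fit + (1 - w_fit) * company_sentiment

seeker = np.array([0.8, 0.4, 0.9, 0.6])   # e.g. teamwork, autonomy, learning, stability
companies = {
    "AlphaSoft": (np.array([0.7, 0.5, 0.8, 0.6]), 0.72),
    "BetaWorks": (np.array([0.3, 0.9, 0.4, 0.5]), 0.88),
    "GammaTech": (np.array([0.8, 0.4, 0.9, 0.7]), 0.61),
}
ranked = sorted(companies.items(),
                key=lambda kv: match_score(seeker, *kv[1]), reverse=True)
for name, (profile, sentiment) in ranked:
    print(f"{name}: score = {match_score(seeker, profile, sentiment):.3f}")
```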
Procedia PDF Downloads 106

35010 A webGIS Methodology to Support Sediments Management in Wallonia
Authors: Nathalie Stephenne, Mathieu Veschkens, Stéphane Palm, Christophe Charlemagne, Jacques Defoux
Abstract:
According to Europe's first River Basin Management Plans (RBMPs), 56% of European rivers failed to achieve the good status targets of the Water Framework Directive (WFD). In Central European countries such as Belgium, even more than 80% of rivers failed to achieve the WFD quality targets. Although the RBMPs should reduce the stressors and improve water body status, their potential to address multiple-stress situations is limited due to insufficient knowledge on combined effects, multi-stress, prioritization of measures, impacts on ecology, and implementation effects. This paper describes a webGIS prototype developed for the Walloon administration to improve the communication and management of sediment dredging actions carried out in rivers and lakes in the frame of the RBMPs. A large number of stakeholders are involved in the management of rivers and lakes in Wallonia. They are in charge of technical aspects (clients and dredging operators, organizations involved in the treatment of waste, etc.), management (managers involved in WFD implementation at the communal, provincial or regional level) or policy making (people responsible for policy compliance or legislation revision). These different kinds of stakeholders need different information and data to carry out their duties but have to interact closely at different levels. Moreover, information has to be shared between them to improve the quality of the management of dredging operations within the ecological system. In Walloon legislation, levelling dredged sediments on banks requires an official authorization from the administration. This request refers to spatial information such as the official land use map, the cadastral map, and the distance to potential pollution sources. The production of a collective geodatabase can facilitate the management of these authorizations from both sides. The proposed internet system integrates documents, data input, integration of data from disparate sources, map representation, database queries, analysis of monitoring data, presentation of results and cartographic visualization. A prototype web application using the geoviewer API chosen by the Geomatics department of the SPW has been developed and discussed with some potential users to facilitate communication, management and data quality. The structure of the paper states the why, what, who and how of this communication tool.
Keywords: sediments, web application, GIS, rivers management
Procedia PDF Downloads 405

35009 REDUCER: An Architectural Design Pattern for Reducing Large and Noisy Data Sets
Authors: Apkar Salatian
Abstract:
To relieve the burden of reasoning on a point-to-point basis, in many domains there is a need to reduce large and noisy data sets into trends for qualitative reasoning. In this paper we propose and describe a new architectural design pattern called REDUCER for reducing large and noisy data sets, which can be tailored for particular situations. REDUCER consists of two consecutive processes: Filter, which takes the original data and removes outliers, inconsistencies or noise; and Compression, which takes the filtered data and derives trends in the data. In this seminal article, we also show how REDUCER has successfully been applied to three different case studies.
Keywords: design pattern, filtering, compression, architectural design
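A minimal sketch of the two stages on a noisy series follows: Filter removes outliers with a rolling median, and Compression summarises the filtered data as per-window linear trends. The window sizes, example signal, and choice of median filtering and linear fits are assumptions for illustration, not the pattern's prescribed components.

```python
# Filter-then-Compress sketch on a synthetic noisy series (illustrative choices).
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(300)
signal = np.where(t < 150, 0.05 * t, 7.5 - 0.02 * (t - 150)) + rng.normal(0, 0.3, 300)
signal[rng.choice(300, 6, replace=False)] += 8           # inject a few outliers

# Filter: replace each point by the median of its neighbourhood.
k = 5
filtered = np.array([np.median(signal[max(0, i - k):i + k + 1]) for i in t])

# Compression: one (start index, slope) trend per fixed window of the filtered data.
trends = []
for start in range(0, len(t), 50):
    seg_t, seg_y = t[start:start + 50], filtered[start:start + 50]
    slope, intercept = np.polyfit(seg_t, seg_y, 1)
    trends.append((start, round(slope, 3)))
print(trends)    # e.g. rising ~0.05/step in early windows, then falling ~-0.02/step
```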
Procedia PDF Downloads 212