Search results for: low pass filter
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1212

312 Solar-Powered Water Purification Using Ozone and Sand Filtration

Authors: Kayla Youhanaie, Kenneth Dott, Greg Gillis-Smith

Abstract:

Access to clean water is a global challenge that affects nearly one-third of the world’s population. A lack of safe drinking water negatively affects a person’s health, safety, and economic status. However, many regions of the world that face this clean water challenge also have high solar energy potential. To address this worldwide issue and utilize available resources, a solar-powered water purification device was developed that could be implemented in communities around the world that lack access to potable water. The device uses ozone to destroy water-borne pathogens and sand filtration to filter out particulates from the water. To select the best method for this application, a quantitative energy efficiency comparison of three water purification methods was conducted: heat, UV light, and ozone. After constructing an initial prototype, the efficacy of the device was tested using agar petri dishes to test for bacteria growth in treated water samples at various time intervals after applying the device to contaminated water. The results demonstrated that the water purification device successfully removed all bacteria and particulates from the water within three minutes, making it safe for human consumption. These results, as well as the proposed design that utilizes widely available resources in target communities, suggest that the device is a sustainable solution to address the global water crisis and could improve the quality of life for millions of people worldwide.

Keywords: clean water, solar powered water purification, ozonation, sand filtration, global water crisis

Procedia PDF Downloads 50
311 Interstellar Mission to Wolf 359: Possibilities for the Future

Authors: Rajasekar Anand Thiyagarajan

Abstract:

One of the driving forces of mankind is “le rêve d'étoiles”, the “dream of stars”, which has been the dynamo of our civilization. Since the dawn of civilization, mankind has looked upon the heavens with wonder and tried to understand the meaning of those twinkling lights. As human history has progressed, so has that understanding, and we now know a great deal about stars. However, the dream of reaching those stars always remains within the aspirations of mankind. In fact, the needs of civilization constantly drive us toward better knowledge, and the capability of reaching those stars is one way that such knowledge and exultation can be achieved. This paper takes a futuristic case study of an interstellar mission to Wolf 359, which is approximately 8.3 light years away from us. In terms of galactic distances, 8.3 light years is not much, but as far as present space technology is concerned, it is next to impossible for us to reach such distances. Several studies have been conducted on missions to Alpha Centauri and other nearby stars such as Barnard's star and Wolf 359. Taking a more distant star such as Wolf 359, however, will help test mankind's drive for interstellar exploration, as exotic means of travel are needed. This paper takes a futuristic case study of the event, and various possibilities of space travel are discussed in detail. Comprehensive tables and graphs are given, depicting the amount of time that will pass in each mode of travel and, more importantly, some idea of the cost in terms of energy as well as money within today's context. In addition, prerequisites for an interstellar mission to Wolf 359 are given in detail, as well as a sample mission to that particular destination. Even though the possibility of such a mission is probably nonexistent for the 21st century, it is essential to do these exercises so that mankind's understanding of the universe will be increased. In addition, this paper hopes to establish some general guidelines for such an interstellar mission.
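As a simple illustration of the kind of travel-time arithmetic such tables involve (a textbook special-relativity sketch assuming a constant cruise speed of 0.1c, not a figure taken from the paper):

```latex
% Earth-frame and ship-frame travel times for the 8.3 light-year crossing at v = 0.1c
\[
  t_{\mathrm{Earth}} = \frac{d}{v} = \frac{8.3\ \mathrm{ly}}{0.1c} = 83\ \mathrm{yr},
  \qquad
  \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}} \approx 1.005,
  \qquad
  t_{\mathrm{ship}} = \frac{t_{\mathrm{Earth}}}{\gamma} \approx 82.6\ \mathrm{yr}.
\]
```

At such modest fractions of the speed of light, time dilation is negligible, which is why the paper turns to more exotic modes of travel.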

Keywords: wolf 359, interstellar mission, alpha centauri, core diameter, core length, reflector thickness enrichment, gas temperature, reflector temperature, power density, mass of the space craft, acceleration of the space craft, time expansion

Procedia PDF Downloads 396
310 Challenges of the Implementation of Real Time Online Learning in a South African Context

Authors: Thifhuriwi Emmanuel Madzunye, Patricia Harpur, Ephias Ruhode

Abstract:

A review of the pertinent literature identified a gap concerning the hindrances and opportunities accompanying the implementation of real-time online learning systems (RTOLs) in rural areas. Whilst RTOLs present a possible solution to teaching and learning issues in rural areas, little is known about the implementation of digital strategies among schools in isolated communities. This study explores associated guidelines that have the potential to inform decision-making where Internet-based education could improve educational opportunities. A systematic literature review, which has the potential to consolidate and focus disparate literature, served to collect interlinked data from specific sources in a structured manner. During qualitative data analysis (QDA) of selected publications via the application of a QDA tool, ATLAS.ti, the following overarching themes emerged: digital divide, educational strategy, human factors, and support. Furthermore, findings from data collection and the literature review suggest that significant factors, including a lack of digital knowledge and infrastructure shortcomings such as a lack of computers, poor internet connectivity, and constrained real-time online access, may limit students' progress. The study recommends that timeous consideration should be given to the influence of the digital divide. Additionally, the evolution of an educational strategy that adopts digital approaches, a focus on training of role-players and stakeholders concerning human factors, and the seeking of governmental funding and support are essential to the implementation and success of RTOLs.

Keywords: communication, digital divide, digital skills, distance, educational strategy, government, ICT, infrastructures, learners, limpopo, lukalo, network, online learning systems, political-unrest, real-time, real-time online learning, real-time online learning system, pass-rate, resources, rural area, school, support, teachers, teaching and learning and training

Procedia PDF Downloads 308
309 The Use of Information and Communication Technologies in Electoral Procedures: Comments on Electronic Voting Security

Authors: Magdalena Musiał-Karg

Abstract:

The expansion of telecommunication and the progress of electronic media are important elements of our times. The recent worldwide convergence of information and communication technologies (ICT) and the dynamic development of the mass media are leading to noticeable changes in the functioning of contemporary states and societies. Currently, modern technologies play more and more important roles and filter down to almost every field of contemporary human life. This results in the growth of online interactions, observable in the remarkable increase in the number of people with home PCs and Internet access. Proof of this is undoubtedly the emergence and use of concepts such as e-society, e-banking, e-services, e-government, e-participation and e-democracy. The newly coined word e-democracy evidences that modern technologies have also been widely used in politics. Without any doubt, in most countries all actors of the political market (politicians, political parties, servants in the political/public sector, the media) use modern forms of communication with society. Most of these modern technologies improve the processes of gathering and sending information to citizens, communication with the electorate, and also – which seems to be the biggest advantage – electoral procedures. Thanks to the implementation of ICT, the interaction between politicians and the electorate is improved. The main goal of this text is to analyze electronic voting (e-voting) as one of the important forms of electronic democracy in terms of security aspects. The author aims to answer questions about the security of electronic voting as an additional form of participation in elections and referenda.

Keywords: electronic democracy, electronic voting, security of e-voting, information and communication technology (ICT)

Procedia PDF Downloads 209
308 The Model of Learning Centre on OTOP Production Process Based on Sufficiency Economic Philosophy for Sustainable Life Quality

Authors: Napasri Suwanajote

Abstract:

The purposes of this research were to analyse and evaluate the success factors in the OTOP production process for the development of a learning centre on the OTOP production process based on the Sufficiency Economic Philosophy for sustainable life quality. The research was designed as a qualitative study gathering information from 30 OTOP producers in Bangkontee District, Samudsongkram Province. They were all interviewed on 3 main parts. Part 1 concerned the production process, including 1) production, 2) product development, 3) community strength, 4) marketing possibility, and 5) product quality. Part 2 evaluated the appropriate success factors, including 1) the analysis of the success factors, 2) evaluation of the strategy based on the Sufficiency Economic Philosophy, and 3) the model of a learning centre on the OTOP production process based on the Sufficiency Economic Philosophy for sustainable life quality. The results showed that the production did not affect the environment and had the potential to continue standard quality production. The producers used raw materials from within the country. On the aspect of product and community strength over the past year, it was found that there was no appropriate packaging showing product identity according to global market standards. The producers needed training on packaging, especially for food and drink products. On the aspect of product quality and product specification, it was found that the products were certified by the local OTOP standard. There should be a responsible organization to help the uncertified producers pass the standard. However, there was a problem with food contamination, which is hazardous to consumers. The producers should cooperate with the government sector or educational institutes involved with food processing to reach the FDA standard. The results from the small group discussion showed that the community expected higher education and a better standard of living. Some problems reported by the community included informal debt and drugs in the community. There were 8 steps in developing the model of a learning centre on the OTOP production process based on the Sufficiency Economic Philosophy for sustainable life quality.

Keywords: production process, OTOP, sufficiency economic philosophy, marketing management

Procedia PDF Downloads 213
307 Abridging Pharmaceutical Analysis and Drug Discovery via LC-MS-TOF, NMR, in-silico Toxicity-Bioactivity Profiling for Therapeutic Purposing Zileuton Impurities: Need of Hour

Authors: Saurabh B. Ganorkar, Atul A. Shirkhedkar

Abstract:

Although investigations protecting against toxic impurities seem to be a primary requirement, impurities that prove non-toxic can be explored for any therapeutic potential to assist advanced drug discovery. The essential role of pharmaceutical analysis can thus be extended effectively to achieve this. The present study successfully achieved these objectives with the characterization of major degradation products as impurities of Zileuton, which has been used to treat asthma for years. Forced degradation studies were performed to identify the potential degradation products using ultra-fast liquid chromatography. Liquid chromatography-mass spectrometry (time of flight) and proton nuclear magnetic resonance studies were utilized effectively to characterize the drug along with five major oxidative and hydrolytic degradation products (DPs). The mass fragments were identified for Zileuton, and the degradation pathway was investigated. The characterized DPs were subjected to in-silico studies such as XP molecular docking to compare the gain or loss in binding affinity with the 5-lipoxygenase enzyme. One of the impurities was found to have a binding affinity greater than the drug itself, indicating its potential to be more bioactive as a better antiasthmatic. Close structural resemblance has the ability to potentiate or reduce bioactivity and/or toxicity. The chance of being biologically active at other sites cannot be denied, and this was assessed to some extent by predictions of the probability of activity with the Prediction of Activity Spectra for Substances (PASS). The impurities were found to be bioactive as antineoplastic and antiallergic agents and as inhibitors of Complement Factor D. The toxicological properties Ames mutagenicity, carcinogenicity, developmental toxicity and skin irritancy were evaluated using Toxicity Prediction by Komputer Assisted Technology (TOPKAT). Two of the impurities were found to be non-toxic compared to the original drug Zileuton. As drugs are purposed and repurposed effectively, so can impurities be, as they can have greater binding affinity, lower toxicity and better ability to be bioactive at other biological targets.

Keywords: UFLC, LC-MS-TOF, NMR, Zileuton, impurities, toxicity, bio-activity

Procedia PDF Downloads 172
306 Unsupervised Segmentation Technique for Acute Leukemia Cells Using Clustering Algorithms

Authors: N. H. Harun, A. S. Abdul Nasir, M. Y. Mashor, R. Hassan

Abstract:

Leukaemia is a blood cancer that contributes to the increase in the mortality rate in Malaysia each year. There are two main categories of leukaemia, namely acute and chronic leukaemia. The production and development of acute leukaemia cells occur rapidly and uncontrollably. Therefore, if the identification of acute leukaemia cells could be done quickly and effectively, proper treatment and medicine could be delivered. Due to the requirement of prompt and accurate diagnosis of leukaemia, the current study has proposed unsupervised pixel segmentation based on clustering algorithms in order to obtain a fully segmented abnormal white blood cell (blast) in acute leukaemia images. In order to obtain the segmented blast, three clustering algorithms, namely k-means, fuzzy c-means and moving k-means, have been applied to the saturation component image. Then, a median filter and a seeded region growing area extraction algorithm have been applied to smooth the region of the segmented blast and to remove the large unwanted regions from the image, respectively. Comparisons among the three clustering algorithms are made in order to measure the performance of each clustering algorithm in segmenting the blast area. Based on the good sensitivity values obtained, the results indicate that the moving k-means clustering algorithm has successfully produced the fully segmented blast region in acute leukaemia images. Hence, the resultant images could be helpful to haematologists for further analysis of acute leukaemia.
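A minimal sketch of this kind of pipeline in Python using OpenCV (standard k-means stands in for the moving k-means variant, connected-component area filtering stands in for seeded region growing, and the input file name is a placeholder):

```python
import cv2
import numpy as np

# Load the blood smear image and extract the saturation component (HSV colour space)
img = cv2.imread("leukaemia_sample.png")          # hypothetical file name
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
saturation = hsv[:, :, 1]

# Cluster the saturation values into 3 groups (background, cytoplasm, nucleus/blast)
pixels = saturation.reshape(-1, 1).astype(np.float32)
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 0.2)
_, labels, centers = cv2.kmeans(pixels, 3, None, criteria, 10, cv2.KMEANS_RANDOM_CENTERS)

# Keep the cluster with the highest mean saturation as the candidate blast region
blast_cluster = np.argmax(centers)
mask = (labels.reshape(saturation.shape) == blast_cluster).astype(np.uint8) * 255

# Median filtering smooths the segmented region, as in the paper
mask = cv2.medianBlur(mask, 5)

# Remove small unwanted regions by keeping only sufficiently large connected components
n, cc, stats, _ = cv2.connectedComponentsWithStats(mask)
clean = np.zeros_like(mask)
for i in range(1, n):
    if stats[i, cv2.CC_STAT_AREA] > 500:   # area threshold chosen for illustration
        clean[cc == i] = 255

cv2.imwrite("blast_mask.png", clean)
```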

Keywords: acute leukaemia images, clustering algorithms, image segmentation, moving k-means

Procedia PDF Downloads 263
305 A Neurofeedback Learning Model Using Time-Frequency Analysis for Volleyball Performance Enhancement

Authors: Hamed Yousefi, Farnaz Mohammadi, Niloufar Mirian, Navid Amini

Abstract:

Investigating possible capacities of visual functions where adapted mechanisms can enhance the capability of sports trainees is a promising area of research, not only from the cognitive viewpoint but also in terms of unlimited applications in sports training. In this paper, the visual evoked potential (VEP) and event-related potential (ERP) signals of amateur and trained volleyball players in a pilot study were processed. The two groups of amateur and trained subjects were asked to imagine themselves in the state of receiving a ball while they were shown a simulated volleyball field. The proposed method is based on a set of time-frequency features, extracted from the VEP signals using algorithms such as the Gabor filter, the continuous wavelet transform, and a multi-stage wavelet decomposition, that can be indicative of being amateur or trained. The linear discriminant classifier achieves an accuracy, sensitivity, and specificity of 100% when the average of the repetitions of the signal corresponding to the task is used. The main purpose of this study is to investigate the feasibility of a fast, robust, and reliable feature/model determination as a neurofeedback parameter to be utilized for improving volleyball players' performance. The proposed measure has potential applications in brain-computer interface technology, where a real-time biomarker is needed.
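A minimal sketch of this kind of feature extraction and classification (assuming each trial is a 1-D averaged VEP epoch sampled at 250 Hz; PyWavelets and scikit-learn are used, the data files are placeholders, and the feature set is illustrative rather than the authors' exact one):

```python
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

fs = 250  # assumed sampling rate (Hz)

def time_frequency_features(epoch):
    """Extract simple time-frequency descriptors from one VEP epoch."""
    # Continuous wavelet transform with a Morlet (Gabor-like) mother wavelet
    scales = np.arange(1, 32)
    coeffs, _ = pywt.cwt(epoch, scales, "morl", sampling_period=1 / fs)
    cwt_energy = np.sum(coeffs ** 2, axis=1)            # energy per scale

    # Multi-stage (multi-level) discrete wavelet decomposition
    dwt_coeffs = pywt.wavedec(epoch, "db4", level=4)
    dwt_energy = np.array([np.sum(c ** 2) for c in dwt_coeffs])

    return np.concatenate([cwt_energy, dwt_energy])

# X_raw: (n_trials, n_samples) averaged VEP epochs; y: 0 = amateur, 1 = trained
X_raw = np.load("vep_epochs.npy")                        # hypothetical data file
y = np.load("labels.npy")                                # hypothetical data file

X = np.array([time_frequency_features(ep) for ep in X_raw])

clf = LinearDiscriminantAnalysis()
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```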

Keywords: visual evoked potential, time-frequency feature extraction, short-time Fourier transform, event-related spectrum potential classification, linear discriminant analysis

Procedia PDF Downloads 113
304 FACTS Based Stabilization for Smart Grid Applications

Authors: Adel. M. Sharaf, Foad H. Gandoman

Abstract:

Nowadays, photovoltaic (PV) farms/parks and large PV-smart grid interface schemes are emerging and commonly utilized in renewable energy distributed generation. However, PV-hybrid DC-AC schemes using interface power electronic converters usually have a negative impact on power quality and on the stabilization of the modern electrical network under load excursions and network fault conditions in the smart grid. Consequently, robust FACTS-based interface schemes are required to ensure efficient energy utilization and stabilization of bus voltages as well as limiting switching/fault inrush current conditions. FACTS devices are also used in smart grid-battery interface and storage schemes with PV-battery storage hybrid systems as an elegant alternative for renewable energy utilization with backup battery storage, for electric utility energy and demand side management, to provide the needed energy and power capacity under heavy load conditions. The paper presents a robust PV-Li-ion battery storage interface scheme for a distribution/utilization low voltage interface using FACTS stabilization enhancement and dynamic maximum PV power tracking controllers. Digital simulation and validation of the proposed scheme are done using the MATLAB/Simulink software environment for a low-voltage distribution/utilization system feeding hybrid linear, motorized-inrush and nonlinear loads from a DC-AC interface VSC 6-pulse inverter fed from the PV park/farm with a back-up Li-ion storage battery.
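Although the study itself is implemented in MATLAB/Simulink, the maximum power point tracking idea mentioned above can be illustrated with a short perturb-and-observe sketch (a generic hill-climbing controller outline, not the authors' controller; the measurement and actuation callables are assumed placeholders):

```python
def perturb_and_observe(read_pv_voltage, read_pv_current, set_duty,
                        d0=0.5, step=0.01, iterations=1000):
    """Generic perturb-and-observe MPPT loop (illustrative sketch).

    read_pv_voltage, read_pv_current and set_duty are assumed callables that
    interface with the PV array and the DC-DC converter in a real system.
    """
    duty = d0
    set_duty(duty)
    p_prev = read_pv_voltage() * read_pv_current()

    for _ in range(iterations):
        duty = min(max(duty + step, 0.0), 1.0)       # perturb the duty cycle
        set_duty(duty)
        p = read_pv_voltage() * read_pv_current()    # observe the new PV power
        if p < p_prev:
            step = -step                             # power dropped: reverse direction
        p_prev = p

    return duty
```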

Keywords: AC FACTS, smart grid, stabilization, PV-battery storage, Switched Filter-Compensation (SFC)

Procedia PDF Downloads 391
303 Risk Assessment of Natural Gas Pipelines in Coal Mined Gobs Based on Bow-Tie Model and Cloud Inference

Authors: Xiaobin Liang, Wei Liang, Laibin Zhang, Xiaoyan Guo

Abstract:

Pipelines inevitably pass through coal mined gobs in mining areas, and the stability of these gobs has a great influence on the safety of the pipelines. After an extensive literature study and field research, it was found that there are few risk assessment methods for coal mined gob pipelines and there is a lack of data on the gob sites. Therefore, the fuzzy comprehensive evaluation method based on expert opinions is widely used. However, the subjective opinions or lack of experience of individual experts may lead to inaccurate evaluation results, so the accuracy of the results needs to be further improved. This paper presents a comprehensive approach to achieve this purpose by combining a bow-tie model and cloud inference. The specific evaluation process is as follows. First, a bow-tie model composed of a fault tree and an event tree is established to graphically illustrate the probability and consequence indicators of pipeline failure. Second, the indicators are scored in the form of intervals (interval estimation) to improve the accuracy of the results, and the censored mean algorithm is used to remove the maximum and minimum scores to improve the stability of the results; the golden section method is used to determine the weights of the indicators and reduce the subjectivity of the index weights. Third, the failure probability and failure consequence scores of the pipeline are converted into three numerical features by using cloud inference, which better describes the ambiguity and volatility of the results and hence of the risk level. Finally, cloud drop graphs of the failure probability and failure consequences can be produced, which intuitively and accurately illustrate the ambiguity and randomness of the results. A case study of a coal mine gob pipeline carrying natural gas has been investigated to validate the utility of the proposed method. The evaluation results of this case show that the probability of failure of the pipeline is very low while the consequences of failure are more serious, which is consistent with reality.
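A minimal numerical sketch of two of the steps above, the censored mean scoring and a forward normal cloud generator that turns a score into cloud drops (the scores and the three cloud features Ex, En, He, i.e. expectation, entropy and hyper-entropy, are illustrative values, not the paper's data):

```python
import numpy as np

def censored_mean(scores):
    """Average expert scores after discarding the maximum and minimum."""
    s = np.sort(np.asarray(scores, dtype=float))
    return s[1:-1].mean()

def forward_normal_cloud(Ex, En, He, n_drops=1000, rng=np.random.default_rng(0)):
    """Generate cloud drops (x, membership) from the three numerical features."""
    En_prime = rng.normal(En, He, n_drops)                 # per-drop entropy
    x = rng.normal(Ex, np.abs(En_prime))                   # drop positions
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_prime ** 2))      # membership degrees
    return x, mu

# Illustrative expert scores for one failure-probability indicator
scores = [2.0, 3.5, 3.0, 2.5, 4.5, 3.0]
Ex = censored_mean(scores)
x, mu = forward_normal_cloud(Ex, En=0.5, He=0.05)
print(f"Ex = {Ex:.2f}, mean membership of generated drops = {mu.mean():.2f}")
```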

Keywords: bow-tie model, natural gas pipeline, coal mine gob, cloud inference

Procedia PDF Downloads 223
302 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas

Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards

Abstract:

Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, spatial analysis, etc. A laser scanning system generates an irregularly spaced three-dimensional cloud of points. Raw ALS data consist mainly of ground points (that represent the bare earth) and non-ground points (that represent buildings, trees, cars, etc.). Removing all the non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is considered a difficult and challenging task, as the canopy stops laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data. The presented filter utilizes a weight function to allocate weights to each point of the data. Furthermore, unlike most of the methods, the presented filtering algorithm is designed to be automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and that the performance of the method is stable for all the heavily forested data samples. The average root mean square error (RMSE) value is 0.35 m.
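A one-dimensional toy sketch of the weighted smoothing-spline idea (an illustration in Python, not the authors' algorithm: points lying above the current fit, likely vegetation returns, are iteratively down-weighted so the spline settles toward the terrain surface):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def ground_filter_1d(x, z, n_iter=5, smooth=None):
    """Iteratively re-weighted smoothing spline for a single ALS profile."""
    x, z = np.asarray(x, float), np.asarray(z, float)
    order = np.argsort(x)
    x, z = x[order], z[order]
    w = np.ones_like(z)

    for _ in range(n_iter):
        spline = UnivariateSpline(x, z, w=w, s=smooth)
        residual = z - spline(x)
        # Down-weight points above the fitted surface, keep points on/below it
        w = np.where(residual > 0, np.exp(-residual), 1.0)

    ground_mask = np.abs(z - spline(x)) < 0.3   # 0.3 m threshold, illustrative
    return ground_mask, spline

# Toy profile: gently sloping terrain with "canopy" points added on top
rng = np.random.default_rng(1)
x = rng.uniform(0, 100, 500)
terrain = 0.05 * x + rng.normal(0, 0.05, x.size)
canopy = np.where(rng.random(x.size) < 0.4, rng.uniform(2, 15, x.size), 0.0)
mask, _ = ground_filter_1d(x, terrain + canopy)
print("points classified as ground:", mask.sum())
```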

Keywords: airborne laser scanning, digital terrain models, filtering, forested areas

Procedia PDF Downloads 120
301 Mindfulness in a Secular Age: Framing and Contextualising the Conversation in the Irish Context

Authors: Thomas P. Carroll

Abstract:

The phenomenon of mindfulness has become ever more popular in an increasingly pluralist Western society. Mindfulness practice has penetrated secular contexts that would otherwise be closed to religious influence, including state schools, hospitals, and commerce. The contemporary understanding of mindfulness has its origins in Buddhist meditation. However, since Jon Kabat-Zinn's pioneering work in Mindfulness-Based Interventions, the concept has developed and sometimes mutated into various forms of practice which are disembedded from their original spiritual philosophy. This project will explore the spiritual climate within which mindfulness is currently flourishing through dialogue with three interlocutors. The first interlocutor is the Canadian philosopher Charles Taylor, whose seminal work, 'A Secular Age', outlines three distinct modes of secularity. Taylor examines how the conditions of belief have changed and how the self seeks meaning in an age where belief in the divine is no longer axiomatic. The next interlocutor is the Czech theologian and psychotherapist Tomáš Halík, who offers the unique perspective of a Catholic belonging to a section of society outnumbered by secular counterparts, with a theological hermeneutic best described as 'Den Fremden verstehen' (understanding the stranger). Finally, the Irish theologian Michael Paul Gallagher offers a theological perspective on how the Christian faith can be translated into dialogue with Irish secular culture, as well as addressing the crisis of imagination and culture rather than the crisis of faith in Ireland. These interlocutors will illustrate that there are sometimes striking differences in how to interpret the religious signs of the times. However, these approaches also reveal significant similarities in how they address and explore the meaning of religious belief and experience today. In this way, themes will emerge that help to frame the conversation about mindfulness in the West. These themes include the failure of the secularization thesis to come to pass, the growth of a diverse marketplace of religions and beliefs, and the growth of a demographic who identify as spiritual but not religious. Such research is paramount in enabling a richer dialogue between Christian faith and mindfulness in a fragmented, postmodern Western context.

Keywords: culture, mindfulness, secularism, spirituality

Procedia PDF Downloads 88
300 Comparison of Different Reanalysis Products for Predicting Extreme Precipitation in the Southern Coast of the Caspian Sea

Authors: Parvin Ghafarian, Mohammadreza Mohammadpur Panchah, Mehri Fallahi

Abstract:

Synoptic patterns from the surface up to the tropopause are very important for forecasting weather and atmospheric conditions. There are many tools to prepare and analyze these maps. Reanalysis data, the outputs of numerical weather prediction models, satellite images, meteorological radar, and weather station data are used in world forecasting centers to predict the weather. Forecasting extreme precipitation on the southern coast of the Caspian Sea (CS) is a major issue due to the complex topography, and there are different types of climate in these areas. In this research, we used two reanalysis datasets, the ECMWF Reanalysis 5th Generation (ERA5) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis, for verification of the numerical model. ERA5 is the latest version of the ECMWF reanalysis. The temporal resolution of ERA5 is hourly, and that of NCEP/NCAR is every six hours. Atmospheric parameters such as mean sea level pressure, geopotential height, relative humidity, wind speed and direction, sea surface temperature, etc. were selected and analyzed, and different types of precipitation (rain and snow) were considered. The results showed that NCEP/NCAR has more ability to demonstrate the intensity of the atmospheric system. ERA5 is suitable for extracting the values of parameters at a specific point and is appropriate for analyzing snowfall events over the CS (snow cover and snow depth). Sea surface temperature plays the main role in generating instability over the CS, especially when cold air passes over the CS. The sea surface temperature of the NCEP/NCAR product has low resolution near the coast. However, both datasets were able to detect the meteorological synoptic patterns that led to heavy rainfall over the CS, although, due to the time lag, they are not suitable for forecast centers; their application is in research and in the verification of meteorological models. Finally, ERA5 has a better resolution with respect to the NCEP/NCAR reanalysis data, but the NCEP/NCAR data are available from 1948 and are appropriate for long-term research.
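A small sketch of how such a point comparison might be set up with xarray (the file names, variable names and coordinates are assumptions for illustration, not the datasets used in the paper; ERA5 is hourly and NCEP/NCAR six-hourly, so ERA5 is resampled before comparison):

```python
import xarray as xr

# Hypothetical local copies of the two reanalysis products
era5 = xr.open_dataset("era5_mslp.nc")        # hourly mean sea level pressure ("msl")
ncep = xr.open_dataset("ncep_ncar_slp.nc")    # six-hourly mean sea level pressure ("slp")

# A point on the southern coast of the Caspian Sea (approximate coordinates)
lat, lon = 36.7, 51.0
era5_pt = era5["msl"].sel(latitude=lat, longitude=lon, method="nearest")
ncep_pt = ncep["slp"].sel(lat=lat, lon=lon, method="nearest")

# Bring hourly ERA5 to the 6-hourly NCEP/NCAR time step before comparing
era5_6h = era5_pt.resample(time="6H").mean()

# Mean difference at the point (check that both products use the same pressure
# units, Pa vs hPa, before interpreting the bias)
bias = (era5_6h - ncep_pt).mean().item()
print("mean MSLP difference (ERA5 - NCEP/NCAR):", bias)
```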

Keywords: synoptic patterns, heavy precipitation, reanalysis data, snow

Procedia PDF Downloads 94
299 Mitigating Biofouling on Reverse Osmosis Membranes: Applying Greener Preservatives to Biofilm Treatment

Authors: Anna Curtin, Matthew Thibodeau, Heather Buckley

Abstract:

Water scarcity is characterized by a lack of access to clean and affordable drinking water, as well as water for hygienic and economic needs. The number of people affected by water scarcity is expected to increase in the coming years due to climate change, population growth, and pollution, amongst other things. In response, scientists are pursuing cost-effective drinking water treatment methods, often with a focus on alternative water sources. Desalination of seawater via reverse osmosis is one promising alternative method. Desalination of seawater via reverse osmosis, however, is limited significantly by biofouling of the filtration membrane. Biofouling is the buildup of microorganisms in a biofilm at the water-membrane interface. It clogs the membrane, decreasing the efficiency of filtration and consequently increasing operational and maintenance costs. Although effective, existing chemical treatment methods can damage the membrane, decreasing its lifespan; create antibiotic resistance; and cause harm to humans and the environment if they pass through the membrane into the permeate. The current project focuses on applying safer preservatives used in home and personal care products to RO membranes to investigate their biofouling treatment efficacy. Currently, many of these safer preservatives have only been tested on cells in the planktonic phase in suspension cultures, not on cells in biofilms. The results of suspension culture tests are not applicable to biofouling scenarios because organisms in the planktonic phase in suspension cultures exhibit different morphological, chemical, and metabolic characteristics than those in a biofilm. Testing the antifoulant efficacy of safer preservatives on biofilms will provide results more applicable to biofouling on RO membranes. To do this, biofilms will be grown on 96-well plates, and minimum inhibitory concentrations (MIC90) and log-reductions will be calculated for various safer preservatives. Results from these tests will be used to guide doses for tests of safer preservatives in a bench-scale RO system.

Keywords: reverse osmosis, biofouling, preservatives, antimicrobial, safer alternative, green chemistry

Procedia PDF Downloads 121
298 Problem Solving: Process or Product? A Mathematics Approach to Problem Solving in Knowledge Management

Authors: A. Giannakopoulos, S. B. Buckley

Abstract:

Problem solving in any field is recognised as a prerequisite for any advancement in knowledge. For example, in South Africa it is one of the seven critical outcomes of education, together with critical thinking. After a systematic approach to problem solving was initiated in mathematics by the great mathematician George Polya (the father of problem solving), more detailed and comprehensive approaches to problem solving were developed. This paper is based on the findings by the author and subsequent recommendations for further research in problem solving and critical thinking. Although the study was done in mathematics, there is no doubt by now in almost anyone's mind that mathematics is involved to a greater or lesser extent in all fields, from symbols, to variables, to equations, to logic, to critical thinking. Therefore it stands to reason that mathematical principles and learning cannot be divorced from any field. In knowledge management situations, the types of problems are similar to mathematics problems, varying from simple to analogical to complex, and from well-structured to ill-structured problems. While simple problems can be solved by employees by adhering to prescribed sequential steps (the process), analogical and complex problems cannot be proceduralised, and that diminishes the organisation's capacity for knowledge creation and innovation. The low efficiency in some organisations and the low pass rates in mathematics prompted the author to view problem solving as a product. The authors argue that using mathematical approaches to knowledge management problem solving and treating problem solving as a product will empower the employee, through further training, to tackle analogical and complex problems. The question the authors asked was: if it is true that problem solving and critical thinking are indeed basic skills necessary for the advancement of knowledge, why is there so little literature in knowledge management (KM) about them and about how they are connected to and advance KM? This paper concludes with a conceptual model which is based on generally accepted principles of knowledge acquisition (developing a learning organisation), knowledge creation, sharing, disseminating and storing thereof, the five pillars of knowledge management (KM). This model also expands on Gray's framework on KM practices and problem solving and opens the door to a new approach to training employees in general and domain-specific problem areas, which can be adapted to any type of organisation.

Keywords: critical thinking, knowledge management, mathematics, problem solving

Procedia PDF Downloads 565
297 A Meta-Analysis of the Academic Achievement of Students With Emotional/Behavioral Disorders in Traditional Public Schools in the United States

Authors: Dana Page, Erica McClure, Kate Snider, Jenni Pollard, Tim Landrum, Jeff Valentine

Abstract:

Extensive research has been conducted on students with emotional and behavioral disorders (EBD) and their rates of challenging behavior. In the past, however, less attention has been given to their academic achievement and outcomes. Recent research examining outcomes for students with EBD has indicated that these students receive lower grades, are less likely to pass classes, and experience higher rates of school dropout than students without disabilities and students with other high incidence disabilities. Given that between 2% and 20% of the school-age population is likely to have EBD (though many may not be identified as such), this is no small problem. Despite the need for increased examination of this population’s academic achievement, research on the actual performance of students with EBD has been minimal. This study reports the results of a meta-analysis of the limited research examining academic achievement of students with EBD, including effect sizes of assessment scores and discussion of moderators potentially impacting academic outcomes. Researchers conducted a thorough literature search to identify potentially relevant documents before screening studies for inclusion in the systematic review. Screening identified 35 studies that reported results of academic assessment scores for students with EBD. These studies were then coded to extract descriptive data across multiple domains, including placement of students, participant demographics, and academic assessment scores. Results indicated possible collinearity between EBD disability status and lower academic assessment scores, despite a lack of association between EBD eligibility and lower cognitive ability. Quantitative analysis of assessment results yielded effect sizes for academic achievement of student participants, indicating lower performance levels and potential moderators (e.g., race, socioeconomic status, and gender) impacting student academic performance. In addition to discussing results of the meta-analysis, implications and areas for future research, policy, and practice are discussed.
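A minimal sketch of the kind of effect-size computation underlying such a synthesis (illustrative numbers, not the study's data: standardized mean differences with Hedges' small-sample correction, pooled with a DerSimonian-Laird random-effects model):

```python
import numpy as np

def smd_and_variance(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Hedges' g) and its sampling variance."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)              # small-sample correction factor
    g = j * d
    v = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))
    return g, v

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate."""
    g, v = np.asarray(effects), np.asarray(variances)
    w = 1 / v
    q = np.sum(w * (g - np.sum(w * g) / w.sum()) ** 2)
    c = w.sum() - np.sum(w**2) / w.sum()
    tau2 = max(0.0, (q - (len(g) - 1)) / c)      # between-study variance
    w_star = 1 / (v + tau2)
    return np.sum(w_star * g) / w_star.sum()

# Hypothetical studies: (mean_EBD, sd_EBD, n_EBD, mean_comparison, sd_comparison, n_comparison)
studies = [(85, 15, 40, 100, 15, 60), (88, 14, 25, 99, 16, 30), (90, 15, 55, 101, 14, 80)]
gs, vs = zip(*[smd_and_variance(*s) for s in studies])
print(f"pooled Hedges' g = {random_effects_pool(gs, vs):.2f}")
```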

Keywords: students with emotional behavioral disorders, academic achievement, systematic review, meta-analysis

Procedia PDF Downloads 39
296 Establishment of a Nomogram Prediction Model for Postpartum Hemorrhage during Vaginal Delivery

Authors: Yinglisong, Jingge Chen, Jingxuan Chen, Yan Wang, Hui Huang, Jing Zhang, Qianqian Zhang, Zhenzhen Zhang, Ji Zhang

Abstract:

Purpose: The study aims to establish a nomogram prediction model for postpartum hemorrhage (PPH) in vaginal delivery. Patients and Methods: Clinical data were retrospectively collected from vaginal delivery patients admitted to a hospital in Zhengzhou, China, from June 1, 2022 to October 31, 2022. Univariate and multivariate logistic regression were used to screen for independent risk factors. A nomogram model for PPH in vaginal delivery was established based on the risk factor coefficients. Bootstrapping was used for internal validation. To assess discrimination and calibration, receiver operating characteristic (ROC) and calibration curves were generated in the derivation and validation groups. Results: A total of 1340 cases of vaginal delivery were enrolled, with 81 (6.04%) having PPH. Logistic regression indicated that history of uterine surgery, induction of labor, duration of the first stage of labor, neonatal weight, WBC value (during the first stage of labor), and cervical lacerations were all independent risk factors for hemorrhage (P < 0.05). The areas under the curve (AUC) of the ROC curves of the derivation group and the validation group were 0.817 and 0.821, respectively, indicating good discrimination. The two calibration curves showed that the nomogram predictions and the observed results were highly consistent (P = 0.105, P = 0.113). Conclusion: The developed individualized risk prediction nomogram model can assist midwives in recognizing and diagnosing high-risk groups for PPH and initiating early warning to reduce PPH incidence.
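A minimal sketch of the modelling workflow described here (scikit-learn logistic regression with a simple bootstrap check of discrimination via ROC AUC; the file and column names are assumptions, not the study's dataset):

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.utils import resample

# Hypothetical data frame with the six retained risk factors and the PPH outcome
df = pd.read_csv("vaginal_delivery_cohort.csv")
predictors = ["uterine_surgery_history", "labor_induction", "first_stage_duration",
              "neonatal_weight", "wbc_first_stage", "cervical_laceration"]
X, y = df[predictors].values, df["pph"].values

# Fit the multivariable model and report the apparent discrimination
model = LogisticRegression(max_iter=1000).fit(X, y)
print("apparent AUC:", roc_auc_score(y, model.predict_proba(X)[:, 1]))

# Simple bootstrap internal validation: refit on resampled data, score on the original
aucs = []
for i in range(200):
    Xb, yb = resample(X, y, random_state=i)
    m = LogisticRegression(max_iter=1000).fit(Xb, yb)
    aucs.append(roc_auc_score(y, m.predict_proba(X)[:, 1]))
print("bootstrap AUC (mean):", np.mean(aucs))
```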

Keywords: vaginal delivery, postpartum hemorrhage, risk factor, nomogram

Procedia PDF Downloads 45
295 MIMO Radar-Based System for Structural Health Monitoring and Geophysical Applications

Authors: Davide D’Aria, Paolo Falcone, Luigi Maggi, Aldo Cero, Giovanni Amoroso

Abstract:

The paper presents a methodology for real-time structural health monitoring and geophysical applications. The key elements of the system are a high-performance MIMO RADAR sensor, an optical camera and a dedicated set of software algorithms encompassing interferometry, tomography and photogrammetry. The MIMO radar sensor proposed in this work provides an extremely high sensitivity to displacements, making the system able to react to tiny deformations (down to tens of microns) on a time scale which spans from milliseconds to hours. The MIMO feature makes the system capable of providing a set of two-dimensional images of the observed scene, each mapped on the azimuth-range directions with notable resolution in both dimensions and with an outstanding repetition rate. The back-scattered energy, which is distributed in 3D space, is projected on a 2D plane, where each pixel has as coordinates the line-of-sight distance and the cross-range azimuthal angle. At the same time, the high-performing processing unit allows the observed scene to be sensed with remarkable refresh periods (down to milliseconds), thus opening the way for combined static and dynamic structural health monitoring. Thanks to the smart TX/RX antenna array layout, the MIMO data can be processed through a tomographic approach to reconstruct the three-dimensional map of the observed scene. This 3D point cloud is then accurately mapped on a 2D digital optical image through photogrammetric techniques, allowing for easy and straightforward interpretation of the measurements. Once the three-dimensional image is reconstructed, a 'repeat-pass' interferometric approach is exploited to provide the user of the system with high-frequency three-dimensional motion/vibration estimation of each point of the reconstructed image. At this stage, the methodology leverages consolidated atmospheric correction algorithms to provide reliable displacement and vibration measurements.
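For reference, the repeat-pass interferometric step rests on the standard relation between the measured phase difference and the line-of-sight displacement (a textbook relation, not a formula quoted from the paper; the numeric example assumes a Ku-band wavelength of roughly 17.5 mm):

```latex
% Line-of-sight displacement recovered from the interferometric phase difference
\[
  \Delta r \;=\; \frac{\lambda}{4\pi}\,\Delta\varphi
\]
% e.g. with lambda = 17.5 mm, a phase change of 0.1 rad corresponds to
% Delta r of roughly 0.14 mm along the line of sight.
```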

Keywords: interferometry, MIMO RADAR, SAR, tomography

Procedia PDF Downloads 167
294 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis

Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya

Abstract:

In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduced a texture analysis approach, Laws' texture filters, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated. The number of patients for each tumor stage, i.e., I-II, III or IV, was 14. Approximately 45% of the patients had adenocarcinoma (ADC) and approximately 55% had squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed in the extraction of 51 features by using first order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). The selected textural features were used in automatic classification by k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with SVM (one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with one-versus-one SVM. Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and in the automatic classification of tumor stage and subtype.
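A minimal sketch of one of the texture pipelines named above, rendered in Python rather than MATLAB (GLCM features with a k-NN/SVM comparison using scikit-image and scikit-learn; the ROI and label files are assumed inputs, and FOS, GLRLM, Laws' filters and SFS are omitted for brevity):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def glcm_features(roi, levels=64):
    """GLCM-based texture descriptors for one tumour ROI (2-D array)."""
    roi = np.uint8(roi / roi.max() * (levels - 1))          # quantize to `levels` grey levels
    glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    props = ["contrast", "correlation", "energy", "homogeneity"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# rois: list/array of 2-D tumour regions from the FDG-PET slices; labels: stage or subtype
rois = np.load("pet_tumour_rois.npy", allow_pickle=True)    # hypothetical file
labels = np.load("labels.npy")                              # hypothetical file

X = np.array([glcm_features(r) for r in rois])

for name, clf in [("k-NN (k=3)", KNeighborsClassifier(n_neighbors=3)),
                  ("SVM (one-vs-one)", SVC(kernel="rbf", decision_function_shape="ovo"))]:
    acc = cross_val_score(clf, X, labels, cv=5).mean()
    print(f"{name}: cross-validated accuracy = {acc:.2f}")
```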

Keywords: cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis

Procedia PDF Downloads 301
293 Quantification of Hydrogen Sulfide and Methyl Mercaptan in Air Samples from a Waste Management Facility

Authors: R. F. Vieira, S. A. Figueiredo, O. M. Freitas, V. F. Domingues, C. Delerue-Matos

Abstract:

The presence of sulphur compounds like hydrogen sulphide and mercaptans is one of the reasons why waste-water treatment and waste management are associated with odour emissions. In this context, having a method for quantifying these compounds helps in the optimization of treatments aimed at their elimination, namely biofiltration processes. The aim of this study was the development of a method for the quantification of odorous gases in air samples from waste treatment plants. A method based on headspace solid phase microextraction (HS-SPME) coupled with gas chromatography with flame photometric detection (GC-FPD) was used to analyse H2S and methyl mercaptan (MM). The extraction was carried out with a 75-μm Carboxen-polydimethylsiloxane fiber coating at 22 ºC for 20 min, and the analysis was performed on a Shimadzu GC 2010 Plus A with a sulphur filter detector: splitless mode (0.3 min); the column temperature program started at 60 ºC and increased by 15 ºC/min to 100 ºC (2 min). The injector temperature was held at 250 ºC, and the detector at 260 ºC. For the calibration curve, a gas dilutor (digital Hovagas G2 Multi Component Gas Mixer) was used to prepare the standards. This unit had two input connections, one for a stream of the dilute gas and another for a stream of nitrogen, and an output connected to a glass bulb. A 40 ppm H2S cylinder and a 50 ppm MM cylinder were used. The equipment was programmed to the selected concentration, and it automatically carried out the dilution into the glass bulb. The mixture was left flowing through the glass bulb for 5 min, and then the extremities were closed. This method allowed calibration between 1-20 ppm for H2S and between 0.02-0.1 ppm and 1-3.5 ppm for MM. Several quantifications of air samples from the inlet and outlet of a biofilter operating in a waste management facility in the north of Portugal allowed the evaluation of the biofilter's performance.

Keywords: biofiltration, hydrogen sulphide, mercaptans, quantification

Procedia PDF Downloads 452
292 Ergonomic Adaptations in Visually Impaired Workers - A Literature Review

Authors: Kamila Troper, Pedro Mestre, Maria Lurdes Menano, Joana Mendonça, Maria João Costa, Sandra Demel

Abstract:

Introduction: Visual impairment is a problem that affects hundreds of thousands of people all over the world. Although it is possible for a visually impaired person to do most jobs, the right training, technological assistance, and emotional support are essential. Ergonomics can solve many of these problems through the relatively simple positioning, lighting and design of the workplace: a little forethought can make a tremendous difference to the ease with which a person with an impairment functions. Objectives: To review the main ergonomic adaptation measures reported in the literature in order to promote better working conditions and safety measures for the visually impaired. Methodology: This was an exploratory-descriptive, qualitative systematic literature review. The main databases used were PubMed, BIREME, and LILACS, with articles and studies published between 2000 and 2021. Results: Based on the principles of the theoretical references of ergonomic analysis of work, the main restructurings of the physical space of the workstations were: accessibility facilities and assistive technologies; a screen reader that captures information from a computer and sends it in real time to a speech synthesizer or Braille terminal; installation of software with voice recognition; monitors with enlarged screens; magnification software; adequate lighting; and magnifying lenses, in addition to recommendations regarding signage and clearance of the places through which the visually impaired pass. Conclusions: Employability rates for people with visual impairments (both those who are blind and those who have low vision) are low and continue to be a concern for the world and for researchers as a topic of international interest. Although numerous authors have identified barriers to employment and proposed strategies to remediate or circumvent those barriers, people with visual impairments continue to experience high rates of unemployment.

Keywords: ergonomic adaptations, visual impairments, ergonomic analysis of work, systematic review

Procedia PDF Downloads 158
291 Towards Developing a Rural South African Child into an Engineering Graduate with Conceptual and Critical Thinking Skills

Authors: Betty Kibirige

Abstract:

Students entering the University of Zululand (UNIZULU) Science Faculty mostly come with skills that allow them to prepare for exams and pass them in order to satisfy requirements for entry into a tertiary institution. Some students hail from deep rural schools with limited facilities, while others come from well-resourced schools. Personal experience has shown that a student may spend their whole time at a tertiary institution applying the same skills as those acquired in high school as a sure means of entering the next level in their development, namely a postgraduate program. While it is apparent that at this point in human history it is totally impossible to teach all the possible content in any one subject, many academics approach teaching and learning from the traditional point of view. It therefore became necessary to explore ways of developing a graduate able to approach life with skills that allow them to navigate knowledge by applying conceptual and critical thinking. Recently, the Science Faculty at the University of Zululand introduced two engineering programs. In an endeavour to develop engineering graduates at this institution who are able to tackle problem-solving amid the present-day excess of available information, it became necessary to study and review approaches used by various academics in order to settle on a possible best approach to the challenge at hand. This paper focuses on the development of a deep rural child into a graduate with conceptual and critical thinking skills as major attributes possessed upon graduation. For this purpose, various approaches were studied, and a combination of these approaches was repackaged to form an approach that may appear novel to UNIZULU and the rural child, especially for the engineering discipline. The approach was checked by offering quiz questions to students participating in an engineering module, observing test scores in the targeted module, and making comparative studies. Test results are discussed in the article. It was concluded that students' graduate attributes could indeed be tailored, subconsciously, to include conceptual and critical thinking skills, but through more than one approach, depending mainly on the student's high school background.

Keywords: graduate attributes, conceptual skills, critical thinking skills, traditional approach

Procedia PDF Downloads 214
290 Model-Based Fault Diagnosis in Carbon Fiber Reinforced Composites Using Particle Filtering

Authors: Hong Yu, Ion Matei

Abstract:

Carbon fiber reinforced composites (CFRP) used as aircraft structures are subject to lightning strikes, putting structural integrity at risk. Indirect damage may occur after a lightning strike, where the internal structure can be damaged due to excessive heat induced by the lightning current while the surface of the structure remains intact. Three damage modes may be observed after a lightning strike: fiber breakage, inter-ply delamination and intra-ply cracks. The assessment of internal damage states in composites is challenging due to the complicated microstructure, inherent uncertainties, and existence of multiple damage modes. In this work, a model-based approach is adopted to diagnose faults in carbon composites after lightning strikes. A resistor network model is implemented to relate the overall electrical and thermal conduction behavior under a simulated lightning current waveform to the intrinsic temperature-dependent material properties, microstructure and degradation of the materials. A fault detection and identification (FDI) module utilizes the physics-based model and a particle filtering algorithm to identify the damage mode as well as to calculate the probability of structural failure. Extensive simulation results are provided to substantiate the proposed fault diagnosis methodology with both single-fault and multiple-fault cases. The approach is also demonstrated on transient resistance data collected from an IM7/Epoxy laminate under a simulated lightning strike.
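A minimal bootstrap particle filter sketch illustrating the general estimation idea (a generic scalar damage-state example, not the authors' resistor-network model; the measurement function, noise levels and synthetic data are assumptions):

```python
import numpy as np

def bootstrap_particle_filter(measurements, n_particles=2000, process_std=0.02,
                              meas_std=0.05, rng=np.random.default_rng(0)):
    """Track a hidden damage-state variable from noisy transient resistance data.

    State x in [0, 1]: 0 = pristine, 1 = fully damaged (illustrative convention).
    Assumed measurement model: resistance grows linearly with damage.
    """
    def h(x):
        return 1.0 + 0.5 * x

    particles = rng.uniform(0.0, 1.0, n_particles)   # initial belief over the damage state
    estimates = []

    for z in measurements:
        # Propagate: damage can only accumulate (monotone random walk)
        particles = np.clip(particles + np.abs(rng.normal(0, process_std, n_particles)), 0, 1)

        # Weight particles by the likelihood of the new measurement
        w = np.exp(-0.5 * ((z - h(particles)) / meas_std) ** 2)
        w /= w.sum()

        # Systematic resampling keeps the particle set focused on likely states
        positions = (rng.random() + np.arange(n_particles)) / n_particles
        idx = np.minimum(np.searchsorted(np.cumsum(w), positions), n_particles - 1)
        particles = particles[idx]

        estimates.append(particles.mean())

    return np.array(estimates)

# Synthetic resistance trace: damage ramps up half-way through the record
true_x = np.concatenate([np.zeros(50), np.linspace(0, 0.6, 50)])
z = 1.0 + 0.5 * true_x + np.random.default_rng(1).normal(0, 0.05, true_x.size)
print("estimated damage at end of record:", bootstrap_particle_filter(z)[-1])
```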

Keywords: carbon composite, fault detection, fault identification, particle filter

Procedia PDF Downloads 174
289 Risk-Sharing Financing of Islamic Banks: Better Shielded against Interest Rate Risk

Authors: Mirzet SeHo, Alaa Alaabed, Mansur Masih

Abstract:

In theory, risk-sharing-based financing (RSF) is considered a cornerstone of Islamic finance. It is argued to render Islamic banks more resilient to shocks. In practice, however, this feature of Islamic financial products is almost negligible. Instead, debt-based instruments with conventional-like features have overwhelmed the nascent industry. In addition, the framework of present-day economic, regulatory and financial reality inevitably exposes Islamic banks in dual banking systems to the problems of conventional banks. This includes, but is not limited to, interest rate risk. Empirical evidence has, thus far, confirmed such exposures, despite Islamic banks' interest-free operations. This study applies system GMM in modeling the determinants of RSF and finds that RSF is insensitive to changes in interest rates. Hence, our results provide support for the "stability" view of risk-sharing-based financing. This suggests RSF as the way forward for risk management at Islamic banks, in the absence of widely acceptable Shariah compliant hedging instruments. Further support for the stability view is given by evidence of counter-cyclicality. Unlike debt-based lending, which inflates artificial asset bubbles through credit expansion during the upswing of business cycles, RSF is negatively related to GDP growth. Our results also imply a significantly strong relationship between risk-sharing deposits and RSF. However, the pass-through of these deposits to RSF is economically low: only about 40% of risk-sharing deposits are channeled to risk-sharing financing. This raises questions about the validity of the industry's claim that depositors accustomed to conventional banking shun risk sharing, and signals potential for better balance sheet management at Islamic banks. Overall, our findings suggest that, on the one hand, Islamic banks can gain 'independence' from conventional banks and interest rates through risk-sharing products, the potential for which is enormous. On the other hand, RSF could enable policy makers to improve systemic stability and restrain excessive credit expansion through its countercyclical features.

Keywords: Islamic banks, risk-sharing, financing, interest rate, dynamic system GMM

Procedia PDF Downloads 294
288 Meditation and Insight Interpretation Using Quantum Circle Based-on Experiment and Quantum Relativity Formalism

Authors: Somnath Bhattachryya, Montree Bunruangses, Somchat Sonasang, Preecha Yupapin

Abstract:

In this study and research on meditation and insight, a design and experiment with electronic circuits to manipulate the meditators' mental circles, called the chakras, so that they have the same size is proposed. The circuit has a 4-port shape, called an add-drop multiplexer, that reflects the meditation structure called the four foundations of mindfulness; an AC power signal is then used as an input instead of the meditation time function, and the various behaviors are studied with the method of re-filtering the signal (successive filtering), like the eight noble paths. The process starts by inputting a signal at a frequency that causes the velocity of the wave on the perimeter of the circuit to give the particles the speed of light in a vacuum. The signal changes between electromagnetic waves and matter waves according to the velocity (frequency) until it reaches the relativistic limit. The electromagnetic waves are transformed into photons with wave-particle properties, overcoming the limit of the speed of light. As for the matter wave, it travels to the other side and cannot pass through the relativistic limit; it is called a shadow signal (echo) that can gain power from increasing speed but cannot create speed faster than light, or insight. In the experiment, only the side where the velocity is positive, i.e., where the speed is above that of light or at the corresponding frequency, indicates intelligence. The other side (echo) can be obtained by changing the input signal to the other side of the circuit to get the same result, but there is no intelligence or speed beyond light. The circuit is also used to study the stretching and contraction of time and wormholes, which can be applied to teleporting, Bose-Einstein condensates and teleprinting, and the quantum telephone. Teleporting can happen throughout the system with the wave-particle and the echo: when the speed of the particle is faster than the stretching or contraction of time, the particle submerges in the wormhole and, once the destination and time are determined, travels through the wormhole. In a wormhole, the time can be set to the future or the past. The experimental results using the microstrip circuit have been found to be in accordance with the principle of quantum relativity, which can be further developed into both tools and practices for meditation practitioners in quantum technology.

Keywords: quantum meditation, insight picture, quantum circuit, absolute time, teleportation

Procedia PDF Downloads 38
287 Quantum Statistical Machine Learning and Quantum Time Series

Authors: Omar Alzeley, Sergey Utev

Abstract:

Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of optimization, and this optimization is central to learning theory. One approach to complex systems, where the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable, is time series analysis. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series model (QTS). Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging; the quantum probabilistic technique is used to motivate the construction of our QTS model. The QTS model resembles the quantum dynamic model which was applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyse the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo have been used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of the application of QTS in the field of quantum chaos was to find a model that explains chaotic behaviour. This model may reveal further insight into quantum chaos.
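A minimal sketch of the Kalman filter step mentioned above, applied to a scalar AR(1) state (a generic textbook filter, not the paper's QTS estimator; the coefficients and noise variances are illustrative):

```python
import numpy as np

def kalman_ar1(observations, phi=0.8, q=0.1, r=0.5):
    """Scalar Kalman filter for the AR(1) state x_t = phi * x_{t-1} + w_t,
    observed as y_t = x_t + v_t, with Var(w) = q and Var(v) = r."""
    x, p = 0.0, 1.0                  # initial state estimate and variance
    filtered = []
    for y in observations:
        # Predict
        x_pred = phi * x
        p_pred = phi ** 2 * p + q
        # Update
        k = p_pred / (p_pred + r)    # Kalman gain
        x = x_pred + k * (y - x_pred)
        p = (1 - k) * p_pred
        filtered.append(x)
    return np.array(filtered)

# Simulate an AR(1) series and recover the latent state from noisy observations
rng = np.random.default_rng(0)
x_true = np.zeros(200)
for t in range(1, 200):
    x_true[t] = 0.8 * x_true[t - 1] + rng.normal(0, np.sqrt(0.1))
y = x_true + rng.normal(0, np.sqrt(0.5), 200)
x_hat = kalman_ar1(y)
print("filtering RMSE:", np.sqrt(np.mean((x_hat - x_true) ** 2)))
```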

Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series

Procedia PDF Downloads 437
286 Language Choice and Language Maintenance of Northeastern Thai Staff in Suan Sunandha Rajabhat University

Authors: Napasri Suwanajote

Abstract:

The purposes of this research were to analyze and evaluate the success factors in the OTOP production process for developing a learning center on the OTOP production process based on the Sufficiency Economic Philosophy for sustainable life quality. The research was designed as a qualitative study gathering information from 30 OTOP producers in Bangkontee District, Samudsongkram Province. They were all interviewed on 3 main parts. Part 1 covered the production process, including 1) production, 2) product development, 3) community strength, 4) marketing possibility, and 5) product quality. Part 2 evaluated the appropriate success factors, including 1) analysis of the success factors, 2) evaluation of the strategy based on the Sufficiency Economic Philosophy, and 3) the model of a learning center on the OTOP production process based on the Sufficiency Economic Philosophy for sustainable life quality. The results showed that the production did not affect the environment and had the potential to maintain standard production quality, using raw materials sourced within the country. Regarding product and community strength over the past year, it was found that there was no appropriate packaging showing product identity according to global market standards; the producers needed training on packaging, especially for food and drink products. Regarding product quality and product specification, the products were certified to the local OTOP standard, and a responsible organization should help the uncertified producers pass the standard. However, there was a problem of food contamination, which was hazardous to consumers; the producers should cooperate with the government sector or educational institutes involved in food processing to reach the FDA standard. The results from small-group discussions showed that the community expected higher education and a better standard of living. Problems reported by the community included informal debt and drugs. Eight steps were identified for developing the model of the learning center on the OTOP production process based on the Sufficiency Economic Philosophy for sustainable life quality.

Keywords: production process, OTOP, sufficiency economic philosophy, language choice

Procedia PDF Downloads 209
285 Ecological-Economics Evaluation of Water Treatment Systems

Authors: Hwasuk Jung, Seoi Lee, Dongchoon Ryou, Pyungjong Yoo, Seokmo Lee

Abstract:

The Nakdong River, which is used as the drinking water source for Busan metropolitan city, is vulnerable in terms of water management because industrial areas are located along the upper Nakdong River. Most citizens of Busan think that the water quality of the Nakdong River is poor, so they boil tap water or use home filters before drinking it, which imposes unnecessary individual costs on Busan citizens. Water intake therefore needs to be diversified to reduce this cost and to move away from the vulnerable water source. Against this background, this study carried out environmental accounting of the Namgang Dam water treatment system compared with the Nakdong River water treatment system, using the emergy analysis method, to support reasonable decision making. The emergy analysis method quantitatively evaluates both the natural environment and human economic activities in a common unit of measure. The emergy transformity of the Namgang Dam's water was 1.16 times larger than that of the Nakdong River's water; the Namgang Dam's water shows the larger emergy transformity because of its good water quality. The emergy used in producing 1 m³ of tap water in the Namgang Dam water treatment system was 1.26 times larger than that of the Nakdong River water treatment system; the Namgang Dam system shows the larger emergy input because of the construction cost of a new pipeline for withdrawing Namgang Dam water. If the Won used in producing 1 m³ of tap water by the Nakdong River water treatment system is taken as 1, the Namgang Dam water treatment system used 1.66; if the Em-won used in producing 1 m³ of tap water by the Nakdong River water treatment system is taken as 1, the Namgang Dam water treatment system used 1.26. The cost-benefit ratio in Em-won was thus smaller than that in Won. When emergy analysis is used, which accounts for the benefits of the natural environment such as the good water quality of the Namgang Dam, the Namgang Dam water treatment system could be a good alternative for diversifying the intake source.
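As a hedged, purely illustrative sketch (the actual emergy inventories are in the paper), the normalization behind the reported ratios can be shown as follows; the Nakdong River baseline values are hypothetical placeholders, and only the relative factors 1.26 (Em-won) and 1.66 (Won) come from the abstract.

```python
# Hedged sketch: comparing the two systems using the ratios reported in the
# abstract. The Nakdong River baselines are hypothetical placeholders set to 1;
# only the relative factors (1.26 for Em-won, 1.66 for Won) come from the text.
baseline_emwon_per_m3 = 1.0   # Em-won per m3, Nakdong River system (normalized)
baseline_won_per_m3 = 1.0     # Won per m3, Nakdong River system (normalized)

namgang_emwon_per_m3 = 1.26 * baseline_emwon_per_m3
namgang_won_per_m3 = 1.66 * baseline_won_per_m3

# The emergy-based cost ratio grows more slowly than the monetary one, which is
# the abstract's argument for Namgang Dam once environmental benefits are counted.
print(f"Em-won ratio (Namgang / Nakdong): {namgang_emwon_per_m3 / baseline_emwon_per_m3:.2f}")
print(f"Won ratio    (Namgang / Nakdong): {namgang_won_per_m3 / baseline_won_per_m3:.2f}")
```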

Keywords: emergy, emergy transformity, Em-won, water treatment system

Procedia PDF Downloads 276
284 Analysis of Waiting Time and Drivers Fatigue at Manual Toll Plaza and Suggestion of an Automated Toll Tax Collection System

Authors: Muhammad Dawood Idrees, Maria Hafeez, Arsalan Ansari

Abstract:

Toll tax collection is one of the earliest methods of tax collection and revenue generation. This revenue is utilized for developing road networks, maintenance, and connecting roads and highways across the country. Pakistan is a large country covering a wide area of land, and road networks and motorways are an important means of connecting its cities. Every day, millions of people use the motorways and have to stop at toll plazas to pay toll tax, as the majority of toll plazas collect the toll manually. The purpose of this study is to calculate the waiting time of vehicles on the Karachi to Hyderabad (M-9) motorway; Karachi is the biggest city of Pakistan, and hundreds of thousands of people use this route to reach other cities. Currently, toll tax collection is a manual system, which is a major cause of long waiting times at the toll plaza. This study calculates the waiting time of vehicles, the fuel consumed while waiting, and the manpower employed at the toll plaza, since the whole process is manual and also leads to mental and physical fatigue of the drivers. All wastages of resources are also calculated, and a feasible automatic toll tax collection system is proposed that not only reduces waiting time but also reduces fuel consumption, the manpower employed, and physical and mental fatigue. A cost comparison in terms of wastages is also shown between the manual and automatic (E-Z Pass) toll tax collection systems. The results of this study reveal that, if an automatic toll collection system is implemented on the Karachi to Hyderabad motorway (M-9), there will be a significant reduction in the waiting time of vehicles, which leads to reductions in fuel consumption, environmental pollution, and the mental and physical fatigue of drivers. All these reductions are also calculated in terms of money (Pakistani rupees), and it is found that millions of rupees can be saved by using the automatic toll collection system, which will help improve the economy of the country.
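As a hedged sketch only, and not necessarily the method used in the study, waiting time and idle-fuel cost at a single manual toll booth can be roughed out with an M/M/1 queue; the arrival rate, service time, fuel burn, fuel price, and daily traffic below are assumed placeholder values.

```python
# Hedged sketch: an M/M/1 queue estimate of waiting time and idle-fuel cost at
# one manual toll booth. This is a generic queueing illustration, not the
# study's data or method; all parameter values are assumed placeholders.
arrival_rate = 400 / 3600        # vehicles per second (assumed 400 veh/h per lane)
service_rate = 1 / 8.0           # vehicles per second (assumed 8 s per manual payment)
idle_fuel_lph = 0.8              # litres burned per hour while idling (assumed)
fuel_price_pkr = 280.0           # PKR per litre (assumed)
vehicles_per_day = 40_000        # assumed daily traffic on the corridor

rho = arrival_rate / service_rate                      # utilization, must stay below 1
wait_in_queue_s = rho / (service_rate - arrival_rate)  # mean wait in queue, Wq
time_at_plaza_s = 1.0 / (service_rate - arrival_rate)  # mean time in system, W

fuel_per_vehicle_l = (time_at_plaza_s / 3600.0) * idle_fuel_lph
daily_fuel_cost_pkr = fuel_per_vehicle_l * fuel_price_pkr * vehicles_per_day

print(f"utilization: {rho:.2f}")
print(f"mean wait in queue: {wait_in_queue_s:.1f} s, time at plaza: {time_at_plaza_s:.1f} s")
print(f"estimated idle-fuel cost: {daily_fuel_cost_pkr:,.0f} PKR/day")
```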

Keywords: toll tax collection, waiting time, wastages, driver fatigue

Procedia PDF Downloads 125
283 A Novel Hybrid Deep Learning Architecture for Predicting Acute Kidney Injury Using Patient Record Data and Ultrasound Kidney Images

Authors: Sophia Shi

Abstract:

Acute kidney injury (AKI) is the sudden onset of kidney damage in which the kidneys cannot filter waste from the blood, requiring emergency hospitalization. The AKI mortality rate in the ICU is high, and onset is virtually impossible for doctors to predict because it is so unexpected. Currently, there is no hybrid model for predicting AKI that takes advantage of two types of data. De-identified patient data from the MIMIC-III database and de-identified kidney images with corresponding patient records from the Beijing Hospital of the Ministry of Health were collected. Using data features including serum creatinine, among others, two numeric models were built with the MIMIC and Beijing Hospital data, and an image-only model was built with the hospital ultrasounds. Convolutional neural networks (CNNs) were used, VGG and ResNet for the numeric data and ResNet for the image data, and they were combined into a hybrid model by concatenating the feature maps of both types of models to create a new input. This input passes through another CNN block and then two fully connected layers, ending in a binary output after a softmax layer and additional post-processing. The hybrid model successfully predicted AKI; its highest AUROC was 0.953, with an accuracy of 90% and an F1-score of 0.91. This model can be deployed in urgent clinical settings such as the ICU to aid doctors by assessing the risk of AKI shortly after a patient's admission, so that preventative measures can be taken to diminish mortality risk and severe kidney damage.
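As a hedged sketch of the feature-concatenation idea described above, and not the authors' exact architecture, a hybrid model can be outlined in PyTorch as follows; the layer sizes, the numeric-branch design, and the 32-feature record input are assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models

# Hedged sketch of a feature-concatenation hybrid model, loosely following the
# abstract (image branch + record branch -> concatenated feature maps -> CNN
# block -> two fully connected layers -> softmax). Layer sizes and the record
# branch are assumptions, not the authors' exact model.
class HybridAKIModel(nn.Module):
    def __init__(self, num_record_features: int = 32):
        super().__init__()
        # Image branch: ResNet-18 backbone without its pooling/classification head
        resnet = models.resnet18(weights=None)
        self.image_branch = nn.Sequential(*list(resnet.children())[:-2])  # (B, 512, 7, 7)

        # Record branch: embed patient-record features, later tiled into a map
        self.record_branch = nn.Sequential(
            nn.Linear(num_record_features, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
        )

        # CNN block applied to the concatenated feature maps
        self.fusion_cnn = nn.Sequential(
            nn.Conv2d(512 + 64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Two fully connected layers ending in a binary (AKI / no AKI) output
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 2),
        )

    def forward(self, image: torch.Tensor, record: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_branch(image)                         # (B, 512, 7, 7)
        rec_feat = self.record_branch(record)                       # (B, 64)
        rec_map = rec_feat[:, :, None, None].expand(-1, -1, *img_feat.shape[2:])
        fused = torch.cat([img_feat, rec_map], dim=1)               # concatenate feature maps
        logits = self.classifier(self.fusion_cnn(fused))
        return torch.softmax(logits, dim=1)                         # P(no AKI), P(AKI)

# Usage example: one 224x224 three-channel ultrasound and a 32-feature record
model = HybridAKIModel()
probs = model(torch.randn(1, 3, 224, 224), torch.randn(1, 32))
print(probs.shape)  # torch.Size([1, 2])
```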

Keywords: acute kidney injury, convolutional neural network, hybrid deep learning, patient record data, ResNet, ultrasound kidney images, VGG

Procedia PDF Downloads 107