Search results for: improved sparrow search algorithm
7325 Algorithms Minimizing Total Tardiness
Authors: Harun Aydilek, Asiye Aydilek, Ali Allahverdi
Abstract:
The total tardiness is a widely used performance measure in the scheduling literature. It is particularly important in situations where completing a job beyond its due date incurs a cost, and the cost increases as the gap between a job's due date and its completion time grows. Such costs may include penalty clauses in contracts and loss of goodwill. The measure also matters because customers' due dates have to be taken into account while making scheduling decisions. The problem has been addressed in the literature, however, under the assumption of zero setup times. Even though this assumption may be valid for some environments, it is not valid for others. When setup times are treated as separate from processing times, it is possible to increase machine utilization and to reduce total tardiness; non-zero setup times therefore need to be considered separately. A dominance relation is developed, and several algorithms that utilize it are proposed. Extensive computational experiments are conducted to evaluate the algorithms. The experiments indicate that the developed algorithms perform much better than the existing algorithms in the literature; more specifically, one of the newly proposed algorithms reduces the error of the best existing algorithm in the literature by 40 percent.
Keywords: algorithm, assembly flowshop, dominance relation, total tardiness
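The objective itself is cheap to evaluate for any fixed job sequence; a minimal sketch (the function name and job data are illustrative, not from the paper — the proposed dominance relation and algorithms decide the ordering, while the measure is just this accumulation):

```python
def total_tardiness(jobs):
    """Total tardiness of a fixed job sequence.

    jobs: list of (processing_time, due_date) tuples in sequence order.
    """
    completion, total = 0, 0
    for processing_time, due_date in jobs:
        completion += processing_time            # completion time of this job
        total += max(0, completion - due_date)   # zero if the job is on time
    return total

# Three jobs: only the second is late, finishing at t=5 against a due date of 4.
print(total_tardiness([(3, 4), (2, 4), (4, 12)]))  # → 1
```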
Procedia PDF Downloads 354
7324 Alleviation of Adverse Effects of Salt Stress on Soybean (Glycine max. L.) by Using Osmoprotectants and Compost Application
Authors: Ayman El Sabagh, SobhySorour, AbdElhamid Omar, Adel Ragab, Mohammad Sohidul Islam, Celaleddin Barutçular, Akihiro Ueda, Hirofumi Saneoka
Abstract:
Salinity is one of the major factors limiting crop production in arid environments. What adds to the concern is that all legume crops are sensitive to increasing soil salinity, so ways of enhancing the salinity tolerance of legume plants must be sought. Exogenous application of osmoprotectants has been found effective in reducing the adverse effects of salinity stress on plant growth. Despite its global importance, soybean production suffers from salinity stress, which damages plant development. Therefore, in the current study, we try to clarify the mechanisms that might be involved in the ameliorating effects of osmoprotectants, such as proline and glycine betaine, and of compost application on soybean plants grown under salinity stress. Experiments were carried out in the greenhouse of the experimental station of plant nutritional physiology, Hiroshima University, Japan, in 2011-2012. The experiment was arranged in a factorial design with 4 replications, with NaCl concentrations of 0 and 15 mM, exogenous proline and glycine betaine concentrations of 0 mM and 25 mM each, and compost treatments of 0 and 24 t ha-1. Results indicated that salinity stress reduced all growth and physiological parameters (dry weight plant-1, chlorophyll content, N and K+ content), as well as the seed and quality traits of the soybean plant, compared with unstressed plants. In contrast, salinity stress increased the electrolyte leakage ratio and the Na+ and proline contents. Tolerance against salt stress was nevertheless observed: the improvement of salt tolerance resulting from proline, glycine betaine, and compost was accompanied by improved membrane stability and K+ and proline accumulation and, conversely, by decreased Na+ content. These results clearly demonstrate that these treatments could be used to reduce the harmful effect of salinity on both the physiological aspects and the growth parameters of soybean.
They are capable of restoring the yield potential and seed quality and may be useful in agronomic situations where saline conditions are diagnosed as a problem. Consequently, exogenous osmoprotectants combined with compost can effectively address seasonal salinity stress and are a good strategy for increasing salinity resistance in the drylands.
Keywords: compost, glycine betaine, proline, salinity tolerance, soybean
Procedia PDF Downloads 373
7323 Discovery of Exoplanets in Kepler Data Using a Graphics Processing Unit Fast Folding Method and a Deep Learning Model
Authors: Kevin Wang, Jian Ge, Yinan Zhao, Kevin Willis
Abstract:
Kepler has discovered over 4000 exoplanets and candidates. However, current transit planet detection techniques based on wavelet analysis and the Box Least Squares (BLS) algorithm have limited sensitivity in detecting minor planets with a low signal-to-noise ratio (SNR) and long periods with only 3-4 repeated signals over the mission lifetime of 4 years. This paper presents a novel precise-period transit signal detection methodology based on a new Graphics Processing Unit (GPU) Fast Folding algorithm in conjunction with a Convolutional Neural Network (CNN) to detect low-SNR and/or long-period transit planet signals. A comparison with BLS is conducted on both simulated light curves and real data, demonstrating that the new method has higher speed, sensitivity, and reliability. For instance, the new system can detect transits with SNR as low as three, while the performance of BLS drops off quickly around an SNR of 7. Meanwhile, the GPU Fast Folding method folds light curves 25 times faster than BLS, a significant gain that allows exoplanet detection to occur at unprecedented period precision. The new method has been tested on all known transit signals with 100% confirmation. In addition, it has been successfully applied to the Kepler Objects of Interest (KOI) data and has identified a few new Earth-sized ultra-short-period (USP) exoplanet candidates and habitable planet candidates. The results highlight the promise of GPU Fast Folding as a replacement for the traditional BLS algorithm for finding small and/or long-period habitable and Earth-sized planet candidates in transit data taken with Kepler and other space transit missions such as TESS (Transiting Exoplanet Survey Satellite) and PLATO (PLAnetary Transits and Oscillations of stars).
Keywords: algorithms, astronomy data analysis, deep learning, exoplanet detection methods, small planets, habitable planets, transit photometry
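The phase-folding operation at the heart of both BLS and the Fast Folding approach can be sketched generically; this is not the authors' GPU implementation, and the function name, bin count, and synthetic light curve below are illustrative:

```python
import numpy as np

def fold_light_curve(times, flux, period, nbins=50):
    """Fold a light curve at a trial period and average the flux per phase bin."""
    phase = (times % period) / period
    bins = np.minimum((phase * nbins).astype(int), nbins - 1)
    total = np.zeros(nbins)
    count = np.zeros(nbins)
    np.add.at(total, bins, flux)   # accumulate flux into each phase bin
    np.add.at(count, bins, 1)
    return total / np.maximum(count, 1)

# Synthetic curve: a 1% dip at the start of every 10-day period.
t = np.arange(0.0, 100.0, 0.1)
f = np.where(t % 10 < 0.5, 0.99, 1.0)
profile = fold_light_curve(t, f, period=10.0)
print(profile.min())  # the transit collapses into the earliest phase bins
```

Folding at the true period stacks all transits into the same bins, so even a shallow dip stands out against the flat out-of-transit profile; a wrong trial period smears the dip across all bins.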
Procedia PDF Downloads 225
7322 GPU Accelerated Fractal Image Compression for Medical Imaging in Parallel Computing Platform
Authors: Md. Enamul Haque, Abdullah Al Kaisan, Mahmudur R. Saniat, Aminur Rahman
Abstract:
In this paper, we have implemented both sequential and parallel versions of fractal image compression algorithms using the CUDA (Compute Unified Device Architecture) programming model for parallelizing the program on a Graphics Processing Unit, applied to medical images, which are highly self-similar. Several improvements have been made in the implementation of the algorithm as well. Fractal image compression is based on the self-similarity of an image, meaning that the image has similarity across the majority of its regions. We take this opportunity to implement the compression algorithm and to monitor its behavior in both the parallel and the sequential implementation. Fractal compression has the properties of a high compression rate and a dimensionless scheme. The compression scheme for a fractal image has two parts: encoding and decoding. Encoding is computationally very expensive, whereas decoding is much cheaper. Applying fractal compression to medical images would allow much higher compression ratios to be obtained, while fractal magnification, an inseparable feature of fractal compression, would be very useful in presenting the reconstructed image in a highly readable form. However, like all irreversible methods, fractal compression is connected with the problem of information loss, which is especially troublesome in medical imaging. A very time-consuming encoding process, which can last even several hours, is another bothersome drawback of fractal compression.
Keywords: accelerated GPU, CUDA, parallel computing, fractal image compression
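The expensive encoding step (searching, for every range block, the best contracted domain block) can be illustrated with a toy one-dimensional analogue; this is a didactic sketch, not the CUDA implementation, and the function name and block sizes are illustrative:

```python
import numpy as np

def fractal_encode_1d(signal, rsize=4):
    """Toy 1D fractal encoder: map each range block to its best domain block.

    Each domain block (twice the range size) is contracted 2:1; a
    least-squares scale/offset fit then picks the best match per range block.
    """
    dsize = 2 * rsize
    domains = []
    for d in range(0, len(signal) - dsize + 1, rsize):
        contracted = signal[d:d + dsize].reshape(-1, 2).mean(axis=1)
        domains.append((d, contracted))
    codes = []
    for r in range(0, len(signal), rsize):
        rng = signal[r:r + rsize]
        best = None
        for d, dom in domains:
            scale, offset = np.polyfit(dom, rng, 1)   # fit rng ≈ scale*dom + offset
            err = float(np.sum((scale * dom + offset - rng) ** 2))
            if best is None or err < best[0]:
                best = (err, d, scale, offset)
        codes.append(best[1:])   # store (domain index, scale, offset) only
    return codes

codes = fractal_encode_1d(np.arange(16.0))
print(len(codes))  # one code per range block
```

The exhaustive range-against-domain search is exactly the part that parallelizes well: each range block's search is independent, which is why a GPU implementation pays off.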
Procedia PDF Downloads 336
7321 New Hardy Type Inequalities of Two-Dimensional on Time Scales via Steklov Operator
Authors: Wedad Albalawi
Abstract:
Mathematical inequalities have been at the core of mathematical study and are used in almost all branches of mathematics, as well as in various areas of science and engineering. The monograph by Hardy, Littlewood, and Pólya was the first significant work devoted to the subject; it presents fundamental ideas, results, and techniques, and it has had much influence on research in various branches of analysis. Since 1934, various inequalities have been produced and studied in the literature. Furthermore, some inequalities have been formulated via operators: in 1989, weighted Hardy inequalities were obtained for integration operators. Weighted estimates were then obtained for Steklov operators, which were used in the solution of the Cauchy problem for the wave equation. They were improved upon in 2011 to include the boundedness of integral operators from the weighted Sobolev space to the weighted Lebesgue space. Some inequalities have been demonstrated and improved using the Hardy-Steklov operator. Recently, many integral inequalities have been improved via differential operators. The Hardy inequality has been one of the tools used to study the integrability of solutions of differential equations. Dynamic inequalities of Hardy and Copson type have since been extended and improved by various integral operators. These inequalities would be interesting to apply in different fields of mathematics (function spaces, partial differential equations, mathematical modeling). Some inequalities involving Copson and Hardy inequalities have appeared on time scales, yielding new special versions of them. A time scale is an arbitrary nonempty closed subset of the real numbers. Dynamic inequalities on time scales have received a lot of attention in the literature and have become a major field in pure and applied mathematics.
There are many applications of dynamic equations on time scales in quantum mechanics, electrical engineering, neural networks, heat transfer, combinatorics, and population dynamics. This study focuses on Hardy and Copson inequalities, using the Steklov operator on a time scale in double integrals to obtain special cases of time-scale inequalities of Hardy and Copson in higher dimensions. The advantage of this study is that it uses the one-dimensional classical Hardy inequality to obtain higher-dimensional time-scale versions that will be applied in the solution of the Cauchy problem for the wave equation. In addition, the obtained inequalities have various applications involving discontinuous domains, such as bug populations, phytoremediation of metals, wound healing, and maximization problems. The proofs are carried out by introducing restrictions on the operator in several cases. They use the concepts of time-scale calculus, which allows many problems from the theories of differential and of difference equations to be unified and extended, together with the chain rule, some properties of multiple integrals on time scales, Fubini's theorem, and Hölder's inequality.
Keywords: time scales, Hardy inequality, Copson inequality, Steklov operator
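For reference, the one-dimensional classical Hardy inequality on which the higher-dimensional time-scale versions are built reads, in its standard form for $p > 1$ and $f \ge 0$:

```latex
\int_0^{\infty} \left( \frac{1}{x} \int_0^{x} f(t)\,dt \right)^{p} dx
\;\le\;
\left( \frac{p}{p-1} \right)^{p} \int_0^{\infty} f(x)^{p}\,dx,
```

where the constant $\left(\frac{p}{p-1}\right)^{p}$ is sharp; the time-scale versions replace the integrals with delta integrals over an arbitrary nonempty closed subset of the reals.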
Procedia PDF Downloads 95
7320 An Exploration of The Patterns of Transcendence in Indian and Hopkins’s Aesthetics
Authors: Lima Antony
Abstract:
In G. M. Hopkins’s poetics and aesthetics there is scope for a comparative study with Indian discourses on aesthetics, an area not adequately explored so far. This exploration will enrich the field of comparative study of diverse cultural expressions and their areas of similarity. A comparative study of aesthetic and religious experiences in diverse cultures will open up avenues for the discovery of similarities in self-experiences and their transcendence, and such explorations will reveal similar patterns in aesthetic and religious experiences. The present paper intends to prove this for the theories of Hopkins and Indian aesthetics. From the time of the Vedas, Indian sages have believed that aesthetic enjoyment could develop into a spiritual realm. From the Natyasastra of Bharata, Indian aesthetics develops and, in later centuries, reaches its culmination in a consciousness of union with the mystery of the Ultimate Being, especially in the Dhvanyāloka of Anandavardhana and the Locana of Abhinavagupta. The Dhvanyāloka elaborates the original ideas of rasa (mood or flavor) and dhvani (power of suggestion) in Indian literary theory and aesthetics. Hopkins was successful, like the ancient Indian alankarikas, in creating aesthetically superb patterns at various levels of sound and sense, for which he coined the term ‘inscape’. Hopkins’s aesthetic theory is therefore suitable for transcultural comparative study with Indian aesthetics, especially the dhvani theories of Anandavardhana and Abhinavagupta. His innovative approach to poetics and his selection of themes are quite suitable for analysis in the light of Indian literary theories. Indian philosophy views the ultimate reality, called Brahman, as the 'soul,' or inner essence, of all reality. We see in Hopkins also a search for the essence of things and the chiming of their individuality with the Ultimate Being in multidimensional patterns of sound, sense, and ecstatic experience.
This search culminates in the realization of a synthesis of the individual self with the Ultimate Being, achieved through an act of surrender of the individuality of the self before the Supreme Being. Attempts to reconcile the immanent and transcendent aspects of the Ultimate Being can be traced in Indian aesthetics as well as in Hopkins’s, which can contribute to greater understanding and harmony between cultures.
Keywords: dhvani, Indian aesthetics, transcultural studies, rasa
Procedia PDF Downloads 148
7319 Evaluating the Impact of a Child Sponsorship Program on Paediatric Health and Development in Calauan, Philippines: A Retrospective Audit
Authors: Daniel Faraj, Arabella Raupach, Charlotte Hespe, Helen Wilcox, Kristie-Lee Anning
Abstract:
Aim: International child sponsorship programs comprise a considerable proportion of global aid accessible to the general population. Team Philippines (TP), a healthcare and welfare initiative run in association with the University of Notre Dame Sydney since 2013, leads a holistic sponsorship program for thirty children from Calauan, Philippines. To date, empirical research has not been performed on the overall success and impact of the TP child sponsorship program. As such, this study aims to evaluate its effectiveness in improving pediatric outcomes. Methods: Study cohorts comprised thirty sponsored and twenty-nine age- and gender-matched non-sponsored children. Data were extracted from the TP Medical Director database and lifestyle questionnaires for July-November 2019. Outcome measures included anthropometry, markers of medical health, dental health, exercise, and diet. Statistical analyses were performed in SPSS. Results: Sponsorship resulted in fewer medical diagnoses and prescription medications, superior dental health, and improved diet. Further, sponsored children may show a clinically significant trend toward improved physical health. Sponsorship did not affect growth and development metrics or levels of physical activity. Conclusions: The TP child sponsorship program significantly impacts positive pediatric health outcomes in the Calauan community. The strength of the program lies in its holistic, sustainable, and community-based model, which is enabled by effective international child sponsorship. This study further supports the relationship between supporting early livelihood and improved health in the pediatric population.
Keywords: child health, public health, health status disparities, healthcare disparities, social determinants of health, morbidity, community health services, culturally competent care, medically underserved areas, population health management, Philippines
Procedia PDF Downloads 110
7318 Design of Low Latency Multiport Network Router on Chip
Authors: P. G. Kaviya, B. Muthupandian, R. Ganesan
Abstract:
On-chip routers typically have buffers at the input or output ports for temporarily storing packets. These buffers consume part of the router's area and power, as do the multiple parallel queues of a virtual-channel (VC) router. While running a traffic trace, not all input ports have incoming packets that need to be transferred; consequently, large numbers of queues are empty while others are busy, and time consumption is high under heavy traffic. A RoShaQ architecture is therefore used to minimize the buffer area and time. In the RoShaQ architecture, input packets travel through shared queues at low traffic load; at high traffic load, input packets bypass the shared queues, so power and area consumption are reduced. A parallel crossbar architecture is proposed in this project in order to reduce the power consumption. A new adaptive weighted routing algorithm for an 8-port router architecture is also proposed in order to decrease the delay of the network-on-chip router. The proposed system is simulated using ModelSim and synthesized using Xilinx Project Navigator.
Keywords: buffer, RoShaQ architecture, shared queue, VC router, weighted routing algorithm
Procedia PDF Downloads 542
7317 Locomotion Effects of Redundant Degrees of Freedom in Multi-Legged Quadruped Robots
Authors: Hossein Keshavarz, Alejandro Ramirez-Serrano
Abstract:
Energy efficiency and locomotion speed are two key parameters for legged robots; thus, finding ways to improve them is important. This paper proposes a locomotion framework to analyze the energy usage and speed of quadruped robots via a Genetic Algorithm (GA) optimization process. For this, a quadruped robot platform with joint redundancy in its hind legs, which we believe will help multi-legged robots improve their speed and energy consumption, is used. ContinuO, the quadruped robot of interest, has 14 active degrees of freedom (DoFs): three DoFs for each front leg and, unlike previously developed quadruped robots, four DoFs for each hind leg. ContinuO aims to realize a cost-effective quadruped robot for real-world scenarios, with high speeds and the ability to overcome large obstructions. The proposed framework is used to locomote the robot and analyze the energy it consumes at diverse stride lengths and locomotion speeds. The analysis is performed by comparing the obtained results in two modes: with and without the joint redundancy in the robot’s hind legs.
Keywords: genetic algorithm optimization, locomotion path planning, quadruped robots, redundant legs
Procedia PDF Downloads 104
7316 Fault Diagnosis and Fault-Tolerant Control of Bilinear-Systems: Application to Heating, Ventilation, and Air Conditioning Systems in Multi-Zone Buildings
Authors: Abderrhamane Jarou, Dominique Sauter, Christophe Aubrun
Abstract:
Over the past decade, the growing demand for energy efficiency in buildings has attracted the attention of the control community. Failures in HVAC (heating, ventilation, and air conditioning) systems in buildings can have a significant impact on the desired and expected energy performance of buildings, as well as on users' comfort. Fault-Tolerant Control (FTC) is a recent technology area that studies the adaptation of control algorithms to faulty operating conditions of a system, and its application in HVAC systems has gained attention in the last two decades. The objective is to keep the variations in system performance due to faults within an acceptable range with respect to the desired nominal behavior. This paper considers the so-called active approach, which is based on a fault detection and identification scheme combined with a control reconfiguration algorithm that determines a new set of control parameters so that the reconfigured performance is "as close as possible," in some sense, to the nominal performance. Thermal models of buildings and their HVAC systems are described by non-linear (usually bilinear) equations. Most of the work carried out so far in FDI (fault detection and isolation) or FTC considers a linearized model of the studied system. However, such a model is only valid in a reduced range of variation. This study presents a new fault diagnosis (FD) algorithm based on a bilinear observer for the detection and accurate estimation of the magnitude of the HVAC system failure. The main contribution of the proposed FD algorithm is that, instead of using specific linearized models, the algorithm inherits the structure of the actual bilinear model of the building's thermal dynamics. As an immediate consequence, the algorithm is applicable to a wide range of unpredictable operating conditions, i.e., weather dynamics, outdoor air temperature, and zone occupancy profile.
A bilinear fault detection observer is proposed for a bilinear system with unknown inputs. The residual vector in the observer design is decoupled from the unknown inputs and, under certain conditions, is made sensitive to all faults. Sufficient conditions are given for the existence of the observer, and results are given for the explicit computation of the observer design matrices. Dedicated observer schemes (DOS) are considered for sensor FDI, while unknown-input bilinear observers are considered for actuator or system-component FDI. The proposed strategy for FTC works as follows: at the first level, FDI algorithms are implemented, which also makes it possible to estimate the magnitude of the fault. Once the fault is detected, the fault estimate is used to feed the second level and reconfigure the control law so that the expected performance is recovered. This paper is organized as follows. A general structure for fault-tolerant control of buildings is first presented, and the building model under consideration is introduced. Then, the observer-based design for fault diagnosis of bilinear systems is studied. The FTC approach is developed in Section IV. Finally, a simulation example is given in Section V to illustrate the proposed method.
Keywords: bilinear systems, fault diagnosis, fault-tolerant control, multi-zone buildings
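The observer-based residual idea can be illustrated on a scalar discrete-time bilinear toy system; the coefficients, observer gain, and fault size below are hypothetical and far simpler than the paper's multi-zone building model:

```python
def simulate(fault_at=None, steps=60, a=0.9, n=0.05, b=1.0, gain=0.5):
    """Scalar bilinear plant x+ = a*x + n*x*u + b*u (+ additive fault),
    with a Luenberger-style observer that copies the bilinear structure
    and corrects with the output residual (here y = x, full state measured)."""
    x = x_hat = 0.0
    residuals = []
    for k in range(steps):
        u = 1.0  # constant input, for the sketch only
        fault = 0.4 if fault_at is not None and k >= fault_at else 0.0
        r = x - x_hat                   # residual: measurement minus estimate
        residuals.append(abs(r))
        x_hat = a * x_hat + n * x_hat * u + b * u + gain * r
        x = a * x + n * x * u + b * u + fault
    return residuals

print(max(simulate()))                   # → 0.0 (no fault: residual stays at zero)
print(max(simulate(fault_at=30)) > 0.1)  # → True (fault drives the residual up)
```

Because the observer reproduces the bilinear term `n*x*u` instead of a linearization, the residual stays near zero across the whole operating range and deviates only when a fault enters, which is the detection principle; its steady-state value also scales with the fault magnitude, enabling estimation.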
Procedia PDF Downloads 172
7315 On the Influence of the Metric Space in the Critical Behavior of Magnetic Temperature
Authors: J. C. Riaño-Rojas, J. D. Alzate-Cardona, E. Restrepo-Parra
Abstract:
In this work, a study of generic magnetic nanoparticles under varying metric spaces is presented. As the metric space is changed, the nanoparticle form and the inner product are also varied, since the energetic scale is not conserved. The study is carried out using Monte Carlo simulations combining the Wolff embedding and Metropolis algorithms. The Metropolis algorithm is used in high-temperature regions to reach equilibrium quickly, while the Wolff embedding algorithm is used in the low-temperature and critical regions in order to reduce the critical slowing-down phenomenon. The number of ions is kept constant for the different forms, and the critical temperatures are found using finite-size scaling. We observed that the critical temperatures do not exhibit significant changes when the metric space is varied. Additionally, the effective dimension corresponding to each metric space was determined. A study of static behavior was carried out to obtain the static critical exponents. The objective of this work is to observe the behavior of the thermodynamic quantities (energy, magnetization, specific heat, susceptibility, and Binder cumulants) in the critical region, in order to determine whether the magnetic nanoparticles describe their magnetic interactions in Euclidean space or whether there is a correspondence in other metric spaces.
Keywords: nanoparticles, metric, Monte Carlo, critical behaviour
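The Metropolis update used in the high-temperature regions can be sketched on a minimal 1D Ising ring; this is a deliberate simplification of the nanoparticle models described above, with illustrative couplings and parameters:

```python
import math
import random

def metropolis_step(spins, beta, J=1.0):
    """One Metropolis update on a 1D Ising ring with periodic boundaries."""
    n = len(spins)
    i = random.randrange(n)
    # Energy change from flipping spin i against its two neighbours.
    dE = 2.0 * J * spins[i] * (spins[(i - 1) % n] + spins[(i + 1) % n])
    if dE <= 0 or random.random() < math.exp(-beta * dE):
        spins[i] = -spins[i]

random.seed(0)
spins = [1] * 50
for _ in range(1000):
    metropolis_step(spins, beta=5.0)   # low temperature: order survives
print(abs(sum(spins)) / len(spins))    # magnetization per spin stays high
```

Near criticality, single-spin updates like this decorrelate slowly (critical slowing down), which is exactly why the abstract switches to the Wolff embedding cluster algorithm in that regime.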
Procedia PDF Downloads 516
7314 Effect of Implementing a Teaching Module about Diet and Exercises on Clinical Outcomes of Patients with Gout
Authors: Wafaa M. El- Kotb, Soheir Mohamed Weheida, Manal E. Fareed
Abstract:
The aim of this study was to determine the effect of implementing a teaching module about diet and exercises on the clinical outcomes of patients with gout. Subjects: A purposive sample of 60 adult gouty patients was selected and randomly and alternately divided into two equal groups of 30 patients each. Setting: The study was conducted in the orthopedic outpatient clinic of Menoufia University. Tools of the study: Three tools were utilized for data collection: a knowledge-assessment structured interview questionnaire, clinical manifestation assessment tools, and a nutritional assessment sheet. Results: All patients of both groups (100%) had a poor total knowledge score before teaching, while 90% of the study group had a good total knowledge score three months after teaching, compared to 3.3% of the control group. Moreover, the recovery outcomes were significantly improved in the study group compared to the control group after teaching. Conclusion: Teaching the study group about diet and exercises significantly improved their clinical outcomes. Recommendation: Patient education about diet and exercises should be an ongoing process for patients with gout.
Keywords: clinical outcomes, diet, exercises, teaching module
Procedia PDF Downloads 346
7313 A Qualitative Research of Online Fraud Decision-Making Process
Authors: Semire Yekta
Abstract:
Many online retailers set up manual review teams to overcome the limitations of automated online fraud detection systems. This study critically examines the strategies they adopt in their decision-making process to set apart fraudulent individuals from non-fraudulent online shoppers. The study uses a mixed-methods research approach: 32 in-depth interviews were conducted, alongside participant observation and auto-ethnography. The study found that all steps of the decision-making process are significantly affected by a level of subjectivity, by personal understandings of online fraud, and by preferences and judgments, and not necessarily by objectively identifiable facts. Rather than clearly knowing who the fraudulent individuals are, the team members have to predict whether they think the customer might be a fraudster. Common strategies include relying on the classifications and fraud scores of the automated fraud detection systems, weighing up arguments for and against the customer before making a decision, using cancellation to test the customer’s reaction, and making use of personal experience and “the sixth sense”. The interaction within the team also plays a significant role, given that some decisions turn into a group discussion. While customer data represent the basis for the decision-making, fraud management teams frequently make use of Google Search and Google Maps to find additional information about the customer and verify whether the customer is the person they claim to be. While this, on the one hand, raises ethical concerns, on the other hand, using Google Street View on the address and area of the customer puts customers living in less privileged housing and areas at a higher risk of being classified as fraudsters. Phone validation is used as a final measure to decide for or against the customer when the previous strategies and Google Search do not suffice.
However, phone validation is also characterized by the individual reviewer’s subjectivity: personal views and judgments of the customer’s reaction on the phone result in the final classification as genuine or fraudulent.
Keywords: online fraud, data mining, manual review, social construction
Procedia PDF Downloads 343
7312 Functional Outcome and Quality of Life of Conservative versus Surgical Management of Adult Potts Disease: A Prospective Cohort Study
Authors: Mark Angelo Maranon, David Endriga
Abstract:
Objective: The aim of the study is to determine the differences in functional outcome and quality of life of adult patients with Potts disease who have undergone surgical versus non-surgical management. Methods: In this prospective cohort study, 45 patients were followed up for 1 year after undergoing pharmacologic treatment alone versus a combination of anti-Koch's therapy and surgery for Potts disease. The Oswestry Disability Index (ODI) and Short Form-36 (SF-36) were obtained on initiation of treatment and after three months, six months, and one year. Results: ASIA scores from the onset of treatment to 1 year significantly improved (p<0.001) for both non-surgical and surgical patients. ODI scores significantly improved after 6 months of treatment for both surgical and non-surgical patients. Both groups showed significant improvement in their SF-36 scores, but scores were noted to be higher in patients who underwent surgery. Conclusions: Significant improvement in functional outcome and quality of life was noted for both surgical and non-surgical patients after 1 year of treatment, with earlier improvements and better final SF-36 and ODI scores in patients who underwent surgery.
Keywords: tuberculosis, spinal, Potts disease, functional outcome
Procedia PDF Downloads 148
7311 Simulation-Based Optimization of a Non-Uniform Piezoelectric Energy Harvester with Stack Boundary
Authors: Alireza Keshmiri, Shahriar Bagheri, Nan Wu
Abstract:
This research presents an analytical model, based on the Adomian decomposition method, for the development of an energy harvester with piezoelectric rings stacked at the boundary of the structure. The model is applied to geometrically non-uniform beams to derive the steady-state dynamic response of the structure subjected to base-motion excitation and to efficiently harvest the resulting vibrational energy. The in-plane polarization of the piezoelectric rings is employed to enhance the electrical power output. A parametric study of the proposed energy harvester with various design parameters is performed to prepare the dataset required for optimization. Finally, a simulation-based optimization technique is used to find the optimum structural design with maximum efficiency. To solve the optimization problem, an artificial neural network is first trained to replace the simulation model, and then a genetic algorithm is employed to find the optimized design variables. Higher geometrical non-uniformity and a greater beam length lower the structure's natural frequency and generate a larger power output.
Keywords: piezoelectricity, energy harvesting, simulation-based optimization, artificial neural network, genetic algorithm
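The surrogate-plus-GA loop can be sketched generically; the code below optimizes a stand-in quadratic fitness function in place of the trained neural-network surrogate, and the genetic operators and parameters are illustrative, not the authors' exact setup:

```python
import random

def genetic_optimize(fitness, lo, hi, pop_size=30, generations=60, seed=1):
    """Minimal real-coded genetic algorithm over one design variable:
    truncation selection, midpoint crossover, Gaussian mutation."""
    rng = random.Random(seed)
    population = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]       # keep the fitter half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2 + rng.gauss(0, 0.02 * (hi - lo))
            children.append(min(hi, max(lo, child)))
        population = parents + children
    return max(population, key=fitness)

# Stand-in for the ANN surrogate: power output peaking at x = 0.7.
best = genetic_optimize(lambda x: -(x - 0.7) ** 2, 0.0, 1.0)
print(round(best, 2))
```

The design choice mirrors the abstract's pipeline: the GA only ever queries the cheap surrogate, so the expensive simulation is run just once per parametric-study point to build the training dataset.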
Procedia PDF Downloads 123
7310 Technical Option Brought Solution for Safe Waste Water Management in Urban Public Toilet and Improved Ground Water Table
Authors: Chandan Kumar
Abstract:
Background and Context: Population growth and rapid urbanization have resulted in nearly 2 lakh (200,000) migrants, along with their families, moving to Delhi each year in search of jobs. Most of these poor migrant families end up living in slums, constituting an estimated population of 1.87 lakh every year. Further, more than half (52 per cent) of Delhi’s population resides in places such as unauthorized and resettled colonies. The slum population is fully dependent on public toilets. In public toilets, manholes are connected either to a sewer line or to a septic tank. Public toilets connected to septic tanks face major challenges in disposing of waste water: they have to discharge it into open drains outside, and the waste water stagnates outside the public toilet complex, close to the slum area. As a result, diseases such as malaria, dengue, and chikungunya break out in the slum area due to the stagnant waste water. An intervention and innovation were undertaken by Save the Children in 21 public toilet complexes of South Delhi and North Delhi. These complexes all faced the same waste-water disposal problem, each disposing of a minimum of 1,800 liters of waste water every day into open drains, which caused stagnant-water-borne diseases in the nearby community. Construction of Soak Wells: Constructing soak wells in an urban context was an innovative approach to minimizing the problem of waste-water management, and it raised the water table of the existing borewell in each toilet complex. The technique provided a ground-water recharging system, and the additional water was utilized for vegetable gardening within the complex premises. Each soak well was constructed with multiple filter media, with an inlet and a safeguarding bed on the surrounding surface. After construction, each soak well discharged 2,000 liters of waste water, raising the ground-water level through the different filter media. Finally, we brought a change to the communities by constructing soak wells with a zero-maintenance system.
These public toilet complexes were equipped with a safe waste water disposal mechanism, and stagnant-water-borne diseases were reduced.Keywords: diseases, ground water recharging system, soak well, toilet complex, waste water
Procedia PDF Downloads 551
7309 The Hypoglycemic Grab Bag (HOGG): Preparing Hypo-Screen-Bags to Streamline the Time-Consuming Process of Administering Glucose Systemic Correction
Authors: Mai Ali
Abstract:
Background: Preparing hypo-screen bags in advance streamlines the time-consuming process of administering systemic glucose correction. Hypo-screen grab bags are widely adopted in UK hospitals. Aim: The aim of the study is to improve hypoglycemia screening efficiency and equipment accessibility by making the required items easier for restocking staff to access. Methodology: The study centered on the neonatal wards at LGI and St. James Neonatal Unit and related units. A web-based survey was conducted to evaluate local practices, gathering 21 responses from relevant staff. The survey outcomes were: (1) an evident demand for accessible grab bags to smooth processes; (2) the potential to enhance efficiency through better preparation of hypo-screen grab bags. Intervention: A hypo-screen grab bag was designed, including checklists of stocked items and required samples. Medical staff oversee restocking after use. Conclusion: The study improved hypoglycemia screening efficiency and aided junior staff with accessible supplies and a user-friendly checklist.Keywords: neonatal hypoglycemia, grab bag, hypo-screening, junior staff
Procedia PDF Downloads 63
7308 Sexual Health and Male Fertility: Improving Sperm Health with Focus on Technology
Authors: Diana Peninger
Abstract:
Over 10% of couples in the U.S. have infertility problems, with roughly 40% traceable to the male partner. Yet little attention has been given to improving men’s contribution to the conception process. One solution showing promise in increasing conception rates for IVF and other assisted reproductive technology treatments is a first-of-its-kind semen collection device engineered to mitigate the sperm damage caused by traditional collection methods. Patients are able to collect semen at home and deliver it to clinics within 48 hours for use in fertility analysis and treatment, with less stress and improved specimen viability. This abstract shares these findings along with expert insight and tips to help attendees understand the key role semen collection plays in addressing and treating reproductive issues, while helping to improve patient outcomes and success. Our research aimed to determine whether male reproductive outcomes can be improved by focusing technology on sperm specimen health. We utilized a redesigned semen collection cup (patented as the Device for Improved Semen Collection/DISC, U.S. Patent 6864046, known commercially as ProteX) that met a series of physiological parameters. Previous research demonstrated significant improvement in semen parameters (forward motility, progression, viability, and longevity) and overall sperm biochemistry when the DISC is used for collection. Animal studies have also shown dramatic increases in pregnancy rates. Our current study compares samples collected in the DISC, the next-generation DISC (DISCng), and a standard specimen cup (SSC): dry, with a measured 1 mL of media, and with media in excess (5 mL). Both human and animal testing are included. With sperm counts declining at alarming rates due to environmental, lifestyle, and other health factors, accurate evaluations of sperm health are critical to understanding reproductive health and the origins and treatments of infertility.
An increase in sperm health as measured by extensive semen parameter analysis was demonstrated, with semen parameters remaining stable for 48 hours, expanding the processing window from 1 hour to 48 hours.Keywords: reproductive, sperm, male, infertility
Procedia PDF Downloads 129
7307 Isolation Preserving Medical Conclusion Hold Structure via C5 Algorithm
Authors: Swati Kishor Zode, Rahul Ambekar
Abstract:
Data mining is the extraction of interesting patterns or knowledge from large amounts of data, with decisions made according to the relevant information extracted. Recently, with the explosive development of the internet and of data storage and processing techniques, privacy preservation has become one of the major concerns in data mining, and various techniques and methods have been produced for privacy-preserving data mining. In a clinical decision support system, the decision is made on the basis of data retrieved from remote servers via the internet in order to diagnose the patient. In this paper, the central idea is to improve the accuracy of a decision support system for multiple diseases while protecting patient data during communication between the clinician (client) side and the server side. A privacy-preserving protocol for a clinical decision support network is proposed so that patient information always remains encrypted during the diagnosis process while accuracy is maintained. To enhance the accuracy of the decision support system for multiple diseases, C5.0 classifiers are used, and to preserve privacy, the Paillier cryptosystem, a homomorphic encryption scheme, is employed.Keywords: classification, homomorphic encryption, clinical decision support, privacy
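The privacy guarantee rests on the Paillier cryptosystem's additive homomorphism: the product of two ciphertexts decrypts to the sum of the plaintexts, so a server can aggregate encrypted values without ever seeing them. A minimal sketch of that property (toy primes chosen for readability; a real deployment would use a vetted library and 2048-bit keys):

```python
import math
import random

def keygen(p=17, q=19):
    # Toy primes for illustration only; real keys use >= 2048-bit moduli.
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1                       # standard simplification for the generator
    n2 = n * n
    L = lambda x: (x - 1) // n      # the L function from Paillier's scheme
    mu = pow(L(pow(g, lam, n2)), -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    while True:                     # random r coprime to n gives semantic security
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    n2 = n * n
    L = lambda x: (x - 1) // n
    return (L(pow(c, lam, n2)) * mu) % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 12), encrypt(pub, 30)
c_sum = (c1 * c2) % (pub[0] ** 2)   # additive homomorphism: Enc(a)*Enc(b) -> a+b
print(decrypt(pub, priv, c_sum))    # 42
```

This multiply-ciphertexts-to-add-plaintexts property is what allows the server side of the protocol to combine encrypted patient attributes without decryption.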
Procedia PDF Downloads 330
7306 Resource-Constrained Assembly Line Balancing Problems with Multi-Manned Workstations
Authors: Yin-Yann Chen, Jia-Ying Li
Abstract:
Assembly line balancing problems can be categorized into one-sided, two-sided, and multi-manned problems according to the number of operators deployed at each workstation. This study explores the balancing problem of a resource-constrained assembly line with multi-manned workstations. Resources include machines or tools in assembly lines, such as jigs, fixtures, and hand tools. A mathematical programming model was developed to support decision-making and planning, minimizing the numbers of workstations, resources, and operators to achieve optimal production efficiency. To improve solution-finding efficiency, a genetic algorithm (GA) and a simulated annealing algorithm (SA) were designed and developed in this study and applied to a practical case in car manufacturing. Results of the GA, the SA, and the mathematical programming model were compared to verify their validity. Finally, the target values, production efficiency, and deployment combinations provided by the algorithms were analyzed and compared so that the results of this study can serve as a reference for decision-making on production deployment.Keywords: heuristic algorithms, line balancing, multi-manned workstation, resource-constrained
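To illustrate the metaheuristic side, the following is a toy simulated-annealing sketch for a simple line-balancing instance: a task order is decoded greedily into stations under a cycle time, and SA searches over task swaps. The task durations and cycle time are invented, and the precedence, resource, and multi-manned constraints of the paper's model are omitted:

```python
import math
import random

# Hypothetical task durations (minutes) and cycle time; real instances add
# precedence, resource, and multi-manned constraints not modeled here.
durations = [4, 7, 3, 6, 2, 5, 4, 3]
cycle_time = 10

def num_stations(order):
    """Decode a task order into stations greedily under the cycle time."""
    stations, load = 1, 0
    for t in order:
        if load + durations[t] > cycle_time:
            stations, load = stations + 1, 0
        load += durations[t]
    return stations

def anneal(iters=2000, t0=5.0, cooling=0.999):
    random.seed(0)
    order = list(range(len(durations)))
    best = cur = num_stations(order)
    best_order, temp = order[:], t0
    for _ in range(iters):
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]        # neighbour: swap two tasks
        cand = num_stations(order)
        if cand <= cur or random.random() < math.exp((cur - cand) / temp):
            cur = cand                                  # accept (Metropolis rule)
            if cur < best:
                best, best_order = cur, order[:]
        else:
            order[i], order[j] = order[j], order[i]    # undo rejected swap
        temp *= cooling
    return best, best_order

best, order = anneal()
print(best)
```

The total work content is 34 minutes, so 4 stations is the theoretical lower bound for this instance; the GA variant in the paper would differ only in using crossover and mutation over such orderings.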
Procedia PDF Downloads 208
7305 Graph Codes - 2D Projections of Multimedia Feature Graphs for Fast and Effective Retrieval
Authors: Stefan Wagenpfeil, Felix Engel, Paul McKevitt, Matthias Hemmje
Abstract:
Multimedia indexing and retrieval are generally designed and implemented by employing feature graphs. These graphs typically contain a significant number of nodes and edges to reflect the level of detail in feature detection. A higher level of detail increases the effectiveness of the results but also leads to more complex graph structures. However, graph-traversal-based similarity algorithms are quite inefficient and computationally intensive, especially for large data structures. To deliver fast and effective retrieval, an efficient similarity algorithm, particularly for large graphs, is mandatory. Hence, in this paper, we define a projection of graphs into a 2D space (Graph Codes) as well as the corresponding algorithms for indexing and retrieval. We show that calculations in this space can be performed more efficiently than graph traversals due to a simpler processing model and a high level of parallelization. We further show that the effectiveness of retrieval also increases substantially, as Graph Codes facilitate more levels of detail in feature fusion. Thus, Graph Codes provide a significant increase in efficiency and effectiveness (especially for multimedia indexing and retrieval) and can be applied to images, videos, audio, and text information.Keywords: indexing, retrieval, multimedia, graph algorithm, graph code
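The abstract does not spell out the encoding, but the core idea can be sketched under the assumption that a Graph Code is a 2D matrix over a fixed label vocabulary whose cells flag node presence and edges, so two codes are compared cell-wise instead of by traversal. The vocabulary and the overlap-based similarity below are illustrative only:

```python
import numpy as np

# Hypothetical feature-label vocabulary defining the matrix axes.
vocab = ["person", "dog", "park", "ball"]
idx = {w: i for i, w in enumerate(vocab)}

def graph_code(nodes, edges):
    """Project a labeled feature graph onto a fixed-size 2D matrix."""
    m = np.zeros((len(vocab), len(vocab)), dtype=int)
    for n in nodes:
        m[idx[n], idx[n]] = 1            # diagonal cells: node presence
    for a, b in edges:
        m[idx[a], idx[b]] = 1            # off-diagonal cells: relationships
    return m

def similarity(m1, m2):
    # Fraction of shared non-zero cells; O(V^2), element-wise, and trivially
    # parallelizable, unlike graph traversal.
    return np.sum((m1 == 1) & (m2 == 1)) / max(np.sum((m1 | m2) == 1), 1)

g1 = graph_code(["person", "dog"], [("person", "dog")])
g2 = graph_code(["person", "dog", "ball"], [("person", "dog")])
print(similarity(g1, g2))  # 0.75
```

Because every code has the same shape, whole collections can be stacked into one tensor and compared in a single vectorized operation, which is where the claimed efficiency gain comes from.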
Procedia PDF Downloads 161
7304 Development of Nanostructured Hydrogel for Spatial and Temporal Controlled Release of Active Compounds
Authors: Shaker Alsharif, Xavier Banquy
Abstract:
Controlled drug delivery technology represents one of the most rapidly advancing areas of science, in which chemists and chemical engineers are contributing to human health care. Such delivery systems provide numerous advantages compared to conventional dosage forms, including improved efficacy and improved patient compliance and convenience. These systems often use synthetic polymers as carriers for the drugs; as a result, treatments that would not otherwise be possible are now in conventional use. The role of bilayered vesicles as efficient carriers for drugs, vaccines, diagnostic agents, and other bioactive agents has led to rapid advancement in liposomal drug delivery systems. Moreover, site-avoidance and site-specific drug targeting therapy can be achieved by formulating a liposomal product so as to reduce the cytotoxicity of many potent therapeutic agents. Our project focuses on developing and building a hydrogel with nano-inclusions of liposomes loaded with active compounds, such as proteins and growth factors, able to release them in a controlled fashion. To achieve this, we synthesize several liposomes at two different phospholipid concentrations encapsulating a model drug and then formulate a hydrogel with specific mechanical properties embedding the liposomes to manage the release of the active compound.Keywords: controlled release, hydrogel, liposomes, active compounds
Procedia PDF Downloads 447
7303 The Ecosystem of Food Allergy Clinical Trials: A Systematic Review
Authors: Eimar Yadir Quintero Tapias
Abstract:
Background: Science is not generally self-correcting; many clinical studies end with the same conclusion, "more research is needed." This study hypothesizes that we first need a better appraisal of the available (and unavailable) evidence instead of creating more of the same flawed inquiries. Methods: Systematic review of ClinicalTrials.gov study records using the following Boolean operators: (food OR nut OR milk OR egg OR shellfish OR wheat OR peanuts) AND (allergy OR allergies OR hypersensitivity OR hypersensitivities). Variables included the status of the study (e.g., active or completed), availability of results, sponsor type, and sample size, among others. To determine the rates of non-publication in journals indexed by PubMed, an advanced search query using the specific NCT numbers (e.g., NCT000001 OR NCT000002 OR...) was performed. As a prophylactic measure to prevent p-hacking, data analyses included only descriptive statistics, not inferential approaches. Results: A total of 2092 study records matched the search query described above (date: September 13, 2019). Most studies were interventional (n = 1770; 84.6%) and the remainder observational (n = 322; 15.4%). Universities, hospitals, and research centers sponsored over half of these investigations (n = 1208; 57.7%), 308 studies (14.7%) were industry-funded, and 147 received NIH grants; the remaining studies had mixed sponsorship. Regarding completed studies (n = 1156; 55.2%), 248 (21.5%) have results available at the registry site, and 417 (36.1%) matched NCT numbers of journal papers indexed by PubMed. Conclusions: The internal and external validity of human research is critical for the appraisal of medical evidence. It is imperative to analyze the entire dataset of clinical studies, preferably as patient-level anonymized raw data, before rushing to conclusions with insufficient and inadequate information.
Publication bias and non-registration of clinical trials limit the evaluation of the evidence concerning therapeutic interventions for food allergy, such as oral and sublingual immunotherapy, as well as any other medical condition. Over half of the food allergy human research remains unpublished.Keywords: allergy, clinical trials, immunology, systematic reviews
Procedia PDF Downloads 137
7302 Optimal Pricing Based on Real Estate Demand Data
Authors: Vanessa Kummer, Maik Meusel
Abstract:
Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information; for example, many users do not specify how many rooms they would like or what price they would be willing to pay. In economic analyses, often only complete records are used. Usually, however, the proportion of complete records is rather small, which means most of the information is neglected, and the complete records themselves may be strongly distorted. In addition, the reason that data is missing might itself contain information, which is ignored with that approach. An interesting question is, therefore, whether for economic analyses such as the one at hand there is added value in using the whole data set with imputed missing values compared to using the usually small percentage of complete records (the baseline), and how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning. Of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, and neural network techniques, are applied. By training the model iteratively on the imputed data, and thereby including the information of all records in the model, the distortion of the first training set (the complete records) vanishes. In a next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms and several parameter combinations, and comparing the estimates to the actual data.
After finding the optimal parameter set for each algorithm, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates based on the imputed data sets do not differ significantly from each other, whereas the demand estimate derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Demand estimates derived from the whole data set are also much more accurate than the baseline estimate. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning
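The iterative train-on-imputed-data loop described above can be sketched with one of the mentioned unsupervised methods, clustering: fill missing cells with column means, fit k-means, replace missing cells with their cluster's centroid value, and repeat. The synthetic two-cluster data, the 20% missingness, and k-means as the imputer are illustrative stand-ins for the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[:100] += 4.0                      # two latent groups, like distinct submarkets
mask = rng.random(X.shape) < 0.2    # 20% of values "missing"
X_miss = np.where(mask, np.nan, X)

def kmeans_impute(Xm, k=2, iters=10):
    # Start from column-mean imputation, then alternate k-means and re-imputation.
    filled = np.where(np.isnan(Xm), np.nanmean(Xm, axis=0), Xm)
    centers = filled[:: len(filled) // k][:k].copy()   # crude deterministic init
    for _ in range(iters):
        labels = np.argmin(((filled[:, None, :] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([filled[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
        # Replace only the missing cells with their cluster's centroid value.
        filled = np.where(np.isnan(Xm), centers[labels], Xm)
    return filled

X_imp = kmeans_impute(X_miss)
rmse = np.sqrt(np.mean((X_imp[mask] - X[mask]) ** 2))
print(round(float(rmse), 2))
```

On this toy data the cluster-aware imputation recovers missing values far better than a single column mean would, which mirrors the paper's finding that imputation-based estimates outperform the complete-records baseline.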
Procedia PDF Downloads 285
7301 Effects of Pulsed Electromagnetic and Static Magnetic Fields on Musculoskeletal Low Back Pain: A Systematic Review Approach
Authors: Mohammad Javaherian, Siamak Bashardoust Tajali, Monavvar Hadizadeh
Abstract:
Objective: This systematic review was conducted to evaluate the effects of pulsed electromagnetic fields (PEMF) and static magnetic fields (SMF) on pain relief and functional improvement in patients with musculoskeletal low back pain (LBP). Methods: Seven electronic databases were searched independently by two researchers to identify published randomized controlled trials (RCTs) on the efficacy of pulsed electromagnetic, static magnetic, and therapeutic nuclear magnetic fields. The databases searched were Ovid Medline®, Ovid Cochrane RCTs and Reviews, PubMed, Web of Science, Cochrane Library, CINAHL, and EMBASE, from 1968 to February 2016. The relevant keywords were selected using MeSH. After the initial search, all references in the selected studies were searched to identify further possible manuscripts. Published RCTs in English were included if they reported changes in pain and/or functional disability following application of magnetic fields for chronic musculoskeletal low back pain. All studies with surgical patients, patients with pelvic pain, or combinations with other treatment techniques such as acupuncture or diathermy were excluded. The identified studies were critically appraised and the data were extracted independently by two raters (M.J. and S.B.T.); disagreements were resolved through discussion between the raters. Results: In total, 1505 abstracts were found in the initial electronic search and reviewed to identify potentially relevant manuscripts. Of the potentially appropriate studies retrieved in full text, 48 were excluded after their full texts were reviewed. Ten selected articles were categorized into three subgroups: PEMF (6 articles), SMF (3 articles), and therapeutic nuclear magnetic fields (tNMF) (1 article). Since only one study evaluated tNMF, it was excluded.
In the PEMF group, one study of acute LBP did not show significant positive results, while the majority of the other five studies, on chronic low back pain (CLBP), indicated efficacy in pain relief and functional improvement; however, the study with the fewest sessions (6 sessions over 2 weeks) did not report a significant difference between treatment and control groups. In the SMF subgroup, two articles reported near-significant pain reduction without any functional improvement, although more studies are needed. Conclusion: PEMFs with a strength of 5 to 150 G or 0.1 to 0.3 G and a frequency of 5 to 64 Hz or sweep 7 to 7KHz can be considered an effective modality for pain relief and functional improvement in patients with chronic low back pain, but there is not enough evidence to confirm their effectiveness in acute low back pain. To achieve appropriate effectiveness, it is suggested to apply this treatment modality for 20 minutes per day for at least 9 sessions. SMFs have not been reported to be substantially effective in decreasing pain or improving function in chronic low back pain. More studies are necessary to achieve more reliable results.Keywords: pulsed electromagnetic field, static magnetic field, magnetotherapy, low back pain
Procedia PDF Downloads 205
7300 Minimizing Learning Difficulties in Teaching Mathematics
Authors: Hari Sharan Pandit
Abstract:
Mathematics teaching in Nepal has been centralized and guided by the notion of transferring knowledge and skills from teachers to students. The overemphasis on an algorithm-centric approach to mathematics teaching, and the focus on rote learning as the ultimate way of solving mathematical problems since the early years of schooling, have been creating severe problems in school-level mathematics in Nepal. In this context, the author argues that students should learn real-world mathematical problems through various interesting, creative, and collaborative, as well as artistic and alternative, ways of knowing. Collaboration-incorporated pedagogy is a distinct pedagogical approach that offers a better alternative: an integrated and interdisciplinary approach to learning that encourages students to think more broadly and critically about real-world problems. The paper, a summarized report of action research designed, developed, and implemented by the author, focuses on the need for and usefulness of collaboration-incorporated pedagogy in the Nepali context to make mathematics teaching more meaningful for producing creative and critical citizens. This paper is useful for mathematics teachers, teacher educators, and researchers interested in arts integration in mathematics teaching.Keywords: algorithm-centric, rote-learning, collaboration-incorporated pedagogy, action research
Procedia PDF Downloads 11
7299 Measuring Delay Using Software Defined Networks: Limitations, Challenges, and Suggestions for Openflow
Authors: Ahmed Alutaibi, Ganti Sudhakar
Abstract:
Providing better quality of service (QoS) to end users has been a challenging problem for researchers and service providers. Building applications on best-effort network protocols hindered the adoption of guaranteed service parameters and, ultimately, quality of service. The introduction of software-defined networking (SDN) opened the door for a paradigm shift towards more controlled, programmable, configurable network behavior. OpenFlow has been, and still is, the main implementation of the SDN vision. To facilitate better QoS for applications, the network must calculate and measure certain parameters; one of those parameters is the delay between the two ends of a connection. Using the power of SDN and knowledge of application and network behavior, SDN networks can adjust to different conditions and specifications. In this paper, we use the capabilities of SDN to implement multiple algorithms that measure delay end-to-end, not only inside the SDN network. The results of applying the algorithms in an emulated environment show that we can obtain measurements close to the emulated delay. The results also show that the load on the network and controller can differ depending on the algorithm, and that the transport-layer handshake algorithm performs best among the tested algorithms. From the results and implementation, we identify the limitations of OpenFlow and develop suggestions to address them.Keywords: software defined networking, quality of service, delay measurement, openflow, mininet
Procedia PDF Downloads 165
7298 Just Not Seeing It: Exploring the Relationship between Inattention Blindness and Banner Blindness
Authors: Carie Cunningham, Kristen Lynch
Abstract:
Despite a viewer’s belief that they are paying attention, they often miss parts of their surroundings, a phenomenon referred to as inattentional blindness. Inattentional blindness is the failure of an individual to orient their attention to a particular item in their visual field; it is well defined in the psychology literature. A similar phenomenon has been evaluated across media types in advertising, where failing to comprehend or remember items in one’s field of vision is known as banner blindness. Banner blindness occurs when individuals habitually see banners in specific areas on webpages and condition themselves to ignore those habitual areas. Another reason individuals avoid these habitual areas (usually the top or sides of a webpage) is the lack of personal relevance or pertinent information for the viewer. Banner blindness, while a web-based concept, may thus relate to inattentional blindness. This paper proposes an analysis of the true similarities and differences between these concepts, bridging the two strands of thinking. Forty participants took part in an eye-tracking and post-survey experiment to test attention and memory measures in both a banner blindness and an inattentional blindness condition. The two conditions were conducted between subjects in a semi-randomized order. Half of the participants were told to search through the content ignoring the advertising banners; the other half were first told to search through the content ignoring the distractor icon. The groups were switched after 5 trials, and then 5 more trials were completed. A review of the literature found many inconsistencies in sustainability communication regarding message production and viewer awareness. For the purpose of this study, advertising materials were used as stimuli.
Results suggest that there are gaps between the two concepts and that more research should be done testing these effects in real-world settings versus online environments. This work contributes to theory by exploring the overlap between inattentional blindness and banner blindness, and it provides the advertising industry with evidence that viewers can ignore items in their field of view, even if not consciously, which should inform message development.Keywords: attention, banner blindness, eye movement, inattention blindness
Procedia PDF Downloads 275
7297 Intelligent Control of Doubly Fed Induction Generator Wind Turbine for Smart Grid
Authors: Amal A. Hassan, Faten H. Fahmy, Abd El-Shafy A. Nafeh, Hosam K. M. Youssef
Abstract:
Due to the growing penetration of wind energy into the power grid, it is very important to study its interactions with the power system and to provide good control techniques in order to deliver high-quality power. In this paper, an intelligent control methodology is proposed for optimizing the controller parameters of a doubly fed induction generator (DFIG) based wind turbine generation system (WTGS). The genetic algorithm (GA) and particle swarm optimization (PSO) are employed and compared for adaptive tuning of the parameters of the proposed proportional-integral (PI) controllers of the back-to-back converters of the DFIG-based WTGS. For this purpose, the dynamic model of the WTGS with DFIG and its associated controllers is presented. Furthermore, simulation of the system is performed using MATLAB/Simulink and the SimPowerSystems toolbox to illustrate the performance of the optimized controllers. Finally, the work is validated on a 33-bus radial test system to show the interaction between wind distributed generation (DG) systems and the distribution network.Keywords: DFIG wind turbine, intelligent control, distributed generation, particle swarm optimization, genetic algorithm
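As a rough illustration of PSO-based PI tuning, the sketch below searches for (Kp, Ki) gains that minimize an ITAE-like cost on a toy first-order plant. The plant, gain bounds, and swarm settings are invented stand-ins for the DFIG converter control loops in the paper:

```python
import random

random.seed(1)

def cost(kp, ki, steps=200, dt=0.01):
    """ITAE-like cost of a PI loop on the toy plant dy/dt = -y + u."""
    y, integ, err_sum = 0.0, 0.0, 0.0
    for k in range(steps):
        e = 1.0 - y                        # unit-step reference
        integ += e * dt
        u = kp * e + ki * integ            # PI control law
        y += dt * (-y + u)                 # forward-Euler plant step
        err_sum += (k * dt) * abs(e)       # time-weighted absolute error
    return err_sum

def pso(n=15, iters=40, w=0.7, c1=1.5, c2=1.5, lo=0.0, hi=10.0):
    pos = [[random.uniform(lo, hi), random.uniform(lo, hi)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]
    pcost = [cost(*p) for p in pos]
    g = pbest[min(range(n), key=lambda i: pcost[i])][:]
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (g[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            c = cost(*pos[i])
            if c < pcost[i]:
                pcost[i], pbest[i] = c, pos[i][:]
                if c < cost(*g):
                    g = pos[i][:]
    return g

kp, ki = pso()
print(round(cost(kp, ki), 3))
```

A GA would attack the same cost surface with selection, crossover, and mutation instead of velocity updates, which is the comparison the paper carries out on the full DFIG model.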
Procedia PDF Downloads 267
7296 A Hybrid Feature Selection Algorithm with Neural Network for Software Fault Prediction
Authors: Khalaf Khatatneh, Nabeel Al-Milli, Amjad Hudaib, Monther Ali Tarawneh
Abstract:
Software fault prediction identifies potential faults in software modules during the development process. In this paper, we present a novel approach to software fault prediction that combines a feedforward neural network with particle swarm optimization (PSO). The PSO algorithm is employed as a feature selection technique to identify the most relevant metrics to use as inputs to the neural network, which enhances the quality of feature selection and subsequently improves the performance of the neural network model. In comprehensive experiments on software fault prediction datasets, the proposed hybrid approach achieves better results, outperforming traditional classification methods. The integration of PSO-based feature selection with the neural network enables the identification of critical metrics that provide more accurate fault prediction. The results show the effectiveness of the proposed approach and its potential for reducing development cost and effort by detecting faults early in the software development lifecycle. Further research and validation on diverse datasets will help solidify the practical applicability of the new approach in real-world software engineering scenarios.Keywords: feature selection, neural network, particle swarm optimization, software fault prediction
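The PSO feature-selection step can be sketched in its common binary form: each particle is a 0/1 mask over the candidate metrics, velocities pass through a sigmoid to give per-bit selection probabilities, and fitness is the accuracy of a classifier trained on the selected subset. Here a simple nearest-centroid rule on synthetic data stands in for the paper's feedforward neural network and fault datasets:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 300, 8
X = rng.normal(size=(n, d))
y = (X[:, 0] + X[:, 2] > 0).astype(int)      # only metrics 0 and 2 are informative

def fitness(p):
    """Held-out accuracy of a nearest-centroid rule on the selected metrics."""
    mask = p.astype(bool)
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    tr, te = slice(0, 200), slice(200, n)
    c0 = Xs[tr][y[tr] == 0].mean(axis=0)
    c1 = Xs[tr][y[tr] == 1].mean(axis=0)
    pred = (np.linalg.norm(Xs[te] - c1, axis=1)
            < np.linalg.norm(Xs[te] - c0, axis=1)).astype(int)
    return float((pred == y[te]).mean())

def binary_pso(particles=20, iters=30, w=0.7, c1=1.5, c2=1.5):
    pos = (rng.random((particles, d)) < 0.5).astype(float)
    vel = np.zeros((particles, d))
    pbest, pfit = pos.copy(), np.array([fitness(p) for p in pos])
    g = pbest[pfit.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((particles, d)), rng.random((particles, d))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        # Sigmoid of velocity gives each bit's probability of being 1.
        pos = (rng.random((particles, d)) < 1 / (1 + np.exp(-vel))).astype(float)
        fit = np.array([fitness(p) for p in pos])
        better = fit > pfit
        pbest[better], pfit[better] = pos[better], fit[better]
        g = pbest[pfit.argmax()].copy()
    return g.astype(bool)

mask = binary_pso()
print(mask.astype(int), round(fitness(mask), 2))
```

The swarm tends to converge on masks that include the informative metrics, which is exactly the pruning effect the hybrid approach relies on before training the final predictor.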
Procedia PDF Downloads 95