Search results for: improved tabu search
6003 Improved Color-Based K-Mean Algorithm for Clustering of Satellite Image
Authors: Sangeeta Yadav, Mantosh Biswas
Abstract:
In this paper, we propose an improved color-based K-mean algorithm for clustering of satellite images (SAR). Our method comprises two stages. The first is an interactive selection process in which users input the number of colors (ncolor) and the number of clusters, and are then prompted to select points in each color cluster. In the second stage, these points are given as input to the K-mean clustering algorithm, which clusters the image based on color and minimum squared Euclidean distance. The proposed method reduces the mixed-pixel problem to a great extent.
Keywords: cluster, ncolor method, K-mean method, interactive selection process
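The two-stage procedure above can be sketched as a seeded K-mean loop. This is a minimal illustration only, not the authors' code: `pixels` are assumed to be RGB tuples, and `seeds` stands in for the interactively selected points (both names are hypothetical).

```python
def kmeans_colors(pixels, seeds, iters=20):
    """Cluster RGB pixels around user-selected seed colors using
    minimum squared Euclidean distance (sketch of the second stage)."""
    centers = [list(s) for s in seeds]
    labels = [0] * len(pixels)
    for _ in range(iters):
        # assignment step: nearest center by squared Euclidean distance
        for i, p in enumerate(pixels):
            labels[i] = min(
                range(len(centers)),
                key=lambda k: sum((a - b) ** 2 for a, b in zip(p, centers[k])),
            )
        # update step: recompute each center as the mean of its members
        for k in range(len(centers)):
            members = [p for p, l in zip(pixels, labels) if l == k]
            if members:
                centers[k] = [sum(c) / len(members) for c in zip(*members)]
    return labels, centers
```

Seeding the centers from user-selected points, rather than at random, is what ties the clusters to the chosen colors.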
Procedia PDF Downloads 297
6002 Efficient Human Motion Detection Feature Set by Using Local Phase Quantization Method
Authors: Arwa Alzughaibi
Abstract:
Human motion detection is a challenging task due to a number of factors, including variable appearance, posture, and a wide range of illumination conditions and backgrounds. The first requirement of such a model is therefore a reliable feature set that can discriminate between a human and a non-human form with a fair amount of confidence, even under difficult conditions. With richer representations, the classification task becomes easier and improved results can be achieved. The aim of this paper is to investigate reliable and accurate human motion detection models that can detect human motion accurately under varying illumination levels and backgrounds. Different feature sets are tried and tested, including Histogram of Oriented Gradients (HOG), Deformable Parts Model (DPM), Local Decorrelated Channel Features (LDCF), and Aggregate Channel Features (ACF). We propose an efficient and reliable human motion detection approach that combines Histogram of Oriented Gradients (HOG) and Local Phase Quantization (LPQ) as the feature set and implements a search pruning algorithm based on optical flow to reduce the number of false positives. Experimental results show that combining the local phase quantization descriptor with the histogram of oriented gradients performs well over a larger range of illumination conditions and backgrounds than state-of-the-art human detectors. The Area under the ROC Curve (AUC) of the proposed method reached 0.781 on the UCF dataset and 0.826 on the CDW dataset, indicating that it performs better than the HOG, DPM, LDCF, and ACF methods.
Keywords: human motion detection, histograms of oriented gradients, local phase quantization
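The orientation-histogram idea at the core of HOG can be illustrated with a toy descriptor. This is illustrative only: the paper's detector combines full block-normalized HOG with LPQ and optical-flow pruning, none of which is reproduced here, and `img` is assumed to be a small 2D grayscale array.

```python
import math

def orientation_histogram(img, bins=9):
    """Toy histogram of gradient orientations, weighted by gradient
    magnitude (a minimal sketch of the HOG idea, without cells or
    block normalization)."""
    h = [0.0] * bins
    rows, cols = len(img), len(img[0])
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            gx = img[y][x + 1] - img[y][x - 1]   # central differences
            gy = img[y + 1][x] - img[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.atan2(gy, gx) % math.pi    # unsigned orientation
            h[min(int(ang / math.pi * bins), bins - 1)] += mag
    return h
```

A vertical edge, for example, puts all of its gradient energy into the bin for horizontal gradients, which is what makes the descriptor shape-sensitive yet fairly illumination-robust.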
Procedia PDF Downloads 257
6001 Time-Domain Simulations of the Coupled Dynamics of Surface Riding Wave Energy Converter
Authors: Chungkuk Jin, Moo-Hyun Kim, HeonYong Kang
Abstract:
A surface riding (SR) wave energy converter (WEC) is designed, and its feasibility and performance are numerically simulated with the authors' floater-mooring-magnet-electromagnetics fully coupled dynamic analysis computer program. The biggest advantage of the SR-WEC is that it remains effective even in low sea states, and its structural robustness is greatly improved, compared to other existing WECs, by simply riding along the wave surface. Numerical simulations and actuator testing clearly demonstrate that the concept works and that its efficiency can be improved through the optimization process.
Keywords: computer simulation, electromagnetics fully-coupled dynamics, floater-mooring-magnet, optimization, performance evaluation, surface riding, WEC
Procedia PDF Downloads 145
6000 A Review of Data Visualization Best Practices: Lessons for Open Government Data Portals
Authors: Bahareh Ansari
Abstract:
Background: The Open Government Data (OGD) movement of the last decade has encouraged many government organizations around the world to make their data publicly available to advance democratic processes. But current open data platforms have not yet reached their full potential in supporting all interested parties. To make the data useful and understandable for everyone, scholars have suggested that opening the data should be supplemented by visualization. However, different visualizations of the same information can dramatically change an individual’s cognitive and emotional experience in working with the data. This study reviews the data visualization literature to create a list of methods empirically tested to enhance users’ performance and experience in working with a visualization tool. This list can be used to evaluate OGD visualization practices and inform future open data initiatives. Methods: Previous reviews of the visualization literature categorized visualization outcomes into four categories: recall/memorability, insight/comprehension, engagement, and enjoyment. To identify the papers, a search for these outcomes was conducted in the abstracts of publications in top-tier visualization venues, including IEEE Transactions on Visualization and Computer Graphics, Computer Graphics, and the proceedings of the CHI Conference on Human Factors in Computing Systems. The search results are complemented with a search in the references of the identified articles, and a search for the keywords 'open data visualization' and 'visualization evaluation' in the IEEE Xplore and ACM digital libraries. Articles are included if they provide empirical evidence through controlled user experiments, or review such empirical studies. The qualitative synthesis of the studies focuses on identifying and classifying the methods, and the conditions under which they are examined, to positively affect the visualization outcomes.
Findings: The keyword search yields 760 studies, of which 30 are included after the title/abstract review. Classification of the included articles shows five distinct methods: interactive design, aesthetic (artistic) style, storytelling, decorative elements that do not provide extra information (including text, images, and embellishments on graphs), and animation. Studies on decorative elements consistently show positive effects of these elements on user engagement and recall, but are less consistent in their examination of user performance. This inconsistency could be attributable to the particular data type or the specific design method used in each study. The interactive design studies are consistent in their findings of a positive effect on the outcomes. Storytelling studies show some inconsistencies regarding the design effect on user engagement, enjoyment, recall, and performance, which could indicate that specific conditions are required for the use of this method. The last two methods, aesthetics and animation, appear less frequently in the included articles and provide consistent positive results on some of the outcomes. Implications for e-government: The review of visualization best-practice methods shows that each method is beneficial under specific conditions. By applying these methods under potentially beneficial conditions, OGD practices can encourage a wide range of individuals to engage with government data and, ultimately, in government policy-making procedures.
Keywords: best practices, data visualization, literature review, open government data
Procedia PDF Downloads 105
5999 Phyllanthus niruri Protects against Fe2+ and SNP Induced Oxidative Damage in Mitochondrial Rich Fractions of Rat Brain
Authors: Olusola Olalekan Elekofehinti, Isaac Gbadura Adanlawo, Joao Batista Teixeira Rocha
Abstract:
We evaluated the potential neuroprotective effect of Phyllanthus niruri against Fe2+ and SNP induced oxidative stress in mitochondria of rat brain. Cellular viability was assessed by MTT reduction, and reactive oxygen species (ROS) generation was measured using the probe 2,7-dichlorofluorescein diacetate (DCFH-DA). Glutathione content was measured using dithionitrobenzoic acid (DTNB). Fe2+ (10 µM) and SNP (5 µM) significantly decreased mitochondrial activity, assessed by the MTT reduction assay, in a dose-dependent manner; this occurred in parallel with increased glutathione oxidation, ROS production, and lipid peroxidation end-products (thiobarbituric acid reactive substances, TBARS). Co-incubation with methanolic extract of Phyllanthus niruri (10-100 µg/ml) reduced the disruption of mitochondrial activity, glutathione oxidation, ROS production, and the increase in TBARS levels caused by both Fe2+ and SNP in a dose-dependent manner. HPLC analysis of the extract revealed the presence of gallic acid (20.54±0.01), caffeic acid (7.93±0.02), rutin (25.31±0.05), quercetin (31.28±0.03), and kaempferol (14.36±0.01). This result suggests that these phytochemicals account for the protective actions of Phyllanthus niruri against Fe2+ and SNP induced oxidative stress. Our results show that Phyllanthus niruri contains important bioactive molecules for the search for an improved therapy against the deleterious effects of Fe2+, an intrinsic producer of reactive oxygen species (ROS) that lead to neuronal oxidative stress and neurodegeneration.
Keywords: Phyllanthus niruri, neuroprotection, oxidative stress, mitochondria, synaptosome
Procedia PDF Downloads 359
5998 Strength and Permeability of the Granular Pavement Materials Treated with Polyacrylamide-Based Additive
Authors: Romel N. Georgees, Rayya A Hassan, Robert P. Evans, Piratheepan Jegatheesan
Abstract:
Among other traditional and non-traditional additives, polymers have shown efficient field performance and improved sustainability. Polyacrylamide (PAM) is one such additive that has demonstrated many advantages, including a reduction in permeability, an increase in durability, and the provision of strength characteristics. However, information about its effect on geotechnical characteristics is largely limited to field performance monitoring. Therefore, a laboratory investigation was carried out to examine the basic and engineering behaviors of three types of soils treated with a PAM additive. The results showed an increase in dry density and unconfined compressive strength for all the soils. The results further demonstrated an increase in unsoaked CBR and a reduction in permeability for all stabilized samples.
Keywords: CBR, hydraulic conductivity, PAM, unconfined compressive strength
Procedia PDF Downloads 374
5997 Application of Bioreactors in Regenerative Dentistry: Literature Review
Authors: Neeraj Malhotra
Abstract:
Background: Bioreactors in tissue engineering are devices that apply mechanical means to influence biological processes. They are commonly employed for stem cell culturing, growth, and expansion, as well as in 3D tissue culture. Contemporarily, their use is well established and tested extensively in the medical sciences for tissue regeneration and tissue engineering of organs such as bone, cartilage, blood vessels, skin grafts, cardiac muscle, etc. Methodology: A literature search, both electronic and by hand, was done using the following MeSH terms and keywords: bioreactors, bioreactors and dentistry, bioreactors & dental tissue engineering, bioreactors and regenerative dentistry. Only articles published in English were included in the review. Results: Bioreactors such as spinner flask, rotating wall, flow perfusion, and micro-bioreactors, as well as the in-vivo bioreactor, have been employed and tested for the regeneration of dental and related tissues. These include gingival tissue, periodontal ligament, alveolar bone, mucosa, cementum, and blood vessels. Based on their working dynamics, they can be customized in the future for regeneration of pulp tissue and whole-tooth regeneration. Apart from this, they have been successfully used in testing the clinical efficacy and biological safety of dental biomaterials. Conclusion: Bioreactors have potential use in testing dental biomaterials and in tissue engineering approaches aimed at regenerative dentistry.
Keywords: bioreactors, biological process, mechanical stimulation, regenerative dentistry, stem cells
Procedia PDF Downloads 209
5996 Debriefing Practices and Models: An Integrative Review
Authors: Judson P. LaGrone
Abstract:
Simulation-based education was once a luxurious component of nursing curricula but now serves as a vital element of an individual’s learning experience. A debriefing occurs after the simulation scenario or clinical experience is completed, allowing the instructor(s) or trained professional(s) to act as a debriefer and guide a reflection with the purpose of acknowledging, assessing, and synthesizing the thought process, decision-making process, and actions/behaviors performed during the scenario or clinical experience. Debriefing is a vital component of the simulation process and educational experience, allowing the learner(s) to progressively build upon past experiences and current scenarios within a safe and welcoming environment, with a guided dialog to enhance future practice. The aim of this integrative review was to assess current practices of debriefing models in simulation-based education for health care professionals and students. The following databases were utilized for the search: CINAHL Plus, Cochrane Database of Systematic Reviews, EBSCO (ERIC), PsycINFO (Ovid), and Google Scholar. The advanced search option was used to narrow the search (full text, Boolean operators, English language, peer-reviewed, published in the past five years). Key terms included debrief, debriefing, debriefing model, debriefing intervention, psychological debriefing, simulation, simulation-based education, simulation pedagogy, health care professional, nursing student, and learning process. Included studies focus on debriefing after clinical scenarios of nursing students, medical students, and interprofessional teams conducted between 2015 and 2020. Common themes were identified after the analysis of articles matching the search criteria. Several debriefing models are addressed in the literature, with similar effectiveness for participants in clinical simulation-based pedagogy.
Themes identified included (a) the importance of debriefing in simulation-based pedagogy, (b) the environment in which debriefing takes place, (c) the individuals who should conduct the debrief, (d) the length of the debrief, and (e) the methodology of the debrief. Debriefing models supported by theoretical frameworks and facilitated by trained staff are vital for a successful debriefing experience. Models ranged from self-debriefing, facilitator-led debriefing, and video-assisted debriefing to rapid cycle deliberate practice and reflective debriefing. A recurring finding centered on the need for continued research into systematic tool development and analysis of the validity and effectiveness of current debriefing practices. There is a lack of consistency in debriefing models across nursing curricula, with an increasing rate of ill-prepared faculty facilitating the debriefing phase of the simulation.
Keywords: debriefing model, debriefing intervention, health care professional, simulation-based education
Procedia PDF Downloads 142
5995 Portfolio Optimization with Reward-Risk Ratio Measure Based on the Mean Absolute Deviation
Authors: Wlodzimierz Ogryczak, Michal Przyluski, Tomasz Sliwinski
Abstract:
In problems of portfolio selection, the reward-risk ratio criterion is optimized to search for a risky portfolio with the maximum increase of the mean return in proportion to the increase of the risk measure, when compared to risk-free investments. In the classical Markowitz model the risk is measured by the variance, thus representing the Sharpe ratio optimization and leading to quadratic optimization problems. Several Linear Programming (LP) computable risk measures have been introduced and applied in portfolio optimization. In particular, the Mean Absolute Deviation (MAD) measure has been widely recognized. The reward-risk ratio optimization with the MAD measure can be transformed into an LP formulation with the number of constraints proportional to the number of scenarios and the number of variables proportional to the total of the number of scenarios and the number of instruments. This may lead to LP models with a huge number of variables and constraints in the case of real-life financial decisions based on several thousands of scenarios, thus decreasing their computational efficiency and making them hardly solvable by general LP tools. We show that the computational efficiency can then be dramatically improved by an alternative model based on the inverse risk-reward ratio minimization and by taking advantage of LP duality. In the introduced LP model, the number of structural constraints is proportional to the number of instruments; the number of scenarios thus does not seriously affect the efficiency of the simplex method, guaranteeing easy solvability. Moreover, we show that under a natural restriction on the target value, the MAD risk-reward ratio optimization is consistent with the second order stochastic dominance rules.
Keywords: portfolio optimization, reward-risk ratio, mean absolute deviation, linear programming
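For a fixed portfolio, the MAD risk measure and the reward-risk ratio being optimized are straightforward to compute over a set of scenarios. The sketch below shows only these quantities (variable names are hypothetical); the paper's actual contribution, the dual LP reformulation, is not reproduced here.

```python
def mad(returns, weights):
    """Mean absolute deviation of portfolio returns across scenarios:
    returns[s][i] is the return of instrument i under scenario s."""
    port = [sum(w * r for w, r in zip(weights, scen)) for scen in returns]
    mean = sum(port) / len(port)
    return sum(abs(x - mean) for x in port) / len(port)

def reward_risk_ratio(returns, weights, rf=0.0):
    """Excess mean return per unit of MAD risk (the criterion whose
    maximization the paper reformulates as an LP)."""
    port = [sum(w * r for w, r in zip(weights, scen)) for scen in returns]
    mean = sum(port) / len(port)
    return (mean - rf) / mad(returns, weights)
```

Because MAD is piecewise linear in the weights, maximizing this ratio (unlike the variance-based Sharpe ratio) admits an LP formulation, which is the starting point of the abstract's argument.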
Procedia PDF Downloads 406
5994 Farmer-Participatory Variety Trials for Tomato and Chili Pepper in East Java
Authors: Hanik Anggraeni, Evy Latifah, Putu Bagus, Joko Mariyono
Abstract:
This study tests the adaptation capacity of several selected lines and varieties of chili and tomato in farmers’ lands. Five improved lines and varieties of tomato and chili were selected based on the best performance in previous trials. Two participating farmers managed the trials. Agronomic aspects were used as performance indicators. The results show that several improved lines of tomato and chili performed better than others. However, the performance depended on altitude and season. Lines that performed better at high altitude could not do the same at low altitude, and vice versa; the same held across seasons. Farmers were expected to select the best lines according to their locations.
Keywords: variety trials, tomato and chili, participatory farmers, East Java
Procedia PDF Downloads 234
5993 Fabrication of Gold Nanoparticles Self-Assembled Functionalized Improved Graphene on Carbon Paste Electrode for Electrochemical Determination of Levodopa in the Presence of Ascorbic Acid
Authors: Mohammad Ali Karimi, Hossein Tavallali, Abdolhamid Hatefi-Mehrjardi
Abstract:
In this study, an electrochemical sensor based on gold nanoparticles (AuNPs) functionalized improved graphene (AuNPs-IGE) was fabricated by a novel self-assembly method for the selective determination of L-dopa in the presence of ascorbic acid. The AuNPs-IGE modified carbon paste electrode (AuNPs-IGE/CPE) was utilized to investigate the electrochemical behavior of L-dopa in phosphate buffer solution. Compared to bare CPE, AuNPs-IGE/CPE shows novel properties towards the electrochemical redox of levodopa (L-dopa) in phosphate buffer solution at pH 4.0. The oxidation potential of L-dopa shows a significant decrease at the AuNPs-IGE/CPE, and the oxidation current of L-dopa is higher than that at the unmodified CPE. AuNPs-IGE/CPE also shows excellent electrocatalytic activity for the oxidation of ascorbic acid (AA). Using the differential pulse voltammetry (DPV) method, the oxidation current is linear with L-dopa concentration in the range of 0.4–50 µmol L-1, with a detection limit of about 1.41 nmol L-1 (S/N = 3). The electrode was therefore applied to measure L-dopa in real samples, with recoveries of 94.6–106.2%. The proposed electrode can also effectively avoid the interference of ascorbic acid, making the proposed sensor suitable for the accurate determination of L-dopa in both pharmaceutical preparations and human body fluids.
Keywords: gold nanoparticles, improved graphene, L-dopa, self-assembly
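A DPV calibration of the kind described above amounts to an ordinary least-squares line fit of peak current against concentration, from which unknown concentrations are read back. The sketch below is generic, not the authors' procedure, and the data in the usage check are made up for illustration.

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = slope * x + intercept, as used
    for a calibration curve of peak current vs. concentration."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx
```

Given the fitted slope and intercept, an unknown sample's concentration is `(current - intercept) / slope`, valid only inside the linear range (here, 0.4–50 µmol L-1).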
Procedia PDF Downloads 456
5992 Computer Aided Diagnosis Bringing Changes in Breast Cancer Detection
Authors: Devadrita Dey Sarkar
Abstract:
Regardless of the many technologic advances in the past decade, increased training and experience, and the obvious benefits of uniform standards, the false-negative rate in screening mammography remains unacceptably high. A computer-aided neural network classification of regions of suspicion (ROS) on digitized mammograms is presented in this abstract, which employs features extracted by a new technique based on independent component analysis. CAD is a concept established by taking into account equally the roles of physicians and computers, whereas automated computer diagnosis is a concept based on computer algorithms only. With CAD, the performance by computers does not have to be comparable to or better than that by physicians, but needs to be complementary to it. In fact, a large number of CAD systems have been employed for assisting physicians in the early detection of breast cancers on mammograms. A CAD scheme that makes use of lateral breast images has the potential to improve the overall performance in the detection of breast lumps. Because breast lumps can be detected reliably by computer on lateral breast mammograms, radiologists’ accuracy in the detection of breast lumps would be improved by the use of CAD, and thus early diagnosis of breast cancer would become possible. In the future, many CAD schemes could be assembled as packages and implemented as part of PACS. For example, the package for breast CAD may include the computerized detection of breast nodules as well as the computerized classification of benign and malignant nodules. To assist in the differential diagnosis, it would be possible to search for and retrieve images (or lesions) with these CAD systems, which would be a reliable and useful method for quantifying the similarity of a pair of images for visual comparison by radiologists.
Keywords: CAD (computer-aided diagnosis), lesions, neural network, ROS (region of suspicion)
Procedia PDF Downloads 221
5991 Measuring the Height of a Person in Closed Circuit Television Video Footage Using 3D Human Body Model
Authors: Dojoon Jung, Kiwoong Moon, Joong Lee
Abstract:
The height of a criminal is one of the important clues that can determine the scope of a suspect search or exclude a suspect from the search target. Although measuring the height of criminals from video alone is limited for various reasons, if the 3D data of the scene and the Closed Circuit Television (CCTV) footage are matched, the height of the criminal can be measured. However, it is still difficult to measure height from CCTV footage with this non-contact measurement method because of variables such as the position, posture, and head shape of criminals. In this paper, we propose a method of matching the CCTV footage with the 3D data of the crime scene and measuring the height of the person using a 3D human body model in the matched data. In the proposed method, the height is measured using the 3D human model in various scenes of the person in the CCTV footage, and the measurement value of the target person is corrected by the measurement error observed in replayed CCTV footage of a reference person. We tested walking CCTV footage of 20 people captured indoors and outdoors and corrected the measurement values with 5 reference persons. Experimental results show that the average measurement error (true value minus measured value) is 0.45 cm, and this method is effective for measuring a person's height in CCTV footage.
Keywords: human height, CCTV footage, 2D/3D matching, 3D human body model
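The correction step described above can be sketched as a simple bias adjustment: the mean error observed on reference persons of known height is assumed to transfer to the target measurement. Function and variable names are hypothetical; this is not the authors' code.

```python
def corrected_height(measured_target, measured_refs, true_refs):
    """Correct a height measured from CCTV footage using the mean
    measurement error (true - measured) observed on reference
    persons of known height filmed by the same camera."""
    errors = [t - m for t, m in zip(true_refs, measured_refs)]
    bias = sum(errors) / len(errors)
    return measured_target + bias
```

For example, if two reference persons are both measured 1 cm short, a target measured at 174.0 cm would be corrected to 175.0 cm.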
Procedia PDF Downloads 248
5990 An Automated Optimal Robotic Assembly Sequence Planning Using Artificial Bee Colony Algorithm
Authors: Balamurali Gunji, B. B. V. L. Deepak, B. B. Biswal, Amrutha Rout, Golak Bihari Mohanta
Abstract:
Robots play an important role in operations like pick and place, assembly, spot welding, and much more in manufacturing industries. Among these, assembly is a very important process, accounting for about 20% of total manufacturing cost. To perform the assembly task effectively, Assembly Sequence Planning (ASP) is required. ASP is a multi-objective, non-deterministic optimization problem: achieving the optimal assembly sequence involves a huge search space and is highly complex in nature. Many researchers have applied different algorithms to the ASP problem, but these have several limitations, such as convergence to locally optimal solutions, huge search spaces, long execution times, and complexity in applying the algorithm. Keeping these limitations in mind, this paper proposes a new automated optimal robotic assembly sequence planning method using the Artificial Bee Colony (ABC) algorithm. In this approach, assembly predicates are extracted automatically through a Computer Aided Design (CAD) interface instead of manually, which reduces the time needed to extract the assembly predicates and obtain a feasible assembly sequence. The fitness evaluation of the obtained feasible sequence is carried out using the ABC algorithm to generate the optimal assembly sequence. The proposed methodology is applied to different industrial products, and the results are compared with past literature.
Keywords: assembly sequence planning, CAD, artificial bee colony algorithm, assembly predicates
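The ABC metaheuristic itself can be illustrated on a generic continuous test function. The sketch below is not the paper's discrete assembly-sequence formulation (which encodes CAD-extracted predicates and a sequence-fitness function), just the bee-colony search loop: greedy neighborhood moves around food sources, plus a scout phase that abandons sources that stop improving.

```python
import random

def abc_minimize(f, dim, bounds, colony=20, limit=10, iters=100, seed=0):
    """Minimal artificial bee colony for continuous minimization
    (illustrative sketch of the ABC metaheuristic only)."""
    rng = random.Random(seed)
    lo, hi = bounds
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(colony)]
    fits = [f(x) for x in foods]
    trials = [0] * colony
    best_fit = min(fits)
    best_x = foods[fits.index(best_fit)][:]
    for _ in range(iters):
        for i in range(colony):  # employed/onlooker phases collapsed
            k = rng.randrange(colony)
            j = rng.randrange(dim)
            cand = foods[i][:]
            # move toward/away from a random partner in one dimension
            cand[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
            cand[j] = min(hi, max(lo, cand[j]))
            fc = f(cand)
            if fc < fits[i]:           # greedy selection
                foods[i], fits[i], trials[i] = cand, fc, 0
                if fc < best_fit:
                    best_fit, best_x = fc, cand[:]
            else:
                trials[i] += 1
            if trials[i] > limit:      # scout phase: abandon the source
                foods[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                fits[i], trials[i] = f(foods[i]), 0
                if fits[i] < best_fit:
                    best_fit, best_x = fits[i], foods[i][:]
    return best_fit, best_x
```

Applying ABC to assembly sequences, as the paper does, replaces the continuous neighborhood move with moves over feasible sequences derived from the assembly predicates.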
Procedia PDF Downloads 237
5989 Interoperability Maturity Models for Consideration When Using School Management Systems in South Africa: A Scoping Review
Authors: Keneilwe Maremi, Marlien Herselman, Adele Botha
Abstract:
The main purpose and focus of this paper are to determine the Interoperability Maturity Models to consider when using School Management Systems (SMS). The importance of this is to inform and help schools to know which Interoperability Maturity Model is best suited for their SMS. To address this purpose, the paper applies a scoping review to ensure that all aspects are covered. The scoping review includes papers written from 2012 to 2019, and the different types of Interoperability Maturity Models are compared in detail, including the background information, the levels of interoperability, and the areas for consideration in each Maturity Model. The literature was obtained from the following databases: IEEE Xplore and Scopus; the following search engines were used: Harzing's and Google Scholar. The topic of the paper was used as a search term for the literature, and the term 'Interoperability Maturity Models' was used as a keyword. The data were analyzed in terms of the definition of interoperability, Interoperability Maturity Models, and levels of interoperability. The results provide a table that shows the focus area of concern for each Maturity Model, based on the scoping review in which only 24 papers were found to be best suited out of 740 publications initially identified in the field. This resulted in the most discussed Interoperability Maturity Models for consideration: the Information Systems Interoperability Maturity Model (ISIMM) and the Organizational Interoperability Maturity Model for C2 (OIM).
Keywords: interoperability, interoperability maturity model, school management system, scoping review
Procedia PDF Downloads 209
5988 Geospatial Techniques for Impact Assessment of Canal Rehabilitation Program in Sindh, Pakistan
Authors: Sumaira Zafar, Arjumand Zaidi, Muhammad Arslan Hafeez
Abstract:
The Indus Basin Irrigation System (IBIS) is the largest contiguous irrigation system in the world, comprising the Indus River and its tributaries, canals, distributaries, and watercourses. A big challenge faced by IBIS is transmission losses through seepage and leaks, which account for 41 percent of the total water derived from the river, about 40 percent of that being lost through watercourses. Irrigation system rehabilitation programs in Pakistan are focused on improvement of the canal system at the watercourse level (tertiary channels). Under these irrigation system management programs, more than 22,800 of 43,000 watercourses (12,900 kilometers) have been improved or lined. An evaluation of the improvement work is required at this stage to verify the success of the programs. In this paper, the emerging technologies of GIS and satellite remote sensing are used for impact assessment of watercourse rehabilitation work in Sindh. To evaluate the efficiency of the improved watercourses, a few parameters are selected, such as soil moisture along watercourses, availability of water at the tail end, and changes in cultivable command areas. Improved watercourse details and maps were acquired from the National Program for Improvement of Watercourses (NPIW) and the Space and Upper Atmospheric Research Commission (SUPARCO). High resolution satellite images from Google Earth for the years 2004 to 2013 are used for digitizing command areas. Temporal maps of cultivable command areas show a noticeable increase in the cultivable land served by improved watercourses. Field visits were conducted to validate the results. Interviews with farmers and landowners also reveal their overall satisfaction in terms of availability of water at the tail end and increased crop production.
Keywords: geospatial, impact assessment, watercourses, GIS, remote sensing, seepage, canal lining
Procedia PDF Downloads 351
5987 Productivity and Household Welfare Impact of Technology Adoption: A Microeconometric Analysis
Authors: Tigist Mekonnen Melesse
Abstract:
Since rural households are basically entitled to food through their own production, improving productivity can enhance the welfare of the rural population through higher food availability at the household level and lower prices of agricultural products. Increasing agricultural productivity through the use of improved technology is one of the desired outcomes of sensible food security and agricultural policy. The ultimate objective of this study was to evaluate the potential impact of improved agricultural technology adoption on smallholders’ crop productivity and welfare. The study is conducted in Ethiopia, covering 1,500 rural households drawn from four regions and 15 rural villages, based on data collected by the Ethiopian Rural Household Survey. An endogenous treatment-effect model is employed to account for the selection bias expected from households' self-selection into technology adoption. The treatment indicator, technology adoption, is a binary variable indicating whether the household used improved seeds and chemical fertilizer or not. The outcome variables were cereal crop productivity, measured as the real value of production, and household welfare, measured as real per capita consumption expenditure. Results of the analysis indicate a positive and significant effect of improved technology use on rural households’ crop productivity and welfare in Ethiopia. Adoption of improved seeds or chemical fertilizer alone increases crop productivity by 7.38 and 6.32 percent per year, respectively. Adoption of these technologies is also found to improve households’ welfare by 1.17 and 0.25 percent per month, respectively. When both technologies are adopted jointly, the combined effect is a 5.82 percent increase in crop productivity and a 0.42 percent improvement in welfare.
In addition, the educational level of the household head, farm size, labor use, participation in extension programs, input expenditure, and number of oxen positively affect crop productivity and household welfare, while large household size negatively affects household welfare. In our estimation, the average treatment effect of technology adoption on the treated (ATET) is the same as the average treatment effect (ATE). This implies that the average predicted outcome for the treatment group is similar to the average predicted outcome for the whole population.
Keywords: endogenous treatment effect, technologies, productivity, welfare, Ethiopia
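For contrast with the endogenous treatment-effect model used in the study, the naive estimator that ignores self-selection is just a treated/control mean difference. The abstract's point is that this quantity is biased when adoption is self-selected; the sketch and data below are hypothetical, shown only as the biased baseline.

```python
def naive_ate(outcomes, treated):
    """Naive treatment/control mean difference. This ignores the
    self-selection that an endogenous treatment-effect model
    corrects for, so it conflates adoption effects with selection."""
    t = [y for y, d in zip(outcomes, treated) if d]
    c = [y for y, d in zip(outcomes, treated) if not d]
    return sum(t) / len(t) - sum(c) / len(c)
```

If, say, more educated farmers both adopt more often and produce more regardless of adoption, this difference overstates the true treatment effect, which is why the study instead models the adoption decision jointly with the outcome.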
Procedia PDF Downloads 655
5986 Two-Dimensional WO₃ and TiO₂ Semiconductor Oxides Developed by Atomic Layer Deposition with Controllable Nano-Thickness on Wafer-Scale
Authors: S. Zhuiykov, Z. Wei
Abstract:
Conformal, defect-free two-dimensional (2D) WO₃ and TiO₂ semiconductors have been developed by the atomic layer deposition (ALD) technique on wafer scale, with a unique approach to thickness control with a precision of ±10%, from a monolayer of nanomaterial (less than 1.0 nm thick) to nano-layered 2D structures with a thickness of ~3.0-7.0 nm. The developed 2D nanostructures exhibited unique, distinguishable properties at the nanoscale compared to their thicker counterparts. Specifically, a 2D TiO₂-Au bilayer demonstrated improved photocatalytic degradation of palmitic acid under UV and visible light illumination. The improved functional capabilities of 2D semiconductors would be advantageous to various environmental, nano-energy, and bio-sensing applications. The ALD-enabled approach is proven to be versatile, scalable, and applicable to a broader range of 2D semiconductors.
Keywords: two-dimensional (2D) semiconductors, ALD, WO₃, TiO₂, wafer scale
Procedia PDF Downloads 153
5985 Graphitic Carbon Nitride-CeO₂ Nanocomposite for Photocatalytic Degradation of Methyl Red
Authors: Khansaa Al-Essa
Abstract:
Nanosized ceria (CeO₂) and a graphitic carbon nitride-loaded ceria (CeO₂/GCN) nanocomposite have been synthesized by the coprecipitation method, and their photocatalytic activity for methyl red degradation under visible light irradiation was studied. A phase formation study was carried out using the X-ray diffraction technique, and it revealed that ceria (CeO₂) is properly supported on the surface of GCN. The ceria nanoparticles and the CeO₂/GCN nanocomposite were confirmed by the transmission electron microscopy technique. The particle size of the CeO₂ and CeO₂/GCN nanocomposite is in the range of 10-15 nm. The photocatalytic activity of the CeO₂/GCN composite was improved as compared to CeO₂ alone. The enhanced photocatalytic activity is attributed to the increased visible light absorption and improved adsorption of the dye on the surface of the composite catalyst.
Keywords: photodegradation, dye, nanocomposite, graphitic carbon nitride-CeO₂
Procedia PDF Downloads 20
5984 The Role of Metaheuristic Approaches in Engineering Problems
Authors: Ferzat Anka
Abstract:
Many types of problems can be solved using traditional analytical methods. However, these methods take a long time and cause inefficient use of resources. In particular, different approaches may be required in solving the complex and global engineering problems that we frequently encounter in real life. The bigger and more complex a problem, the harder it is to solve. Such problems are called Nondeterministic Polynomial time (NP-hard) problems in the literature. The main reasons for recommending metaheuristic algorithms for various problems are their use of simple concepts, simple mathematical equations and structures, and non-derivative mechanisms, their avoidance of local optima, and their fast convergence. They are also flexible, as they can be applied to different problems without very specific modifications. Thanks to these features, they can easily be embedded even in many hardware devices. Accordingly, this approach can also be used in trending application areas such as IoT, big data, and parallel structures. Indeed, metaheuristic approaches are algorithms that return near-optimal results for solving large-scale optimization problems. This study is focused on a new metaheuristic method that has been merged with a chaotic approach. It is based on chaos theory and helps the relevant algorithm to improve the diversity of the population and converge quickly. The approach builds on the Chimp Optimization Algorithm (ChOA), a recently introduced metaheuristic algorithm inspired by nature. This algorithm identified four types of chimpanzee groups: attacker, barrier, chaser, and driver, and proposed a suitable mathematical model for them based on the various intelligence and sexual motivations of chimpanzees. However, this algorithm is not very successful in its convergence rate or in escaping the local optimum trap when solving high-dimensional problems.
Although it and some of its variants use strategies to overcome these problems, it is observed that they are not sufficient. Therefore, in this study, a newly expanded variant is described. In the algorithm, called Ex-ChOA, hybrid models are proposed for the position updates of search agents, and a dynamic switching mechanism is provided for transition phases. This flexible structure solves the slow convergence problem of ChOA and improves its accuracy on multidimensional problems. Therefore, it aims to achieve success in solving global, complex, and constrained problems. The main contributions of this study are: 1) it improves the accuracy and solves the slow convergence problem of ChOA; 2) it proposes new hybrid movement strategy models for the position updates of search agents; 3) it provides success in solving global, complex, and constrained problems; 4) it provides a dynamic switching mechanism between phases. The performance of the Ex-ChOA algorithm is analyzed on a total of 8 benchmark functions, as well as a total of 2 classical and constrained engineering problems. The proposed algorithm is compared with ChOA and several well-known variants (Weighted-ChOA, Enhanced-ChOA). In addition, the Improved Grey Wolf Optimizer (I-GWO) method is chosen for comparison since its working model is similar. The obtained results depict that the proposed algorithm performs better than or equivalently to the compared algorithms.
Keywords: optimization, metaheuristic, chimp optimization algorithm, engineering constrained problems
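The chaotic ingredient can be illustrated with a minimal sketch: a logistic map drives the step direction of a greedy search on the sphere benchmark. This is a simplified stand-in, not the Ex-ChOA position-update model; the map, step schedule, and parameter values are illustrative assumptions:

```python
import random

random.seed(1)

def sphere(x):
    # Benchmark objective: global minimum 0 at the origin.
    return sum(v * v for v in x)

def logistic_map(c):
    # Chaotic sequence in (0, 1), used to diversify step directions.
    return 4.0 * c * (1.0 - c)

def chaotic_search(dim=5, iters=2000):
    best = [random.uniform(-5, 5) for _ in range(dim)]
    best_f = sphere(best)
    c = 0.7  # chaotic seed: any value in (0, 1) off the map's fixed points
    for t in range(iters):
        c = logistic_map(c)
        step = 5.0 * (1.0 - t / iters)  # shrinking step for convergence
        cand = [b + step * (2.0 * c - 1.0) * random.gauss(0, 1) for b in best]
        f = sphere(cand)
        if f < best_f:  # greedy acceptance: keep only improvements
            best, best_f = cand, f
    return best_f

result = chaotic_search()
print(result)
```

The chaotic sequence never settles into a cycle, so candidate steps keep varying in sign and magnitude even late in the run, which is the diversity benefit the abstract refers to.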
Procedia PDF Downloads 77
5983 Interoperability Model Design of Smart Grid Power System
Authors: Seon-Hack Hong, Tae-Il Choi
Abstract:
Interoperability is defined as the ability of systems, components, and devices developed by different entities to exchange information smoothly and function organically without mutual consultation; that is, the ability of two or more systems, whether of the same type or different types, to communicate with each other and to use the exchanged information without extra effort. Securing interoperability remedies insufficiencies in the electric power system such as duplication of functions when developing systems and applications, and low efficiency due to the lack of a mutual information transmission system between application programs, and it improves the seamless linkage of newly developed systems. Since it is necessary to secure interoperability for these purposes, we designed a smart grid-based interoperability standard model in this paper.
Keywords: interoperability, power system, common information model, SCADA, IEEE 2030, Zephyr
Procedia PDF Downloads 124
5982 Improved Impossible Differential Cryptanalysis of Midori64
Authors: Zhan Chen, Wenquan Bi, Xiaoyun Wang
Abstract:
The Midori family of lightweight block ciphers was proposed at ASIACRYPT 2015 and has attracted the attention of numerous cryptanalysts. There are two versions of Midori: Midori64, which has a 64-bit block size, and Midori128, whose block size is 128 bits. In this paper, an improved 10-round impossible differential attack on Midori64 is proposed. Pre-whitening keys are considered in this attack. A better impossible differential path is used to reduce the time complexity by decreasing the number of key bits guessed. A hash table is built in the pre-computation phase to reduce the computational complexity. A partial abort technique is used in the key sieving phase. The attack requires 2^59 chosen plaintexts, 2^14.58 blocks of memory, and 2^68.83 10-round Midori64 encryptions.
Keywords: cryptanalysis, impossible differential, lightweight block cipher, Midori
Procedia PDF Downloads 348
5981 Improved Mechanical and Electrical Properties and Thermal Stability of Post-Consumer Polyethylene Terephthalate Glycol Containing Hybrid System of Nanofillers
Authors: Iman Taraghi, Sandra Paszkiewicz, Daria Pawlikowska, Anna Szymczyk, Izabela Irska, Rafal Stanik, Amelia Linares, Tiberio A. Ezquerra, Elżbieta Piesowicz
Abstract:
Currently, the massive use of thermoplastic materials in industrial applications generates huge amounts of polymer waste. Poly(ethylene glycol-co-1,4-cyclohexanedimethanol terephthalate) (PET-G) has been widely used in food packaging and polymer foils. In this research, PET-G foils have been recycled and reused as a matrix combined with different types of nanofillers such as carbon nanotubes, graphene nanoplatelets, and nanosized carbon black. The mechanical and electrical properties, as well as the thermal stability and thermal conductivity, of the PET-G improved with the addition of the aforementioned nanofillers and hybrid systems thereof.
Keywords: polymer hybrid nanocomposites, carbon nanofillers, recycling, physical performance
Procedia PDF Downloads 136
5980 Load Characteristics of Improved Howland Current Pump for Bio-Impedance Measurement
Authors: Zhao Weijie, Lin Xinjian, Liu Xiaojuan, Li Lihua
Abstract:
The Howland current pump is widely used in bio-impedance measurement. Much attention has been focused on the output impedance of the Howland circuit. Here we focus on the maximum load of the Howland source and discuss the relationship between the circuit parameters at maximum load. We conclude that the feedback resistor at the signal input terminal should be as large as possible, while the current-limiting resistor should be smaller. The op-amp saturation voltage should also be high. The bandwidth of the circuit is proportional to the bandwidth of the op-amp. The Howland current pump was simulated using Multisim 12. When the AD8066AR was selected as the op-amp, the maximum load was 11.5 kΩ, and the Howland current pump had a stable output current of 2 mA peak-to-peak at frequencies up to 200 kHz. With an OPA847 op-amp and a load of 6.3 kΩ, the output current was likewise stable, at frequencies as high as 3 MHz.
Keywords: bio-impedance, improved Howland current pump, load characteristics, bioengineering
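The compliance-voltage reasoning behind a maximum-load figure like 11.5 kΩ can be sketched numerically; in this back-of-the-envelope estimate, the saturation voltage and the current-limiting resistor value are assumptions chosen for illustration, not measured AD8066AR parameters:

```python
# Compliance-limited maximum load of a current source: the op-amp output
# must drive I_peak through the current-limiting resistor and the load
# without saturating, so R_load_max ≈ (V_sat - I_peak * R_limit) / I_peak.
v_sat = 11.6     # assumed op-amp saturation voltage, volts (illustrative)
i_pp = 2e-3      # 2 mA peak-to-peak output current
i_peak = i_pp / 2
r_limit = 100.0  # hypothetical current-limiting resistor, ohms

r_load_max = (v_sat - i_peak * r_limit) / i_peak
print(round(r_load_max))  # maximum load in ohms
```

The formula makes the abstract's qualitative conclusions explicit: a higher saturation voltage raises the maximum load, while a larger current-limiting resistor lowers it.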
Procedia PDF Downloads 513
5979 Modification of Four Layer through the Thickness Woven Structure for Improved Impact Resistance
Authors: Muhammad Liaqat, Hafiz Abdul Samad, Syed Talha Ali Hamdani, Yasir Nawab
Abstract:
In the current research, the four-layer, orthogonal, through-the-thickness 3D woven fabric structure was modified to improve the impact resistance of 3D fabric-reinforced composites. This was achieved by imparting auxeticity into the four-layer through-the-thickness woven structure. A comparison was made between the standard and modified four-layer through-the-thickness woven structures in terms of auxeticity, penetration resistance, and impact resistance. It was found that the modified structure showed auxeticity in both the warp and weft directions. It was also found that the penetration resistance of the modified sample was lower than that of the standard structure, but the impact resistance of the modified four-layer through-the-thickness woven structure was improved by up to 6.7%.
Keywords: 2D woven, 3D fabrics, auxetic, impact resistance, orthogonal through the thickness
Procedia PDF Downloads 337
5978 Farmers’ Awareness of Pillars of Planting for Food and Jobs Programme in Ghana
Authors: Franklin Nantui Mabe, Gideon Danso-Abbeam, Dennis Sedem Ehiakpor
Abstract:
In order for the government of Ghana, through the Ministry of Food and Agriculture, to motivate farmers to adopt improved agricultural technologies, expand their farms, and encourage youth to enter into agricultural production so as to increase crop productivity, the “Planting for Food and Jobs” (PFJ) programme was launched in April 2017. The PFJ programme covers five pillars, namely: provision of subsidized and improved seeds; subsidized fertilizer; agricultural extension services; establishment of markets; and e-agriculture. This study assesses the awareness of farmers about the packages of these pillars using a Likert scale, a paired t-test, and Spearman’s rank correlation coefficient. The study adopted a mixed research design. A semi-structured questionnaire and a checklist were used to collect data. The data collection was done using interviews and focus group discussions. The PFJ pillar farmers are most aware of is the fertilizer subsidy, followed by the improved-seed subsidy. Electronic agriculture is the pillar with the lowest level of awareness. There is a strong positive correlation between awareness of the fertilizer and seed packages, suggesting their complementarity. Lack of information about or awareness of the packages of the programme can affect farmers’ participation in all the pillars. Farmers, in particular, should be educated so that they know what they are entitled to in each of the pillars. The programme implementation plan should also be made available to farmers as a guide.
Keywords: awareness, planting for food and jobs, programme, farmers, Likert scale
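Spearman's rank correlation, used above to relate fertilizer and seed awareness, can be computed directly; a minimal sketch on hypothetical Likert-scale scores (the data are invented for illustration, not survey responses):

```python
def rank(values):
    # Assign average ranks, handling ties (standard for Spearman's rho).
    srt = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(srt):
        j = i
        while j + 1 < len(srt) and values[srt[j + 1]] == values[srt[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[srt[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    # Pearson correlation applied to the ranks.
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical 1-5 Likert awareness scores for ten farmers.
fertilizer = [5, 4, 4, 5, 3, 5, 4, 2, 5, 4]
seeds      = [4, 4, 3, 5, 3, 4, 4, 2, 5, 3]
print(round(spearman(fertilizer, seeds), 3))
```

A value close to +1, as in this invented sample, is the kind of result that would support the complementarity interpretation in the abstract.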
Procedia PDF Downloads 231
5977 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm
Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn
Abstract:
Feature selection and attribute reduction are crucial problems and widely used techniques in the fields of machine learning, data mining, and pattern recognition, employed to overcome the well-known phenomenon of the curse of dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we have employed the fuzzy-rough dependency degree (FRDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS) due to its effectiveness in feature selection. As for the search strategy, a modified version of the binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets, including nine high-dimensional and large ones, from the UCI repository. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.
Keywords: binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct
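The search strategy can be illustrated with a minimal sketch of a binary shuffled frog leaping loop on a toy feature-selection fitness. The fitness surrogate (standing in for the fuzzy-rough dependency degree), the leap rule, and all parameter values are illustrative assumptions, not the paper's B-SFLA:

```python
import random
from math import exp

random.seed(0)

N_FEATURES = 12
INFORMATIVE = {0, 3, 5, 8}  # toy ground truth: only these features matter

def fitness(frog):
    # Toy surrogate for the fuzzy-rough dependency degree: reward
    # selecting informative features, penalize subset size.
    selected = {i for i, bit in enumerate(frog) if bit}
    return len(selected & INFORMATIVE) - 0.1 * len(selected)

def sigmoid(v):
    return 1.0 / (1.0 + exp(-v))

def leap(worst, best):
    # Binary frog leap: move the worst frog toward the best frog via a
    # sigmoid-thresholded step on each bit.
    return [1 if random.random() < sigmoid(2.0 * random.random() * (b - w)
                                           + w - 0.5) else 0
            for w, b in zip(worst, best)]

def b_sfla(n_frogs=30, n_memeplexes=3, generations=60):
    frogs = [[random.randint(0, 1) for _ in range(N_FEATURES)]
             for _ in range(n_frogs)]
    for _ in range(generations):
        frogs.sort(key=fitness, reverse=True)
        global_best = frogs[0]
        # Shuffle frogs into memeplexes; improve each memeplex's worst frog.
        for m in range(n_memeplexes):
            memeplex = frogs[m::n_memeplexes]
            local_best, local_worst = memeplex[0], memeplex[-1]
            cand = leap(local_worst, local_best)
            if fitness(cand) <= fitness(local_worst):
                cand = leap(local_worst, global_best)  # retry toward global
            if fitness(cand) > fitness(local_worst):
                frogs[frogs.index(local_worst)] = cand
    return max(frogs, key=fitness)

best = b_sfla()
print(sorted(i for i, bit in enumerate(best) if bit))
```

In the full method the toy fitness would be replaced by the FRDD evaluated on the candidate feature subset, which is where the fuzzy-rough component enters.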
Procedia PDF Downloads 225
5976 Fiber Stiffness Detection of GFRP Using Combined ABAQUS and Genetic Algorithms
Authors: Gyu-Dong Kim, Wuk-Jae Yoo, Sang-Youl Lee
Abstract:
Composite structures offer numerous advantages over conventional structural systems in the form of higher specific stiffness and strength, lower life-cycle costs, and benefits such as easy installation and improved safety. Recently, there has been a considerable increase in the use of composites in engineering applications and as wraps for seismic upgrading and repairs. However, these composites deteriorate with time because of outdated materials, excessive use, repetitive loading, climatic conditions, manufacturing errors, and deficiencies in inspection methods. In particular, damaged fibers in a composite result in significant degradation of structural performance. In order to reduce the failure probability of composites in service, techniques to assess the condition of the composites and prevent the continual growth of fiber damage are required. Condition assessment technology and nondestructive evaluation (NDE) techniques have provided various solutions for the safety of structures by detecting damage or defects from the static or dynamic responses induced by external loading. A variety of techniques based on detecting changes in the static or dynamic behavior of isotropic structures have been developed over the last two decades. These methods, based on analytical approaches, are limited in their capability to deal with complex systems, primarily because of their limitations in handling different loading and boundary conditions. Recently, investigators have introduced direct search methods based on metaheuristic techniques and artificial intelligence, such as genetic algorithms (GA), simulated annealing (SA) methods, and neural networks (NN), and have promisingly applied these methods to the field of structural identification.
Among them, GAs attract our attention because they do not require a considerable amount of data in advance when dealing with complex problems and can make a global solution search possible, as opposed to classical gradient-based optimization techniques. In this study, we propose an alternative damage-detection technique that can determine the degraded stiffness distribution of vibrating laminated composites made of glass fiber-reinforced polymer (GFRP). The proposed method uses a modified form of the bivariate Gaussian distribution function to detect degraded stiffness characteristics. In addition, this study presents a method to detect the fiber property variation of laminated composite plates from the micromechanical point of view. A finite element model is used to study the free vibrations of laminated composite plates with fiber stiffness degradation. In order to solve the inverse problem using the combined method, this study uses only the first mode shapes of a structure for the measured frequency data. In particular, this study focuses on the effect of the interaction among various parameters, such as fiber angles, layup sequences, and damage distributions, on fiber-stiffness damage detection.
Keywords: stiffness detection, fiber damage, genetic algorithm, layup sequences
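The GA-based inverse identification idea can be sketched with a small genetic algorithm; here a toy linear frequency model stands in for the ABAQUS finite element analysis, and the "measured" frequencies, weight matrix, stiffness bounds, and GA parameters are all illustrative assumptions:

```python
import random
from math import sqrt

random.seed(3)

# Toy forward model replacing the finite-element analysis: each modal
# frequency is a weighted combination of four zone stiffness factors.
WEIGHTS = [[0.4, 0.3, 0.2, 0.1],
           [0.1, 0.4, 0.4, 0.1],
           [0.2, 0.2, 0.3, 0.3],
           [0.3, 0.1, 0.1, 0.5]]
TRUE_D = [1.0, 0.7, 1.0, 0.9]  # assumed degraded-stiffness distribution

def frequencies(d):
    return [sqrt(sum(w * x for w, x in zip(row, d))) for row in WEIGHTS]

MEASURED = frequencies(TRUE_D)  # pretend these came from vibration tests

def error(d):
    # Inverse-problem objective: misfit to the measured frequencies.
    return sum((a - b) ** 2 for a, b in zip(frequencies(d), MEASURED))

def ga(pop_size=50, generations=200):
    pop = [[random.uniform(0.3, 1.0) for _ in range(4)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=error)
        elite = pop[: pop_size // 5]          # elitist selection
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = random.sample(elite, 2)
            cut = random.randrange(1, 4)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            if random.random() < 0.3:         # Gaussian mutation
                i = random.randrange(4)
                child[i] = min(1.0, max(0.3, child[i] + random.gauss(0, 0.05)))
            children.append(child)
        pop = elite + children
    return min(pop, key=error)

best = ga()
print([round(x, 2) for x in best])
```

In the actual method, each fitness evaluation would invoke the finite element model, which is why GA population sizing matters far more there than in this toy.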
Procedia PDF Downloads 272
5975 A Qualitative Review and Meta-Analyses of Published Literature Exploring Rates and Reasons Behind the Choice of Elective Caesarean Section in Pregnant Women With No Contraindication to Trial of Labor After One Previous Caesarean Section
Authors: Risheka Suthantirakumar, Eilish Pearson, Jacqueline Woodman
Abstract:
Background: Previous research has found a variety of rates of, and reasons for, choosing medically unindicated elective repeat cesarean section (ERCS). Understanding the frequency of and reasoning behind ERCS, especially when unwarranted, could help healthcare professionals better tailor their advice and services. Therefore, our study conducted meta-analyses and qualitative analyses to identify the reasons and rates worldwide for choosing this procedure over trial of labor after cesarean (TOLAC), also referred to in the published literature as vaginal birth after cesarean (VBAC). Methods: We conducted a systematic review of published literature available on PubMed, EMBASE, and science.gov and conducted a blinded peer review process to assess eligibility. Search terms were created in collaboration with experts in the field. Inclusion and exclusion criteria were established prior to reviewing the articles. Included studies were limited to those published in English due to author constraints, although no international boundaries were used in the search. No time limit for the search was used, in order to portray changes over time. Results: Our qualitative analyses found five consistent themes across international studies: socioeconomic and cultural differences, previous cesarean experience, perceptions of risk with vaginal birth, patients’ perceptions of future benefits, and medical advice and information. Our meta-analyses found variable rates of ERCS across international borders and within national populations. The average rate across all studies was 44% (95% CI 36-51). Discussion: The studies included in our qualitative analysis demonstrated similar recurring themes, which lends validity to the findings across the included studies.
We consider the rate variation across and within national populations to be partly a result of differing inclusion criteria and eligibility assessments between studies, and we argue that a proforma be utilized for future research to be comparable.
Keywords: elective cesarean section, VBAC, TOLAC, maternal choice
Procedia PDF Downloads 111
5974 Enzymatic Repair Prior To DNA Barcoding, Aspirations, and Restraints
Authors: Maxime Merheb, Rachel Matar
Abstract:
The retrieval of ancient DNA sequences, which in turn permits entire-genome sequencing from fossils, has improved extraordinarily in recent years, thanks to sequencing technology and other methodological advances. Nevertheless, the quest for ancient DNA is still obstructed by the damage inflicted on DNA, which accumulates after the death of a living organism. We can characterize this damage into three main categories: (i) physical abnormalities such as strand breaks, which lead to the presence of short DNA fragments; (ii) modified bases (mainly cytosine deamination), which cause errors in the sequence due to the incorporation of a false nucleotide during DNA amplification; and (iii) DNA modifications referred to as blocking lesions, which halt PCR extension and in turn also affect the amplification and sequencing process. The issues arising from breakage and coding errors have been significantly reduced in recent years: fast sequencing of short DNA fragments was empowered by platforms for high-throughput sequencing, and most of the coding errors were uncovered to be the consequence of cytosine deamination, which can easily be removed from the DNA using enzymatic treatment. The methodology to repair DNA sequences is still in development; it can basically be explained as the process of reintroducing cytosine rather than uracil. This technique is thus restricted to amplified DNA molecules. Eliminating every type of damage (particularly lesions that block PCR) is a process still awaiting complete repair methodologies; DNA detection right after extraction is therefore highly needed. Before investing resources in extensive, unreasonable, and uncertain repair techniques, it is vital to distinguish between two possible hypotheses: (i) the DNA is nonexistent and cannot be amplified to begin with, and is therefore completely unrepairable; or (ii) the DNA is refractory to PCR and is worth being repaired and amplified.
Hence, it is extremely important to develop a non-enzymatic technique to detect the most degraded DNA.
Keywords: ancient DNA, DNA barcoding, enzymatic repair, PCR
Procedia PDF Downloads 400