Search results for: iterative methods
14818 Determination of Biological Efficiency Values of Some Pesticide Application Methods under Second Crop Maize Conditions
Authors: Ali Bolat, Ali Bayat, Mustafa Gullu
Abstract:
Maize can be cultivated under both main and second crop conditions in Turkey. The main pests of maize under second crop conditions are Sesamia nonagrioides Lefebvre (Lepidoptera: Noctuidae) and Ostrinia nubilalis Hübner (Lepidoptera: Crambidae). Aerial spraying to control these two main maize pests was carried out in Turkey until 2006, when it was banned due to environmental concerns such as drift of sprayed pesticides and low biological efficiency. In this context, ground sprayers capable of treating tall maize plants (> 175 cm) have come into use. However, the biological efficiency of these sprayers is unknown. Several methods to increase the success of ground spraying were tested in field experiments conducted in second crop maize in 2008 and 2009. For this aim, six spraying methods (air-assisted spraying with TX cone jet nozzles, domestic cone nozzles, twinjet nozzles, air induction nozzles, standard domestic cone nozzles, and tail booms) were used at two application volumes (150 and 300 l/ha). The biological efficacy of each method was measured in each plot. Biological efficacy evaluations included counts of insect-damaged plants, holes in stems, and live larvae and pupae in the stems of selected plants. As a result, the highest biological efficacy value (close to 70%) was obtained with the air-assisted spraying method at the 300 l/ha application volume.
Keywords: air assisted sprayer, drift nozzles, biological efficiency, maize plant
Procedia PDF Downloads 213
14817 Comparison between XGBoost, LightGBM and CatBoost Using a Home Credit Dataset
Authors: Essam Al Daoud
Abstract:
Gradient boosting has proven to be a very effective machine learning strategy, and many successful solutions have been developed using XGBoost and its derivatives. The aim of this study is to investigate and compare the efficiency of three gradient boosting methods. The Home Credit dataset, which contains 219 features and 356,251 records, is used in this work. New features are generated, and several techniques are applied to rank and select the best features. The implementation indicates that LightGBM is faster and more accurate than CatBoost and XGBoost across varying numbers of features and records.
Keywords: gradient boosting, XGBoost, LightGBM, CatBoost, home credit
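The three libraries compared here share one core mechanism: each boosting round fits a small tree to the residuals of the ensemble built so far and adds it with a shrinkage factor. A minimal pure-Python sketch of that mechanism (not the study's implementation, and far simpler than any of the three libraries) might look like:

```python
# Minimal gradient boosting for squared error with depth-1 trees (stumps),
# in pure Python. Sketch of the core idea only: XGBoost, LightGBM, and
# CatBoost add regularised second-order objectives, histogram binning,
# leaf-wise growth, categorical handling, and much more on top.

def fit_stump(x, residuals):
    """Pick the single threshold split that minimises squared error."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    return best[1:]  # (threshold, left_value, right_value)

def boost(x, y, n_rounds=50, lr=0.1):
    """Each round fits a stump to the current residuals, shrunk by lr."""
    base = sum(y) / len(y)
    pred = [base] * len(y)
    stumps = []
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        t, lv, rv = fit_stump(x, residuals)
        stumps.append((t, lv, rv))
        pred = [p + lr * (lv if xi <= t else rv) for xi, p in zip(x, pred)]
    return base, lr, stumps

def predict(model, xi):
    base, lr, stumps = model
    return base + sum(lr * (lv if xi <= t else rv) for t, lv, rv in stumps)
```

The libraries differ mainly in how they grow and regularise these trees, which is what produces the speed and accuracy gaps the study measures.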
Procedia PDF Downloads 171
14816 Software Development for AASHTO and Ethiopian Roads Authority Flexible Pavement Design Methods
Authors: Amare Setegn Enyew, Bikila Teklu Wodajo
Abstract:
The primary aim of flexible pavement design is to ensure the development of economical and safe road infrastructure. However, failures can still occur due to improper or erroneous structural design. In Ethiopia, the design of flexible pavements relies on manual calculations and on selecting a pavement structure from a catalogue. The catalogue offers, in eight different charts, alternative structures for combinations of traffic and subgrade classes, as outlined in the Ethiopian Roads Authority (ERA) Pavement Design Manual 2001. Furthermore, design modification is allowed in accordance with the structural number principles outlined in the AASHTO 1993 Guide for Design of Pavement Structures. Nevertheless, the manual calculation and design process involves the use of nomographs, charts, tables, and formulas, which increases the likelihood of human error and inaccuracy, and this may lead to unsafe or uneconomical road construction. To address this challenge, a software tool called AASHERA has been developed for the AASHTO 1993 and ERA design methods, using the MATLAB language. The software accurately determines the required thicknesses of the flexible pavement surface, base, and subbase layers for the two methods. It also digitizes design inputs and references such as nomographs, charts, default values, and tables. Moreover, the software allows easier comparison of the two design methods in terms of results and construction cost. AASHERA's accuracy has been confirmed through comparisons with designs from handbooks and manuals. The software can reduce human error, inaccuracy, and time consumption compared to the conventional manual design methods employed in Ethiopia. AASHERA, with its validated accuracy, proves to be an indispensable tool for flexible pavement structure designers.
Keywords: flexible pavement design, AASHTO 1993, ERA, MATLAB, AASHERA
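The structural number at the heart of the AASHTO 1993 method is a simple weighted sum over the pavement layers, SN = a1·D1 + a2·D2·m2 + a3·D3·m3. A sketch of that calculation follows; the layer and drainage coefficients below are illustrative placeholders, not the design values a tool like AASHERA would look up.

```python
# AASHTO 1993 structural number for a flexible pavement:
#   SN = a1*D1 + a2*D2*m2 + a3*D3*m3
# a: layer coefficient, D: layer thickness (inches), m: drainage coefficient.
# Coefficient values here are illustrative only.

def structural_number(layers):
    """layers: list of (a, D_inches, m) tuples, surface layer first."""
    return sum(a * d * m for a, d, m in layers)

sn = structural_number([
    (0.44, 4.0, 1.0),   # asphalt concrete surface (m = 1.0 by convention)
    (0.14, 8.0, 1.0),   # granular base
    (0.11, 10.0, 1.0),  # granular subbase
])
```

The real design task is the inverse problem: choosing thicknesses D so that SN meets the value required for the traffic and subgrade class, which is where the charts and nomographs the abstract mentions come in.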
Procedia PDF Downloads 63
14815 Virtual Customer Integration in Innovation Development: A Systematic Literature Review
Authors: Chau Nguyen Pham Minh
Abstract:
The aim of this study is to answer the following research question: what do we know about virtual customer integration in innovation development based on existing empirical research? The paper is based on a systematic review of 136 articles published in the past 16 years. The analysis focuses on three areas: what forms of virtual customer integration (e.g. netnography, online co-creation, virtual experience) have been applied in innovation development; how virtual customer integration methods have been effectively utilized by firms; and what the influences of virtual customer integration on innovation development activities are. Through this detailed analysis, the study provides researchers with a broad understanding of virtual customer integration in innovation development. The study shows that practitioners and researchers pay increasing attention to virtual customer integration methods in developing innovations, since those methods have clear advantages for interacting with customers to generate the best ideas for innovation development. Additionally, the findings indicate that netnography has been the most common method for integrating customers in idea generation, while virtual product experience has mainly been used in product testing. Moreover, the analysis also reveals the positive and negative influences of virtual customer integration on innovation development from both process and strategic perspectives. Most of the reviewed studies examined the phenomenon from the company's perspective to understand the process of applying virtual customer integration methods and their impacts; however, the customers' perspective on participating in the virtual interaction has been inadequately studied, which creates many potentially interesting research paths for future studies.
Keywords: innovation, virtual customer integration, co-creation, netnography, new product development
Procedia PDF Downloads 336
14814 Interior Design: Changing Values
Authors: Kika Ioannou Kazamia
Abstract:
This paper examines the action research cycle of the second phase of longitudinal research on sustainable interior design practices, between two groups of stakeholders: designers and clients. During this phase of the action research, the second step of Lewin's change management model, the change stage, was utilized to change values, approaches, and attitudes toward sustainable design practices among the participants. Affective domain learning theory is utilized to attach new values. Learning with the use of information technology, collaborative learning, and problem-based learning are the learning methods implemented toward the acquisition of the objectives. These learning methods and aims require the design of interventions that involve participants in activities leading to acknowledgment of the benefits of sustainable practices. Interventions are steered to measure participants' judgments of the worth and relevance of ideas and experiences, and their acceptance of or commitment to a particular stance or action. The data collection methods used in this action research are observers' reports, participants' questionnaires, and interviews. The data analyses use both quantitative and qualitative methods. The main benefit of the quantitative method was to provide the means to separate the many factors that obscured the main qualitative findings. The qualitative method allowed data to be categorized following a deductive approach and then examined for commonalities that could reflect relevant categories or themes. The results from the data indicate that during the second phase, the designer and client participants altered their behaviours.
Keywords: design, change, sustainability, learning, practices
Procedia PDF Downloads 77
14813 Cultural Embeddedness of E-Participation Methods in Hungary
Authors: Hajnalka Szarvas
Abstract:
The research examines the effectiveness of e-participation tools and methods from the point of view of their cultural fit with Hungarian community traditions. Participation can have very different meanings depending on the local cultural and historical traditions and experiences of a given society. Most research on e-democracy and e-participation tools deals with their technological sides and novelties, but little is said about the cultural and social context of the different platforms. From the perspective of their success, however, it is essential to look at the human factor too: the actual users, and how a given DMS or online platform fits the way of thought and way of functioning of the society in question. The paper therefore explores to what extent different online platforms, such as Loomio, Democracy OS, Your Priorities, EVoks, Populus, miutcank.hu, Liquid Democracy, and Brain Bar Budapest Lab, are compatible with Hungarian mental structures and community traditions, and with the contents of the collective mind about community functioning. As a result, the influence of the cultural embeddedness of e-participation development tools on the success of these methods will be clearly seen. Furthermore, the factors that most determine the efficiency of e-participation development tools in Hungary will be demonstrated.
Keywords: cultural embeddedness, e-participation, local community traditions, mental structures
Procedia PDF Downloads 303
14812 Improving 99mTc-tetrofosmin Myocardial Perfusion Images by Time Subtraction Technique
Authors: Yasuyuki Takahashi, Hayato Ishimura, Masao Miyagawa, Teruhito Mochizuki
Abstract:
Quantitative measurement of myocardial perfusion is possible with single photon emission computed tomography (SPECT) using a semiconductor detector. However, accumulation of 99mTc-tetrofosmin in the liver may make accurate assessment of the inferior myocardium difficult. Our idea is to reduce the high accumulation in the liver by using dynamic SPECT imaging and a technique called time subtraction. We evaluated the performance of a new SPECT system with a cadmium-zinc-telluride solid-state semiconductor detector (Discovery NM 530c; GE Healthcare). Our system acquired list-mode raw data over 10 minutes for a typical patient. From the data, ten SPECT images were reconstructed, one for every minute of acquired data. Reconstruction with the semiconductor detector was based on an implementation of a 3-D iterative Bayesian reconstruction algorithm. We studied 20 patients with coronary artery disease (mean age 75.4 ± 12.1 years; range 42-86; 16 males and 4 females). In each subject, 259 MBq of 99mTc-tetrofosmin was injected intravenously. We performed both a phantom and a clinical study using dynamic SPECT. An approximation to a liver-only image is obtained by reconstructing an image from the early projections, during which the liver accumulation dominates (0.5~2.5 minutes SPECT image - 5~10 minutes SPECT image). The extracted liver-only image is then subtracted from a later SPECT image that shows both the liver and the myocardial uptake (5~10 minutes SPECT image - liver-only image). The time subtraction of the liver was possible in both the phantom and the clinical study, and visualization of the inferior myocardium was improved. In past reports, myocardial uptake overlapped by high accumulation in the liver was not diagnosable. Using our time subtraction method, the image quality of the 99mTc-tetrofosmin myocardial SPECT image is considerably improved.
Keywords: 99mTc-tetrofosmin, dynamic SPECT, time subtraction, semiconductor detector
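The core of the time-subtraction idea is a voxel-wise subtraction: an early-frame estimate of the liver signal is removed from a later frame containing both organs. A toy sketch of just that arithmetic step (1-D lists standing in for reconstructed SPECT volumes; the scaling and registration details of the actual method are omitted) might look like:

```python
# Sketch of the time-subtraction step: subtract the liver-only estimate,
# built from early frames where liver uptake dominates, from a later frame
# containing both liver and myocardial uptake. Toy 1-D "images" only.

def subtract(late_frame, liver_only):
    # Clamp at zero: reconstructed counts cannot be negative.
    return [max(v - l, 0.0) for v, l in zip(late_frame, liver_only)]

liver_only = [0.0, 0.0, 8.0, 9.0]   # early-frame estimate: liver region only
late_frame = [5.0, 6.0, 8.5, 9.5]   # myocardium (left) + liver (right)
myocardium = subtract(late_frame, liver_only)
# liver signal largely removed; myocardial region left unchanged
```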
Procedia PDF Downloads 335
14811 Protein Remote Homology Detection by Using Profile-Based Matrix Transformation Approaches
Authors: Bin Liu
Abstract:
As one of the most important tasks in protein sequence analysis, protein remote homology detection has been studied for decades. Currently, profile-based methods show state-of-the-art performance, and the Position-Specific Frequency Matrix (PSFM) is a widely used profile. However, the profiles contain noise introduced by amino acids with low frequencies. In this study, we propose a method, called the Top Frequency Profile (TFP), to remove this noise from the PSFM by removing the amino acids with low frequencies. Three matrix transformation methods, Autocross Covariance (ACC) transformation, Tri-gram, and K-Separated Bigram (KSB), are performed on these profiles to convert them into fixed-length feature vectors. Combined with Support Vector Machines (SVMs), the predictors are constructed. Evaluated on two benchmark datasets, the proposed methods outperform other state-of-the-art predictors.
Keywords: protein remote homology detection, protein fold recognition, top frequency profile, support vector machines
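The matrix transformations named above all turn a variable-length profile into a fixed-length vector an SVM can consume. A sketch of the K-separated bigram (KSB) idea on a toy profile follows; the 3-letter alphabet stands in for the 20 amino acids, and this is only an illustration of the transformation, not the paper's implementation.

```python
# Sketch of the K-separated bigram (KSB) transformation on a toy profile.
# A PSFM row holds per-position amino-acid frequencies; KSB correlates
# amino acid a at position i with amino acid b at position i+k, giving a
# fixed 20x20 (here 3x3) feature matrix regardless of sequence length.

def ksb_features(psfm, k):
    n_cols = len(psfm[0])
    feats = [[0.0] * n_cols for _ in range(n_cols)]
    for i in range(len(psfm) - k):
        for a in range(n_cols):
            for b in range(n_cols):
                feats[a][b] += psfm[i][a] * psfm[i + k][b]
    # flatten into a fixed-length vector for an SVM
    return [f for row in feats for f in row]

toy_psfm = [  # 4 positions x 3 "amino acids"
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [1.0, 0.0, 0.0],
]
vec = ksb_features(toy_psfm, k=1)  # k=1 reduces to the ordinary bigram
```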
Procedia PDF Downloads 125
14810 Sensitivity Analysis of Prestressed Post-Tensioned I-Girder and Deck System
Authors: Tahsin A. H. Nishat, Raquib Ahsan
Abstract:
Sensitivity analysis of the design parameters of an optimization procedure can become a significant factor while designing any structural system. The objectives of the study are to analyze the sensitivity of the deck slab thickness parameter obtained from both the conventional and the optimum design methodology of a pre-stressed post-tensioned I-girder and deck system, and to compare the relative significance of slab thickness. For the analysis of the conventional method, the values of 14 design parameters obtained by the conventional iterative design method for a real-life I-girder bridge project have been considered. For the analysis of the optimization method, cost optimization of this system has been done using the global optimization methodology 'Evolutionary Operation (EVOP)'. The problem, from which the optimum values of the 14 design parameters have been obtained, contains 14 explicit constraints and 46 implicit constraints. For both types of design parameters, sensitivity analysis has been conducted on the deck slab thickness parameter, which can become too sensitive for the obtained optimum solution. Deviations of slab thickness on both the upper and lower sides of its optimum value have been considered, reflecting realistic possible ranges of variation during construction. In this procedure, the remaining parameters have been kept unchanged. For small deviations from the optimum value, compliance with the explicit and implicit constraints has been examined, and variations in cost have also been estimated. It was found that, without violating any constraint, the deck slab thickness obtained by the conventional method can be increased by up to 25 mm, whereas the slab thickness obtained by cost optimization can be increased by only up to 0.3 mm. This result suggests that slab thickness is less sensitive in the conventional method of design. Therefore, for realistic design purposes, sensitivity analysis should be conducted for either design procedure of the girder and deck system.
Keywords: sensitivity analysis, optimum design, evolutionary operations, PC I-girder, deck system
Procedia PDF Downloads 137
14809 Learning Fashion Construction and Manufacturing Methods from the Past: Cultural History and Genealogy at the Middle Tennessee State University Historic Clothing Collection
Authors: Teresa B. King
Abstract:
In the millennial age, more students desire a fashion major, yet fewer have sewing and manufacturing knowledge, which increases the demand on academicians to educate adequately. While fashion museums have a prominent place in historical preservation, working collections of handmade or mass-manufactured apparel for apparel education are lacking in most universities in the United States, especially in the Southern region. Created in 1988, Middle Tennessee State University's historic clothing collection provides opportunities to study apparel construction methods throughout history, to compare and apply them to today's construction and manufacturing methods, and to learn the cyclical nature and importance of historic styles in current and upcoming fashion. In 2019, a class exercise experiment was implemented in which students researched their family genealogy using Ancestry.com, identified the oldest visual media (photographs, etc.) available, and analyzed the garment represented in that media. Each student then located a comparable garment in the historic collection and evaluated the construction methods of the ancestor's time period. A class 'fashion genealogy' tree was created and mounted for public viewing and education. Results of this exercise indicated that student learning increased due to the personal and familial connection, as it triggered more interest in historical garments related to the student's own culture. Students better identified garments by historical time period, fiber content, fabric, and construction methods, thus increasing learning and retention. Students also developed increased recognition of custom construction methods versus current mass manufacturing techniques, which impact today's fashion industry. A longitudinal effort will continue as the historic collection grows and students continue to utilize it.
Keywords: ancestry, clothing history, fashion history, genealogy, historic fashion museum collection
Procedia PDF Downloads 136
14808 Quantifying Product Impacts on Biodiversity: The Product Biodiversity Footprint
Authors: Leveque Benjamin, Rabaud Suzanne, Anest Hugo, Catalan Caroline, Neveux Guillaume
Abstract:
Human product consumption is one of the main drivers of biodiversity loss. However, few pertinent ecological indicators of a product's life cycle impact on species and ecosystems have been built. Life cycle assessment (LCA) methodologies are well under way to conceive standardized methods to assess this impact, already taking partially into account three of the Millennium Ecosystem Assessment pressures (land use, pollution, climate change). Coupling LCA with ecological data and methods is an emerging challenge in developing a product biodiversity footprint. This approach was tested in three case studies from the food processing, textile, and cosmetic industries. It allowed, first, improving the environmental relevance of the Potentially Disappeared Fraction of species, the end-point indicator typically used in life cycle analysis methods, and second, introducing new indicators on overexploitation and invasive species. This type of footprint is a major step in helping companies identify their impacts on biodiversity and propose potential improvements.
Keywords: biodiversity, companies, footprint, life cycle assessment, products
Procedia PDF Downloads 327
14807 Risk Measure from Investment in Finance by Value at Risk
Authors: Mohammed El-Arbi Khalfallah, Mohamed Lakhdar Hadji
Abstract:
Managing and controlling risk is a research topic in the world of finance. Faced with a risky situation, stakeholders need to make comparisons across positions and actions, and financial institutions must take measures of particular market and credit risks. In this work, we study a model of risk measure in finance: Value at Risk (VaR), a tool for measuring an entity's exposure to risk. We explain the concept of value at risk and its average and tail variants, and describe the three methods for computing it: the parametric method, the historical method, and the Monte Carlo numerical method. Finally, we briefly describe the advantages and disadvantages of the three methods for computing value at risk.
Keywords: average value at risk, conditional value at risk, tail value at risk, value at risk
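The three computation methods the abstract names can be sketched in a few lines each. The following is an illustrative one-period sketch on made-up daily returns, not a production risk engine: the historical method reads the empirical quantile, the parametric (variance-covariance) method assumes normality, and the Monte Carlo method simulates from a fitted distribution and takes the empirical quantile of the simulations.

```python
# Sketch of the three VaR estimators on toy daily returns (illustrative only).
import random
import statistics

returns = [-0.021, 0.004, -0.013, 0.008, -0.005, 0.011, -0.030, 0.002,
           -0.009, 0.006, -0.016, 0.001, -0.002, 0.014, -0.024, 0.003]

def historical_var(rets, alpha=0.95):
    """Loss at the empirical (1 - alpha) quantile of observed returns."""
    ordered = sorted(rets)
    return -ordered[int((1 - alpha) * len(ordered))]

def parametric_var(rets, z=1.645):
    """Variance-covariance method under a normality assumption (z for 95%)."""
    mu = statistics.mean(rets)
    sigma = statistics.pstdev(rets)
    return -(mu - z * sigma)

def monte_carlo_var(rets, alpha=0.95, n=100_000, seed=1):
    """Simulate returns from a fitted normal, then read the empirical quantile."""
    rng = random.Random(seed)
    mu = statistics.mean(rets)
    sigma = statistics.pstdev(rets)
    sims = sorted(rng.gauss(mu, sigma) for _ in range(n))
    return -sims[int((1 - alpha) * n)]
```

On these data the Monte Carlo estimate converges to the parametric one, since both assume the same normal model; the historical estimate differs because it uses only the observed sample.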
Procedia PDF Downloads 441
14806 Meet Automotive Software Safety and Security Standards Expectations More Quickly
Authors: Jean-François Pouilly
Abstract:
This study addresses the growing complexity of embedded systems and the critical need for secure, reliable software. Traditional cybersecurity testing methods, often conducted late in the development cycle, struggle to keep pace. This talk explores how formal methods, integrated with advanced analysis tools, empower C/C++ developers to: 1) proactively address vulnerabilities and bugs, using formal methods and abstract interpretation techniques to identify potential weaknesses early in the development process, reducing the reliance on penetration and fuzz testing in later stages; 2) streamline development by focusing on the bugs that matter, with close to no false positives; by catching flaws earlier, the need for rework and retesting is minimized, leading to faster development cycles, improved efficiency, and cost savings; 3) enhance software dependability, since combining static analysis using abstract interpretation, with full context sensitivity, with hardware memory awareness allows a more comprehensive understanding of potential vulnerabilities, leading to more dependable and secure software. This approach aligns with industry best practices (ISO 26262 or ISO 21434) and empowers C/C++ developers to deliver robust, secure embedded systems that meet the demands of today's and tomorrow's applications. We will illustrate this approach with the TrustInSoft analyzer to show how it accelerates verification for complex cases, reduces user fatigue, and improves developer efficiency, cost-effectiveness, and software cybersecurity. In summary, integrating formal methods and sound analyzers enhances software reliability and cybersecurity, streamlining development in an increasingly complex environment.
Keywords: safety, cybersecurity, ISO 26262, ISO 21434, formal methods
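To make the abstract interpretation idea concrete: the analyzer propagates abstract values (for example, intervals of possible integers) through the program and proves properties about every execution at once. The toy below is a generic interval-domain sketch, far simpler than the analyzer described above, showing how an array access can be classified as safe or possibly out of bounds before any concrete run.

```python
# Toy abstract interpretation over the interval domain. We propagate
# [lo, hi] bounds through a tiny computation and check an array access
# without executing the program on any concrete input.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        corners = [a * b for a in (self.lo, self.hi) for b in (other.lo, other.hi)]
        return Interval(min(corners), max(corners))

def check_index(idx, length):
    """Prove the access safe, prove it wrong, or report it as possible."""
    if 0 <= idx.lo and idx.hi < length:
        return "safe"
    if idx.hi < 0 or idx.lo >= length:
        return "definite out-of-bounds"
    return "possible out-of-bounds"

i = Interval(0, 3)        # loop counter known to stay in [0, 3]
offset = Interval(1, 2)   # input-dependent offset in [1, 2]
verdict = check_index(i + offset, length=8)   # index range [1, 5], bound 8
```

A "possible out-of-bounds" verdict is exactly the kind of early warning that replaces a crash found much later by fuzzing.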
Procedia PDF Downloads 19
14805 A Pedagogical Case Study on Consumer Decision Making Models: A Selection of Smart Phone Apps
Authors: Yong Bum Shin
Abstract:
This case focuses on the Weighted Additive Difference, Conjunctive, Disjunctive, and Elimination by Aspects methodologies in consumer decision-making models, and on the Simple Additive Weighting (SAW) approach in the multi-criteria decision-making (MCDM) area. Most decision-making models illustrate that the rank reversal phenomenon is unpreventable. This paper shows that rank reversal occurs in popular managerial methods such as Weighted Additive Difference (WAD), the Conjunctive Method, the Disjunctive Method, and Elimination by Aspects (EBA), as well as in MCDM methods such as Simple Additive Weighting (SAW), and finally presents the Unified Commensurate Multiple (UCM) model, which successfully addresses these rank reversal problems in the most popular MCDM methods in the decision-making area.
Keywords: multiple criteria decision making, rank inconsistency, unified commensurate multiple, analytic hierarchy process
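The rank reversal at issue is easy to reproduce with SAW under max normalisation: removing an alternative rescales the criterion it dominated, which can flip the order of the remaining alternatives. The decision matrix below is a contrived toy example, not from the paper.

```python
# Simple Additive Weighting (SAW) with max normalisation on benefit criteria,
# plus a demonstration of rank reversal: deleting alternative A changes the
# relative order of the two survivors B and C.

def saw_scores(matrix, weights):
    """matrix: {name: [criterion values]}; higher is better on every criterion."""
    cols = list(zip(*matrix.values()))
    col_max = [max(c) for c in cols]
    return {name: sum(w * v / m for w, v, m in zip(weights, vals, col_max))
            for name, vals in matrix.items()}

def rank(scores):
    return sorted(scores, key=scores.get, reverse=True)

weights = [0.5, 0.5]
full = {"A": [10, 1], "B": [8, 4], "C": [3, 10]}
reduced = {k: v for k, v in full.items() if k != "A"}

# With A present: scores A=0.55, B=0.60, C=0.65, so C > B.
# With A removed, criterion 1 is renormalised by B's value:
# B=0.70, C=0.6875, so B > C -- the order of B and C reverses.
```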
Procedia PDF Downloads 81
14804 Using Technology to Deliver and Scale Early Childhood Development Services in Resource Constrained Environments: Case Studies from South Africa
Authors: Sonja Giese, Tess N. Peacock
Abstract:
South Africa-based Innovation Edge is experimenting with technology to drive positive behavior change, enable data-driven decision making, and scale quality early years services. This paper uses five case studies to illustrate how technology can be used in resource-constrained environments: first, to encourage parenting practices that build early language development (using a stage-based mobile messaging pilot, ChildConnect); second, to improve the quality of ECD programs (using a mobile application, CareUp); third, to affordably scale services for the early detection of visual and hearing impairments (using a mobile tool, HearX); fourth, to build a transparent and accountable system for the registration and funding of ECD (using a blockchain-enabled platform, Amply); and finally, to enable rapid data collection and feedback to facilitate quality enhancement of programs at scale (the Early Learning Outcomes Measure). ChildConnect and CareUp were both developed using a design-based iterative research approach. The usage and uptake of ChildConnect and CareUp were evaluated with qualitative and quantitative methods. Actual child outcomes were not measured in the initial pilots. Although parents who used and engaged on either platform felt more supported and informed, parent engagement and usage remain a challenge. This is in contrast to ECD practitioners, whose usage of CareUp showed both sustained engagement and knowledge improvement. HearX is an easy-to-use tool to identify hearing loss and visual impairment. The tool was tested with 10,000 children in an informal settlement, and the feasibility of cost-effectively decentralising screening services was demonstrated. Practical and financial barriers remain with respect to parental consent and successful referrals. Amply uses mobile and blockchain technology to increase the impact and accountability of public services. In the pilot project, Amply is being used to replace an existing paper-based system to register children for a government-funded pre-school subsidy in South Africa. The Early Learning Outcomes Measure (ELOM) defines what it means for a child to be developmentally 'on track' at age 50-69 months. ELOM administration is enabled via a tablet, which allows easy and accurate data collection, transfer, analysis, and feedback. ELOM is being used extensively to drive quality enhancement of ECD programs across multiple modalities. The nature of ECD services in South Africa is that they are in large part provided by disconnected private individuals or non-governmental organizations (in contrast to basic education, which is publicly provided by the government). It is a disparate sector, which means that scaling successful interventions is that much harder. All five interventions show the potential of technology to support and enhance a range of ECD services, but pathways to scale are still being tested.
Keywords: assessment, behavior change, communication, data, disabilities, mobile, scale, technology, quality
Procedia PDF Downloads 133
14803 Developing Learning in Organizations with Innovation Pedagogy Methods
Authors: T. Konst
Abstract:
Most jobs include training and communication tasks, but the people in these jobs often lack the pedagogical competences to plan, implement, and assess learning. This paper discusses how a learning approach called innovation pedagogy, developed in higher education, can be utilized for learning development in various organizations. The methods presented for implementing innovation pedagogy, such as process consultation and the train-the-trainer model, can provide added value in developing pedagogical know-how in organizations and thus support their internal learning and development.
Keywords: innovation pedagogy, learning, organizational development, process consultation
Procedia PDF Downloads 367
14802 Arabic Handwriting Recognition Using Local Approach
Authors: Mohammed Arif, Abdessalam Kifouche
Abstract:
Optical character recognition (OCR) plays a major role today. It can solve many serious problems and simplify human activities. OCR dates back to the 1970s, and many solutions have been proposed since, but unfortunately most supported only Latin scripts. This work proposes a system for recognizing off-line Arabic handwriting. The system is based on a structural segmentation method and uses support vector machines (SVM) in the classification phase. We present a state of the art of character segmentation methods, followed by an overview of the OCR area, and we also address the normalization problems we encountered. After a comparison between Arabic handwritten characters and the segmentation methods, we introduce a contribution through a segmentation algorithm.
Keywords: OCR, segmentation, Arabic characters, PAW, post-processing, SVM
Procedia PDF Downloads 71
14801 Research of Data Cleaning Methods Based on Dependency Rules
Authors: Yang Bao, Shi Wei Deng, WangQun Lin
Abstract:
This paper introduces the concept and principles of data cleaning, analyzes the types and causes of dirty data, proposes several key steps of a typical cleaning process, and puts forward a data cleaning framework with good scalability and versatility. For data with attribute dependency relations, it designs several violation-data discovery algorithms expressed as formal formulas, which can find data inconsistent with the target columns under a conditional attribute dependency regardless of whether the data is structured (SQL) or unstructured (NoSQL), and gives six data cleaning methods based on these algorithms.
Keywords: data cleaning, dependency rules, violation data discovery, data repair
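The simplest instance of such a violation-discovery algorithm checks a functional dependency X → Y: group rows by their X-values and flag every group that holds more than one distinct Y-value. A generic sketch (not the paper's algorithms) follows; because it works on plain dicts, the same check applies to rows pulled from a SQL table or a NoSQL document store alike.

```python
# Sketch of violation discovery for a functional dependency X -> Y.
from collections import defaultdict

def fd_violations(rows, lhs, rhs):
    """Return the X-value groups whose rows disagree on the Y attribute."""
    groups = defaultdict(set)
    for row in rows:
        key = tuple(row[a] for a in lhs)
        groups[key].add(row[rhs])
    return {key: vals for key, vals in groups.items() if len(vals) > 1}

rows = [
    {"zip": "10001", "city": "New York"},
    {"zip": "10001", "city": "NYC"},      # violates zip -> city
    {"zip": "60601", "city": "Chicago"},
]
bad = fd_violations(rows, lhs=["zip"], rhs="city")
```

The cleaning (repair) step then chooses which conflicting value to keep, e.g. by majority vote or an external master table.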
Procedia PDF Downloads 564
14800 Auditing of Building Information Modeling Application in Decoration Engineering Projects in China
Authors: Lan Luo
Abstract:
In China’s construction industry, it is normal practice to subcontract the decoration engineering part separately from construction engineering, and Building Information Modeling (BIM) is also done separately. The application of BIM in decoration engineering should be integrated with other disciplines, but current Chinese practice makes this very difficult and complicated. Currently, there are three barriers to the auditing of BIM application in decoration engineering in China: a heavy workload; a scarcity of qualified professionals; and a lack of literature concerning audit contents, standards, and methods. It is therefore significant to research what (contents) should be evaluated, in which phase, and by whom (professional qualifications) in BIM application in decoration construction, so that the application of BIM can be better promoted. Based on this consideration, four principles of BIM auditing are proposed: comprehensiveness of information, accuracy of data, aesthetic attractiveness of appearance, and scheme optimization. In the model audit, three methods should be used: collision, observation, and contrast. In addition, BIM auditing at six stages is discussed, and a checklist of work items and results to be submitted is proposed. This checklist can be used for reference by decoration project participants.
Keywords: audit, evaluation, dimensions, methods, standards, BIM application in decoration engineering projects
Procedia PDF Downloads 343
14799 Effect of Brewing on the Bioactive Compounds of Coffee
Authors: Ceyda Dadali, Yeşim Elmaci
Abstract:
Coffee was introduced as an economic crop during the fifteenth century; nowadays it is among the most important food commodities, often ranked second after crude oil. Desirable sensory properties make coffee one of the most frequently consumed and most popular beverages in the world. The coffee preparation method has a significant effect on the flavor and composition of coffee brews. Three different extraction methodologies, namely decoction, infusion, and pressure methods, have been used for coffee brew preparation. Each of these methods is associated with a specific granulation (grind) of the coffee powder, water-to-coffee ratio, temperature, and brewing time. Coffee is a mixture of 1500 chemical compounds, and its chemical composition depends strongly on the brewing method, the coffee bean species, and the roasting time and temperature. Coffee contains a number of very important bioactive compounds, such as the diterpenes cafestol and kahweol; the alkaloids caffeine, theobromine, and trigonelline; melanoidins; and phenolic compounds. The phenolic compounds of coffee include chlorogenic acids (quinic acid esters of hydroxycinnamic acids) and caffeic, ferulic, and p-coumaric acids. In coffee, caffeoylquinic acids, feruloylquinic acids, and dicaffeoylquinic acids are the three main groups of chlorogenic acids and constitute 6%-10% of the dry weight of coffee. The bioavailability of the chlorogenic acids in coffee depends on their absorption and metabolization to biomarkers in individuals. The interaction of coffee polyphenols with other compounds, such as dietary proteins, also affects the biomarkers. Since the bioactive composition of coffee depends on the brewing method, the effect of the brewing method on the bioactive compounds of coffee will be discussed in this study.
Keywords: bioactive compounds of coffee, biomarkers, coffee brew, effect of brewing
Procedia PDF Downloads 196
14798 Bayesian Parameter Inference for Continuous Time Markov Chains with Intractable Likelihood
Authors: Randa Alharbi, Vladislav Vyshemirsky
Abstract:
Systems biology is an important field of science which focuses on studying the behaviour of biological systems. Modelling is required to produce a detailed description of the elements of a biological system, their functions, and their interactions. A well-designed model requires selecting a suitable mechanism which can capture the main features of the system, defining its essential components, and representing an appropriate law that can describe the interactions between these components. Complex biological systems exhibit stochastic behaviour; thus, probabilistic models are suitable for describing and analysing them. The Continuous-Time Markov Chain (CTMC) is one such probabilistic model, which describes the system as a set of discrete states with continuous-time transitions between them. The system is then characterised by a set of probability distributions that describe the transition from one state to another at a given time. The evolution of these probabilities through time is governed by the chemical master equation, which is analytically intractable but can be simulated. Uncertain parameters of such a model can be inferred using methods of Bayesian inference. Yet, inference in such a complex system is challenging, as it requires evaluating the likelihood, which is intractable in most cases. There are different statistical methods that allow simulating from the model despite the intractability of the likelihood. Approximate Bayesian computation (ABC) is a common approach that tackles inference by relying on simulation of the model to approximate the intractable likelihood. Particle Markov chain Monte Carlo (PMCMC) is another approach, based on using sequential Monte Carlo to estimate the intractable likelihood. However, both methods are computationally expensive. In this paper we discuss the efficiency and possible practical issues of each method, taking their computational time into account.
We demonstrate likelihood-free inference by analysing a model of the Repressilator with both methods. A detailed investigation is performed to quantify the difference between these methods in terms of efficiency and computational cost.
Keywords: approximate Bayesian computation (ABC), continuous-time Markov chains, sequential Monte Carlo, particle Markov chain Monte Carlo (PMCMC)
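The rejection-ABC scheme discussed above can be sketched in a few lines. The following is a minimal, illustrative example on a toy pure-death CTMC simulated with the Gillespie algorithm; the model, the uniform prior, and the tolerance `eps` are assumptions for illustration only, not the Repressilator setup used in the paper.

```python
import random
import statistics

def simulate_death_ctmc(rate, x0=50, t_end=2.0, seed=None):
    """Gillespie simulation of a pure-death CTMC (X -> X - 1, propensity rate * X).

    Returns the remaining population at time t_end."""
    rng = random.Random(seed)
    x, t = x0, 0.0
    while x > 0:
        t += rng.expovariate(rate * x)  # waiting time to the next death event
        if t > t_end:
            break
        x -= 1
    return x

def abc_rejection(observed, prior_draw, n_accept=500, eps=3, seed=0):
    """Rejection ABC: keep parameter draws whose simulated output lies within eps of the data."""
    rng = random.Random(seed)
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_draw(rng)
        sim = simulate_death_ctmc(theta, seed=rng.randrange(10**9))
        if abs(sim - observed) <= eps:
            accepted.append(theta)
    return accepted

# Synthetic "data" generated with a known rate of 0.5 per unit time.
observed = simulate_death_ctmc(0.5, seed=42)
posterior = abc_rejection(observed, prior_draw=lambda rng: rng.uniform(0.0, 2.0))
print(round(statistics.mean(posterior), 2))  # posterior mean concentrates near the true rate
```

A PMCMC implementation would instead embed a sequential Monte Carlo likelihood estimate inside a Metropolis-Hastings chain, at a considerably higher computational cost per iteration.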
Procedia PDF Downloads 202
14797 Finite Element Analysis for Earing Prediction Incorporating the BBC2003 Material Model with Fully Implicit Integration Method: Derivation and Numerical Algorithm
Authors: Sajjad Izadpanah, Seyed Hadi Ghaderi, Morteza Sayah Irani, Mahdi Gerdooei
Abstract:
In this research work, a sophisticated yield criterion known as BBC2003, capable of describing the planar anisotropic behaviour of aluminum alloy sheets, was integrated into the commercial finite element code ABAQUS/Standard via a user subroutine. The complete formulation of the implementation process using a fully implicit integration scheme, i.e., the classic backward Euler method, is presented, and relevant aspects of the yield criterion are introduced. In order to solve the nonlinear differential and algebraic equations, a line-search algorithm was adopted in the user-defined material subroutine (UMAT) to expand the convergence domain of the iterative Newton-Raphson method. The developed subroutine was used to simulate a challenging computational problem with complex stress states, i.e., deep drawing of the anisotropic aluminum alloy AA3105. The accuracy and stability of the developed subroutine were confirmed by comparing the numerically predicted earing and thickness variation profiles with the experimental results, which showed excellent agreement. The integration of the BBC2003 yield criterion into ABAQUS/Standard represents a significant contribution to the field of computational mechanics and provides a useful tool for analyzing the mechanical behavior of anisotropic materials subjected to complex loading conditions.
Keywords: BBC2003 yield function, plastic anisotropy, fully implicit integration scheme, line search algorithm, explicit and implicit integration schemes
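The line-search safeguard for the Newton-Raphson iteration mentioned above can be illustrated with a minimal scalar sketch. The actual UMAT solves a multi-dimensional backward-Euler return-mapping system in Fortran; the arctangent test function below is a standard illustrative case where plain Newton diverges but the line-search variant converges.

```python
import math

def newton_line_search(residual, jacobian, x0, tol=1e-10, max_iter=100):
    """Newton-Raphson for residual(x) = 0, safeguarded by a backtracking line search."""
    x = x0
    for _ in range(max_iter):
        r = residual(x)
        if abs(r) < tol:
            return x
        dx = -r / jacobian(x)  # full Newton step
        beta = 1.0
        # Backtrack: halve the step until the residual magnitude actually decreases.
        while abs(residual(x + beta * dx)) >= abs(r):
            beta *= 0.5
            if beta < 1e-12:
                raise RuntimeError("line search failed")
        x += beta * dx
    raise RuntimeError("Newton iteration did not converge")

# arctan(x) = 0: the plain Newton method diverges from x0 = 3,
# while the line-search variant converges to the root at 0.
root = newton_line_search(math.atan, lambda x: 1.0 / (1.0 + x * x), x0=3.0)
print(abs(root) < 1e-8)  # True
```

The same scaling of the Newton step by a factor beta, chosen so that a merit function decreases, is what enlarges the convergence domain of the stress-update iteration in the implicit scheme.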
Procedia PDF Downloads 75
14796 A Review on Application of Waste Tire in Concrete
Authors: M. A. Yazdi, J. Yang, L. Yihui, H. Su
Abstract:
The application of recycled waste tires in civil engineering practice, namely in asphalt paving mixtures and cement-based materials, has been gaining ground across the world. This review summarizes and compares recent achievements in the area of plain rubberized concrete (PRC) in detail. Different treatment methods to improve the performance of rubberized Portland cement concrete are discussed. The review also includes the effects of the size and amount of tire rubber on the mechanical and durability properties of PRC, and the microstructural behaviour of rubberized concrete is detailed.
Keywords: waste rubber aggregates, microstructure, treatment methods, size and content effects
Procedia PDF Downloads 332
14795 Optical Whitening of Textiles: Teaching and Learning Materials
Authors: C. W. Kan
Abstract:
This study examines the results of the optical whitening process on different textiles such as cotton, wool and polyester. The optical whitening agents used are commercially available products, and they were applied to the textiles using the manufacturers’ suggested methods. The aim of this study is to illustrate the proper application methods of optical whitening agents to different textiles and hence to provide a guidance note for students learning this topic. Acknowledgment: The authors would like to thank the Hong Kong Polytechnic University for its financial support of this work.
Keywords: learning materials, optical whitening agent, wool, cotton, polyester
Procedia PDF Downloads 425
14794 A Critical Reflection of Ableist Methodologies: Approaching Interviews and Go-Along Interviews
Authors: Hana Porkertová, Pavel Doboš
Abstract:
Based on a research project studying the experience of visually disabled people with urban space in the Czech Republic, this conference contribution discusses the limits of social-science methodologies used in sociology and human geography. It draws on actor-network theory, assuming that science does not merely describe reality but produces it. Methodology connects theory, research questions, the ways to answer them (methods), and results. A research design utilizing ableist methodologies can produce ableist realities. Therefore, it was necessary to adjust the methods so that they could mediate blind experience to the scientific community without reproducing ableism. The researchers faced multiple challenges, ranging from questionable validity to the problem of how to research an experience that differs from that of the able-bodied researchers. The first step was finding a suitable theory that could be used as an analytical tool to demonstrate space and blind experience as multiple, dynamic, and mutually constructed; such a theory could offer a range of potentially productive methods and research questions, as well as bring critically reflected results. Poststructural theory, mainly Deleuze-Guattarian philosophy, was chosen, and two methods were used: interviews and go-along interviews, both adjusted to explore blind experience. Despite thorough preparation of these methods, new difficulties kept emerging, which exposed the ableist character of scientific knowledge. From the beginning of data collection, there was an agreement to work in teams, with slightly different roles for each researcher, which was significant especially during the go-along interviews. In some cases, the anticipations of the researchers and participants differed, which led to unexpected and potentially dangerous situations. These were caused not only by the differences between scientific and lay communities but also by those between able-bodied and disabled people.
The researchers were sometimes assigned the role of assistants, and this new position, doing research together, required further negotiations, which also opened various ethical questions.
Keywords: ableist methodology, blind experience, go-along interviews, research ethics, scientific knowledge
Procedia PDF Downloads 165
14793 A Segmentation Method for Grayscale Images Based on the Firefly Algorithm and the Gaussian Mixture Model
Authors: Donatella Giuliani
Abstract:
In this research, we propose an unsupervised grayscale image segmentation method based on a combination of the Firefly Algorithm and the Gaussian Mixture Model. Firstly, the Firefly Algorithm is applied in a histogram-based search for cluster means. The Firefly Algorithm is a stochastic global optimization technique inspired by the flashing behaviour of fireflies. In this context, it is used to determine the number of clusters and the corresponding cluster means in a histogram-based segmentation approach. These means are subsequently used in the initialization step of the parameter estimation of a Gaussian Mixture Model. The parametric probability density function of a Gaussian Mixture Model is represented as a weighted sum of Gaussian component densities, whose parameters are evaluated by applying the iterative Expectation-Maximization technique. The coefficients of the linear superposition of Gaussians can be thought of as the prior probabilities of each component. Applying Bayes’ rule, the posterior probabilities of the grayscale intensities are evaluated, and their maxima are used to assign each pixel to a cluster according to its gray-level value. The proposed approach appears fairly solid and reliable even when applied to complex grayscale images. Validation has been performed using different standard measures, more precisely: the Root Mean Square Error (RMSE), the Structural Content (SC), the Normalized Correlation Coefficient (NK) and the Davies-Bouldin (DB) index. The achieved results strongly confirm the robustness of this grayscale segmentation method based on a metaheuristic algorithm. Another noteworthy advantage of this methodology is the use of the maxima of the responsibilities for pixel assignment, which implies a consistent reduction of the computational cost.
Keywords: clustering images, firefly algorithm, Gaussian mixture model, metaheuristic algorithm, image segmentation
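The EM estimation and posterior-based pixel assignment described above can be sketched for the one-dimensional (gray-level) case. In this minimal example, the hand-picked initial means stand in for the output of the Firefly Algorithm, and the two-cluster synthetic intensities are assumptions for illustration only.

```python
import math
import random

def gauss_pdf(x, mu, var):
    return math.exp(-((x - mu) ** 2) / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def em_gmm_1d(data, init_means, n_iter=100):
    """EM for a 1D Gaussian mixture, initialized from externally supplied cluster means."""
    k = len(init_means)
    m = sum(data) / len(data)
    mus = list(init_means)
    vars_ = [sum((x - m) ** 2 for x in data) / len(data)] * k
    pis = [1.0 / k] * k
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each intensity.
        resp = []
        for x in data:
            p = [pis[j] * gauss_pdf(x, mus[j], vars_[j]) for j in range(k)]
            s = sum(p)
            resp.append([pj / s for pj in p])
        # M-step: re-estimate mixing weights, means and variances.
        for j in range(k):
            nj = sum(r[j] for r in resp)
            pis[j] = nj / len(data)
            mus[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            vars_[j] = max(sum(r[j] * (x - mus[j]) ** 2
                               for r, x in zip(resp, data)) / nj, 1e-6)
    return pis, mus, vars_

def assign(x, pis, mus, vars_):
    """Assign a gray level to the component with maximum posterior probability."""
    return max(range(len(mus)), key=lambda j: pis[j] * gauss_pdf(x, mus[j], vars_[j]))

# Synthetic gray levels drawn from two clusters centred near 60 and 180.
rng = random.Random(1)
data = [rng.gauss(60, 10) for _ in range(300)] + [rng.gauss(180, 12) for _ in range(300)]
pis, mus, vars_ = em_gmm_1d(data, init_means=[50.0, 190.0])
print(sorted(round(mu) for mu in mus))
```

In the full method the same assignment rule is applied per pixel over the image, so the cost is dominated by one posterior evaluation per gray level rather than per-pixel optimization.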
Procedia PDF Downloads 217
14792 Embodied Empowerment: A Design Framework for Augmenting Human Agency in Assistive Technologies
Authors: Melina Kopke, Jelle Van Dijk
Abstract:
Persons with cognitive disabilities, such as Autism Spectrum Disorder (ASD), are often dependent on some form of professional support. Recent transformations in Dutch healthcare have spurred institutions to apply new, empowering methods and tools to enable their clients to cope (more) independently in daily life. Assistive Technologies (ATs) seem promising as empowering tools. While ATs can, functionally speaking, help people perform certain activities without human assistance, we hold that, from a design-theoretical perspective, such technologies often fail to empower in a deeper sense. Most technologies serve either to prescribe or to monitor users’ actions, which in some sense objectifies users rather than strengthening their agency. This paper proposes that theories of embodied interaction could help formulate a design vision in which interactive assistive devices augment, rather than replace, human agency and thereby add to a person’s empowerment in daily life settings. It aims to close the gap between empowerment theory and the opportunities provided by assistive technologies by showing how embodiment and empowerment theory can be applied in practice in the design of new, interactive assistive devices. Taking a research-through-design approach, we conducted a case study of designing support for independently living people with ASD in structuring their daily activities. In three iterations, we interlaced design action, active involvement and prototype evaluations with future end-users and healthcare professionals, and theoretical reflection. Our co-design sessions revealed that handling daily activities is a multidimensional issue. Not being able to self-manage one’s daily life has immense consequences for one’s self-image and also has major effects on the relationship with professional caregivers.
Over the course of the project, relevant theoretical principles of both embodiment and empowerment theory, together with user insights, informed our design decisions. This resulted in a system of wireless light units that users can program as reminders for tasks, but also use to record and reflect on their actions. The iterative process helped to gradually refine and reframe our growing understanding of what it concretely means for a technology to empower a person in daily life. Drawing on the case study insights, we propose a set of concrete design principles that together form what we call the embodied empowerment design framework. The framework includes four main principles: enabling ‘reflection-in-action’; making information ‘publicly available’ in order to enable co-reflection and social coupling; enabling the implementation of shared reflections in an ‘endurable external feedback loop’ embedded in the person’s familiar ‘lifeworld’; and nudging situated actions with self-created action-affordances. In essence, the framework aims for the self-development of a suitable routine, or ‘situated practice’, by building on a growing shared insight into what works for the person. The framework, we propose, may serve as a starting point for AT designers to create truly empowering interactive products. In a set of follow-up projects involving the participation of persons with ASD, intellectual disabilities, dementia and acquired brain injury, the framework will be applied, evaluated and further refined.
Keywords: assistive technology, design, embodiment, empowerment
Procedia PDF Downloads 278
14791 Disease Characteristics of Neurofibromatosis Type II and Cochlear Implantation
Authors: Boxiang Zhuang
Abstract:
This study analyzes the clinical manifestations, hearing rehabilitation methods and outcomes of a complex case of neurofibromatosis type II (NF2). Methods: The clinical manifestations, medical history, clinical data, surgical methods and postoperative hearing rehabilitation outcomes of an NF2 patient were analyzed to determine the hearing reconstruction method and postoperative effect for a special type of NF2 acoustic neuroma. Results: The patient had bilateral acoustic neuromas with profound sensorineural hearing loss in both ears. Peripheral blood genetic testing did not reveal pathogenic gene mutations, suggesting mosaicism. The patient had an intracochlear schwannoma in the right ear and severely impaired vision in both eyes. Cochlear implantation with tumor retention was performed in the right ear. After 2 months of family-based auditory and speech rehabilitation, the Categories of Auditory Performance (CAP) score improved from 0 to 5. Conclusion: NF2 has complex clinical manifestations and poor prognosis. For NF2 patients with intracochlear tumors, cochlear implantation with tumor retention can be used to reconstruct hearing.
Keywords: NF2, intracochlear schwannoma, hearing reconstruction, cochlear implantation
Procedia PDF Downloads 13
14790 Refined Edge Detection Network
Authors: Omar Elharrouss, Youssef Hmamouche, Assia Kamal Idrissi, Btissam El Khamlichi, Amal El Fallah-Seghrouchni
Abstract:
Edge detection is one of the most challenging tasks in computer vision, due to the difficulty of detecting edges or boundaries in real-world images that contain objects of different types and scales, like trees and buildings, against varied backgrounds. Edge detection is also a key task for many computer vision applications. Using a set of backbones as well as attention modules, deep-learning-based methods have improved edge detection compared with traditional methods like Sobel and Canny. However, images of complex scenes still represent a challenge for these methods, and the edges detected by existing approaches suffer from unrefined results, with output images containing many erroneous edges. To overcome this, in this paper, a refined edge detection network (RED-Net) is proposed using the mechanism of residual learning. By maintaining the high resolution of edges during the training process and conserving the resolution of the edge image throughout the network stages, the pooling outputs at each stage are connected with the output of the previous layer. Also, after each layer, an affine batch normalization layer is used as an erosion operation for homogeneous regions in the image. The proposed method is evaluated on the most challenging datasets, including BSDS500, NYUD, and Multicue. The obtained results outperform existing edge detection networks in terms of performance metrics and the quality of the output images.
Keywords: edge detection, convolutional neural networks, deep learning, scale-representation, backbone
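For reference, the traditional Sobel baseline that deep networks such as the one proposed here are compared against can be written in a few lines of plain Python. The synthetic step-edge image below is illustrative only and is not part of RED-Net.

```python
import math

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_magnitude(img):
    """Gradient magnitude of a 2D grayscale image (list of rows) via the Sobel kernels."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SOBEL_X[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = math.hypot(gx, gy)
    return out

# A synthetic vertical step edge: left half dark (0), right half bright (255).
img = [[0] * 4 + [255] * 4 for _ in range(8)]
edges = sobel_magnitude(img)
print(edges[4][1], edges[4][3])  # flat region vs. step: 0.0 1020.0
```

Filters of this kind respond only to local intensity gradients, which is precisely why they struggle with the textured, multi-scale scenes that motivate learned approaches.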
Procedia PDF Downloads 102
14789 The Impact of Ultrasonic Field to Increase the Biodegradability of Leachate from The Landfill
Authors: Kwarciak-Kozlowska A., Slawik-Dembiczak L., Galwa-Widera M.
Abstract:
The composition of landfill leachate is complex and varies over the operating life of the landfill, which prevents the use of a single universal purification method. Due to the presence of poorly biodegradable substances in this wastewater, its treatment often requires biological methods (activated sludge or anaerobic digestion), frequently supported by physicochemical processes. Currently, increasing attention is being paid to the development of unconventional wastewater treatment methods, among others advanced oxidation processes, including the use of ultrasonic waves. It was assumed that ultrasonic waves induce changes in the structure of organic compounds and accelerate the biodegradation of refractory substances in the leachate, thereby increasing the effectiveness of their treatment in biological processes. A marked increase in the BOD of the leachate was observed after exposure to the ultrasonic field: the BOD/COD ratio was 27% higher than that of the non-sonicated leachate. It was also found that sonication of the leachate clearly influenced the formation and release of aliphatic compounds. These changes suggest an alteration of the chemical structure of the organic compounds in the leachate, yielding compounds that are more susceptible to biodegradation.
Keywords: IR spectra, landfill leachate, organic pollutants, ultrasound
Procedia PDF Downloads 429