Search results for: map optimization tool
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7847

5447 About the Case Portfolio Management Algorithms and Their Applications

Authors: M. Chumburidze, N. Salia, T. Namchevadze

Abstract:

This work deals with case-processing problems in business. The task of strategic credit-requirements management for a portfolio of cases is discussed. An information model of credit requirements as a binary tree diagram is considered. Algorithms for prioritizing clusters of cases in business have been investigated, and an implementation of priority queues to support case-management operations is presented. The corresponding pseudocode for the programming application has been constructed. The tools applied in this development are based on binary-tree ordering algorithms, optimization theory, and business-management methods.
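
Priority-queue support for case-management operations of the kind described can be sketched with a binary heap. The case fields and priority rule below are hypothetical illustrations, since the paper provides only pseudocode:

```python
import heapq
import itertools

class CasePortfolio:
    """Priority queue of credit cases backed by a binary heap."""

    def __init__(self):
        self._heap = []
        self._order = itertools.count()  # tie-breaker for equal priorities

    def add_case(self, case_id, credit_requirement, risk):
        # Hypothetical rule: larger requirement * risk = more urgent;
        # heapq is a min-heap, so push the negated priority
        priority = -credit_requirement * risk
        heapq.heappush(self._heap, (priority, next(self._order), case_id))

    def next_case(self):
        """Pop and return the most urgent case id."""
        return heapq.heappop(self._heap)[2]

portfolio = CasePortfolio()
portfolio.add_case("C-101", credit_requirement=50_000, risk=0.2)
portfolio.add_case("C-102", credit_requirement=20_000, risk=0.9)
print(portfolio.next_case())  # C-102: highest requirement-weighted risk
```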

Keywords: credit network, case portfolio, binary tree, priority queue, stack

Procedia PDF Downloads 137
5446 The Reasons behind Individuals to Join Terrorist Organizations: Recruitment from Outside

Authors: Murat Sözen

Abstract:

Today terrorism is gaining momentum again. In parallel, it hurts more than before because its victims come not only from its own locations but also from remote places. Just as victims come from outside, militants likewise come both from the organization's own location and from outside. What makes these individuals join terrorist organizations, and how these organizations recruit militants, remain unanswered questions. The purpose of this work is to identify the reasons for joining and the power of recruiting. In addition, the role of the most popular recruiting tool, social media, will be examined.

Keywords: recruitment, social media, militants

Procedia PDF Downloads 341
5445 Application of the Standard Deviation in Regulating Design Variation of Urban Solutions Generated through Evolutionary Computation

Authors: Mohammed Makki, Milad Showkatbakhsh, Aiman Tabony

Abstract:

Computational applications of natural evolutionary processes as problem-solving tools have been well established since the mid-20th century. However, their application within architecture and design has only gained ground in recent years, with an increasing number of academics and professionals in the field electing to utilize evolutionary computation to address problems comprising multiple conflicting objectives with no clear optimal solution. Recent advances in computer science, and their consequent constructive influence on the architectural discourse, have led to the emergence of multiple algorithmic processes capable of simulating the evolutionary process in nature within an efficient timescale. Many of the developed processes for generating a population of candidate solutions to a design problem through an evolutionary stochastic search are driven by both environmental and architectural parameters. These methods allow conflicting objectives to be simultaneously, independently, and objectively optimized. This is an essential approach in design problems with a final product that must address the demands of a multitude of individuals with various requirements. However, one of the main challenges encountered in applying an evolutionary process as a design tool is the simulation's ability to maintain variation among design solutions in the population while simultaneously increasing in fitness. This is most commonly known as the 'golden rule' of balancing exploration and exploitation over time; the difficulty of achieving this balance lies in the tendency for either variation or optimization to be favored as the simulation progresses. In such cases, the generated population of candidate solutions has either converged very early in the simulation or has maintained such high levels of variation that an optimal set cannot be discerned, providing the user with a solution set that has not evolved efficiently towards the objectives outlined in the problem at hand. As such, the experiments presented in this paper seek to achieve the 'golden rule' by incorporating a mathematical fitness criterion for the development of an urban tissue composed of the superblock as its primary architectural element. The mathematical value investigated in the experiments is the standard deviation. Traditionally, the standard deviation has been used as an analytical value rather than a generative one, conventionally measuring the distribution of variation within a population by calculating the degree to which the population deviates from the mean. A lower standard deviation indicates that the majority of the population is clustered around the mean and thus that variation within the population is limited, while a higher standard deviation indicates greater variation within the population and a lack of convergence towards an optimal solution. The results presented aim to clarify the extent to which utilizing the standard deviation as a fitness criterion can be advantageous in generating fitter individuals in a more efficient timeframe when compared to conventional simulations that incorporate only architectural and environmental parameters.
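
A minimal sketch of how a population's standard deviation can enter an evolutionary loop as a generative criterion alongside a design objective is shown below; it uses a toy one-dimensional problem, not the urban superblock model, and the target spread and weights are illustrative assumptions:

```python
import random
import statistics

POP, GENS, SD_TARGET, W = 60, 100, 0.5, 2.0

def objective(x):  # toy design objective to maximize
    return -(x - 3.0) ** 2

population = [random.uniform(-10.0, 10.0) for _ in range(POP)]
for _ in range(GENS):
    mean = statistics.fmean(population)
    sd = statistics.stdev(population)
    # When population spread falls below the target, reward individuals
    # far from the mean so variation is maintained while fitness climbs
    bonus = W if sd < SD_TARGET else 0.0
    def fitness(x):
        return objective(x) + bonus * abs(x - mean)
    parents = sorted(population, key=fitness, reverse=True)[:POP // 2]
    population = [p + random.gauss(0.0, 0.3) for p in parents for _ in range(2)]

print(f"best = {max(population, key=objective):.3f}, "
      f"final sd = {statistics.stdev(population):.3f}")
```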

Keywords: architecture, computation, evolution, standard deviation, urban

Procedia PDF Downloads 129
5444 Development of Technologies for Biotransformation of Aquatic Biological Resources for the Production of Functional, Specialized, Therapeutic, Preventive, and Microbiological Products

Authors: Kira Rysakova, Vitaly Novikov

Abstract:

An improved method of obtaining enzymatic collagen hydrolysate from the tissues of marine hydrobionts is proposed, which makes it possible to obtain the hydrolysate without prior isolation of pure collagen. The method can be used to isolate enzymatic collagen hydrolysate from the waste of industrial processing of red king crab and from non-traditional sources such as marine holothurians. Comparative analysis of the collagen hydrolysates has shown that they can be used in a number of nutrient media, although this requires additional optimization of their composition and biological tests on a wide set of test strains of microorganisms.

Keywords: collagen hydrolysate, marine hydrobionts, red king crab, marine holothurians, enzymes, size-exclusion HPLC

Procedia PDF Downloads 161
5443 Optimization of Transmission Loss on a Series-Coupled Muffler by Taguchi Method

Authors: Jing-Fung Lin, Jer-Jia Sheu

Abstract:

In this study, an approach has been developed for the noise reduction of a muffler. The transmission loss (TL) of the muffler is maximized by using a double-chamber muffler with a baffle containing a hole inserted between the chambers. The Taguchi method is used to optimize the design for the acoustical performance of the muffler. TL performance is evaluated with COMSOL software. The best parameter combination attains a maximum TL as high as 35.30 dB over a wide frequency range from 10 Hz to 1400 Hz. The order of influence of the four parameters on TL is determined by range analysis. The effects of the length and expansion ratio of the first chamber on TL performance for the best configuration are discussed, and TL results from different designs are compared.
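
Range analysis in the Taguchi method ranks factors by the spread of their mean responses across levels. A minimal sketch over a hypothetical L4 orthogonal array with two-level factors is shown below; the TL values are illustrative, not the paper's COMSOL results:

```python
import numpy as np

# Hypothetical L4(2^3) orthogonal array: 3 two-level factors, 4 runs
design = np.array([[0, 0, 0],
                   [0, 1, 1],
                   [1, 0, 1],
                   [1, 1, 0]])
tl_db = np.array([28.1, 31.4, 33.0, 35.3])  # transmission loss per run

# Range analysis: for each factor, the range of mean TL across levels;
# a larger range means a stronger influence on TL
for f in range(design.shape[1]):
    means = [tl_db[design[:, f] == lvl].mean() for lvl in (0, 1)]
    print(f"factor {f}: level means = {np.round(means, 2)}, "
          f"range = {max(means) - min(means):.2f} dB")
```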

Keywords: acoustics, baffle, chamber, muffler, Taguchi method, transmission loss

Procedia PDF Downloads 110
5442 An Approach to Intelligent Tolerancing of Car Body Parts Based on Historical Measurement Data

Authors: Kai Warsoenke, Maik Mackiewicz

Abstract:

To achieve a high quality of assembled car body structures, tolerancing is used to ensure the geometric accuracy of the individual car body parts. There are two main techniques for determining the required tolerances. The first is tolerance analysis, which describes the influence of individually toleranced input values on a required target value. The second is tolerance synthesis, which determines the allocation of individual tolerances needed to achieve a target value. Both techniques are based on classical statistical methods, which assume certain probability distributions. To ensure competitiveness in both saturated and dynamic markets, production processes in vehicle manufacturing must be flexible and efficient. The dimensional specifications selected for the individual body components and the resulting assemblies have a major influence on the quality of the process, for example in the manufacturing of forming tools as operating equipment or at the higher level of car body assembly. As part of metrological process monitoring, manufactured individual parts and assemblies are recorded and the measurement results are stored in databases. They serve as information for the temporary adjustment of the production processes and are interpreted by experts in order to derive suitable adjustment measures. In the production of forming tools, this means that time-consuming and costly changes to the tool surface have to be made, while in the body shop, uncertainties that are difficult to control result in cost-intensive rework. The stored measurement results are not currently used to intelligently design tolerances in future processes or to support temporary decisions based on real-world geometric data, yet they offer the potential to extend tolerancing methods through data analysis and machine learning models. The purpose of this paper is to examine real-world measurement data from individual car body components, as well as assemblies, in order to develop an approach for using the data in short-term actions and future projects. To this end, the measurement data are first analyzed descriptively in order to characterize their behavior and to determine possible correlations. A database suitable for developing machine learning models is then created. The objective is an intelligent way of determining the position and number of measurement points as well as the local tolerance range. For this, a number of different model types are compared and evaluated, and the best-performing models are used to optimize equally distributed measuring points on unknown car body part geometries and to assign tolerance ranges to them. This investigation is still in progress. However, there are areas of the car body parts that behave more sensitively than the overall part, indicating that intelligent tolerancing is useful here for designing and controlling upstream and downstream processes more efficiently.
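
As an illustration of the kind of model comparison described, one can fit regressors on per-point measurement features to predict a local tolerance range and compare them by cross-validation. This is a sketch under assumed synthetic data, not the authors' actual pipeline or features:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Hypothetical historical data: one row per measurement point with
# geometric features (e.g., curvature, distance to joint, thickness)
# and the observed deviation spread used to set the local tolerance
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 3))
y = 0.3 * np.abs(X[:, 0]) + 0.1 * X[:, 2] ** 2 + rng.normal(0.0, 0.02, 500)

for model in (LinearRegression(), RandomForestRegressor(random_state=0)):
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{type(model).__name__}: mean R^2 = {scores.mean():.3f}")
```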

Keywords: automotive production, machine learning, process optimization, smart tolerancing

Procedia PDF Downloads 104
5441 International E-Learning for Assuring Ergonomic Working Conditions of Orthopaedic Surgeons: First Research Outcomes from Train4OrthoMIS

Authors: J. Bartnicka, J. A. Piedrabuena, R. Portilla, L. Moyano - Cuevas, J. B. Pagador, P. Augat, J. Tokarczyk, F. M. Sánchez Margallo

Abstract:

Orthopaedic surgeries are characterized by a high degree of complexity. This is reflected by four main groups of resources: 1) the surgical team, which consists of people with different competencies, educational backgrounds, and positions; 2) information and knowledge about the medical and technical aspects of surgery; 3) medical equipment, including surgical tools and materials; 4) the spatial infrastructure, which is important from an operating-room layout point of view. All these components must be integrated to form a homogeneous organism that achieves an efficient and ergonomically correct surgical workflow. Against this background, an international project was formulated, called 'Online Vocational Training course on ergonomics for orthopaedic Minimally Invasive Surgery' (Train4OrthoMIS), whose aim is to develop an e-learning tool available in four languages (English, Spanish, Polish, and German). The article presents the first project research outcomes, focused on three aspects: 1) the ergonomic needs of surgeons who work in hospitals in different European countries; 2) the concept and structure of the e-learning course; 3) the definition of tools and methods for knowledge assessment adjusted to users' expectations. The methodology was based on expert panels and two types of surveys: 1) on training needs; 2) on evaluation and self-assessment preferences. The major findings of the study allowed the subjects of four training modules and learning sessions to be described. According to respondents' opinions, the most expected test methods were defined: single-choice tests, followed by 'True or False' and 'Link elements' quizzes. The first project outcomes confirmed the necessity of creating a universal training tool for orthopaedic surgeons regardless of the country in which they work. Because of surgeons' limited time, the e-learning course should be strictly adjusted to their expectations in order to be useful.

Keywords: international e-learning, ergonomics, orthopaedic surgery, Train4OrthoMIS

Procedia PDF Downloads 177
5440 Flexible Coupling between Gearbox and Pump (High Speed Machine)

Authors: Naif Mohsen Alharbi

Abstract:

This paper presents a failure that occurred on a flexible coupling installed in an oil and gas operation, together with the maintenance ideas implemented on the coupling, which transmits high torque from a gearbox to a pump. The machine train comprises a steam turbine that drives the pump, with a gearbox located in between for speed reduction. Objective: The main objectives of the investigation are to identify the root causes, resolve the failure, and improve the design of the bad actor, ultimately sustaining operational productivity and ensuring better technology, quality, and design through the solutions. The study is intended to support continuous operation optimization, take advantage of opportunities, and implement improvements. Method: The method used in this project was a focused root cause analysis procedure that incorporated engineering analysis and measurements. The analysis extensively covered measurement of the complete coupling dimensions, including membrane thickness, hubs, bore diameter, and total length, and the flexible coupling was dismantled to diagnose how deeply it had been affected. Failure modes were defined so that the causes could be identified and verified; vibration analysis and metallurgical testing were also performed, and several solutions were applied using advanced tools (described in detail in the paper). Results and observations: Design capacity: the coupling capacity is inadequate to fulfil 100% of the operating conditions; therefore, a design modification raising the service factor to at least 2.07 is crucial to address this issue and prevent recurrence of a similar scenario, especially for the new upgrading project. Discharge fluctuation: high torque on the flexible coupling was encountered during operation; therefore, the discharge valve behaviour, tuning, set point, and general condition were re-evaluated and subsequently modified, and this can be used as a baseline for upcoming coupling design projects. Metallurgical test: the material of the flexible coupling membranes (discs) was tested in the laboratory in a detailed metallurgical investigation, and a better material grade was selected for the operating conditions.
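
A coupling service factor relates the coupling's rated torque to the actual operating torque. A minimal sketch of the check against the 2.07 minimum cited above, with hypothetical torque values not taken from the paper, is shown below:

```python
def service_factor(rated_torque_nm: float, operating_torque_nm: float) -> float:
    """Service factor = coupling rated torque / actual operating torque."""
    return rated_torque_nm / operating_torque_nm

# Hypothetical values for illustration only (not from the paper)
rated = 12_000.0     # coupling rated torque, N*m
operating = 5_000.0  # measured operating torque, N*m

sf = service_factor(rated, operating)
print(f"Service factor: {sf:.2f}")
if sf < 2.07:  # minimum recommended by the investigation
    print("Coupling undersized for this duty; select a higher rating.")
```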

Keywords: high speed machine, reliability, flexible coupling, rotating equipment

Procedia PDF Downloads 60
5439 Measuring Fundamental Growth Needs in a Youth Boatbuilding Context

Authors: Shane Theunissen, Rob Grandy

Abstract:

Historically, and fairly conventionally within our formal schooling systems, we have convergent testing, in which all students are expected to converge on the same answer, an answer determined by an external authority that is reproducing the knowledge of the hegemon. Many youths may not embody the cultural capital that is rewarded in formal schooling contexts, as they are not able to converge on the required answer determined by the classroom teacher or the administrators. In this paper, we explore divergent processes that promote creative problem-solving. We embody this divergent process in our measurement of fundamental growth needs. To this end, we utilize the Mosaic Approach as a method for implementing the Outcomes That Matter framework. Outcomes That Matter is a measurement tool built around the Circle of Courage framework, a way of identifying fundamental growth needs for young people. The Circle of Courage was developed by Martin Brokenleg and colleagues as a way to connect Indigenous child-rearing philosophies with contemporary resilience and positive-psychology research. The Outcomes That Matter framework puts forward four categories of growth needs for young people: Belonging, which on a macro scale is acceptance into the greater community of practice; Mastery, which includes a constellation of concepts including confidence, motivation, self-actualization, and self-determination; Independence, which refers to a sense of personal power and autonomy within a context where creativity, problem-solving, and a personal voice can begin to emerge; and finally Generosity, which includes interpersonal skills such as conflict resolution and teamwork. Outcomes That Matter puts these four domains into a measurement tool that facilitates collaborative assessment between youth, teachers, and recreation therapists and allows for youth-led narratives pertaining to their fundamental growth outcomes. This application of the Outcomes That Matter framework is unique, as it may be the first application of the framework in an educational boatbuilding context.

Keywords: collaboration, empowerment, outcomes that matter, mosaic approach, boat building

Procedia PDF Downloads 92
5438 Economic Evaluation of Degradation by Corrosion of an On-Grid Battery Energy Storage System: A Case Study in Algerian Territory

Authors: Fouzia Brihmat

Abstract:

Economic planning models, which are used to design microgrids and distributed energy resources (DER), are the current norm for expressing such confidence. These models often decide both short-term DER dispatch and long-term DER investments. This research investigates the most cost-effective hybrid (photovoltaic-diesel) renewable energy system (HRES), based on total net present cost (TNPC), in an Algerian Saharan area that has a high potential for solar irradiation and a production capacity of 1 GWh. Lead-acid batteries have been around much longer and are easier to understand but have limited storage capacity; lithium-ion batteries last longer and are lighter but are generally more expensive. By combining the advantages of each chemistry, cost-effective high-capacity battery banks that operate solely on AC coupling can be produced. The financial part of this research describes the corrosion process that occurs at the interface between the active material and the grid material of the positive plate of a lead-acid battery. The cost study for the HRES is completed with the assistance of the HOMER Pro MATLAB Link, and the system is simulated at each time step over the course of the project's 20 years. The model takes into consideration the decline in solar efficiency, changes in battery storage levels over time, and rises in fuel prices above the rate of inflation; the trade-off is that the model is more accurate but the computation takes longer. The Optimizer was initially used to run the model without MultiYear in order to discover the best system architecture. The optimal system for the single-year scenario is a 760 kW Danvest generator with 200 kWh of the necessary lead-acid storage and a somewhat lower cost of energy (COE) of $0.309/kWh. Different scenarios accounting for fluctuations in the gasified-biomass generator's electricity production have been simulated, and various strategies to guarantee the balance between generation and consumption have been investigated. The technological optimization of the same system has been completed and is reviewed in a recent companion paper.
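
Total net present cost discounts all capital and future cash flows over the project lifetime to present value. A minimal sketch of this calculation, with hypothetical cash-flow figures rather than the paper's HOMER model, is shown below:

```python
def total_net_present_cost(capital: float, annual_costs: list[float],
                           discount_rate: float) -> float:
    """TNPC = initial capital + sum of discounted annual cash flows."""
    npc = capital
    for year, cost in enumerate(annual_costs, start=1):
        npc += cost / (1.0 + discount_rate) ** year
    return npc

# Hypothetical 20-year project: O&M plus fuel, growing 2% above inflation
annual = [50_000 * 1.02 ** (y - 1) for y in range(1, 21)]
print(f"TNPC: ${total_net_present_cost(400_000, annual, 0.08):,.0f}")
```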

Keywords: battery, corrosion, diesel, economic planning optimization, hybrid energy system, lead-acid battery, multi-year planning, microgrid, price forecast, PV, total net present cost

Procedia PDF Downloads 81
5437 A Reduced Ablation Model for Laser Cutting and Laser Drilling

Authors: Torsten Hermanns, Thoufik Al Khawli, Wolfgang Schulz

Abstract:

In laser cutting, as well as in long-pulsed laser drilling of metals, it can be demonstrated that the ablation shape that forms (the shape of the cut faces and the hole shape, respectively) approaches a so-called asymptotic shape, which changes only slightly or not at all with further irradiation. These findings are already known from the ultrashort pulse (USP) ablation of dielectric and semiconducting materials. The explanation for the occurrence of an asymptotic shape in laser cutting and long-pulse drilling of metals is identified, and its underlying mechanism is numerically implemented, tested, and clearly confirmed by comparison with experimental data. In detail, there is now a model that allows the simulation of the temporal (pulse-resolved) evolution of the hole shape in laser drilling as well as the final (asymptotic) shape of the cut faces in laser cutting. This simulation requires so few resources that it can even run on common desktop PCs or laptops. Individual parameters can be adjusted using sliders; the simulation result appears in an adjacent window and changes in real time. This is made possible by an application-specific reduction of the underlying ablation model. Because this reduction dramatically decreases the complexity of the calculation, it produces a result much more quickly, which means the simulation can be carried out directly at the laser machine. Time-intensive experiments can be reduced and set-up processes completed much faster. The high speed of simulation also opens up a range of entirely different options, such as metamodeling. Suitable for complex applications with many parameters, metamodeling involves generating high-dimensional data sets with the parameters and several evaluation criteria for process and product quality. These sets can then be used to create individual process maps that show the dependency of individual parameter pairs. This advanced simulation makes it possible to find global and local extreme values through mathematical manipulation; such simultaneous optimization of multiple parameters is scarcely possible by experimental means, so new methods in manufacturing such as self-optimization can be executed much faster. However, the software's potential does not stop there; time-intensive calculations exist in many areas of industry. In laser welding or laser additive manufacturing, for example, the simulation of thermally induced residual stresses still uses up considerable computing capacity or is not even possible. Transferring the principle of reduced models promises substantial savings there, too.
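
Metamodeling of the kind described replaces an expensive simulation with a cheap surrogate fitted to a sampled parameter set, which can then be scanned densely for extreme values. A minimal sketch using a radial-basis-function surrogate over a hypothetical two-parameter stand-in model (not the authors' reduced ablation model) is shown below:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical stand-in for an expensive ablation simulation:
# quality as a function of laser power and feed rate (illustration only)
def simulate(params):
    power, feed = params[:, 0], params[:, 1]
    return np.sin(power) * np.exp(-0.5 * feed) + 0.1 * power

rng = np.random.default_rng(0)
X = rng.uniform([0.5, 0.1], [3.0, 2.0], size=(200, 2))  # sampled parameter pairs
y = simulate(X)

surrogate = RBFInterpolator(X, y)  # fit the metamodel once

# Dense grid evaluation is now cheap: scan for the best parameter pair
grid = np.stack(np.meshgrid(np.linspace(0.5, 3.0, 100),
                            np.linspace(0.1, 2.0, 100)), axis=-1).reshape(-1, 2)
pred = surrogate(grid)
best = grid[np.argmax(pred)]
print(f"Predicted optimum near power={best[0]:.2f}, feed={best[1]:.2f}")
```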

Keywords: asymptotic ablation shape, interactive process simulation, laser drilling, laser cutting, metamodeling, reduced modeling

Procedia PDF Downloads 209
5436 Understanding the First Mental Breakdown from the Families’ Perspective Through Metaphors

Authors: Eli Buchbinder

Abstract:

Introduction. Language is the basis to our experience as human being. We use language in describing our experiences and construct meaning and narratives from experiences. Metaphors are a valuable linguistic tool commonly use. Metaphors link two domains that are ordinarily not related. Metaphors achieve simultaneously multi-level integration: abstract and concrete, rational and imaginative, familiar and the unfamiliar, conscious and preconscious/unconscious. As such, metaphors epistemological and ontological tool that are important in social work in every field and domain. Goals and Methods The presentation’s aim is to validate the value of metaphors through the first psychiatric breakdown is a traumatic for families. The presentation is based on two pooled qualitative studies. The first study focused on 12 spouses: 7 women and 5 men, between the ages of 22 and 57, regarding their experiences and meanings of the first psychiatric hospitalization of their partners diagnosed with affective disorders. The second study focused on 10 parents, between the ages of 47 and 62, regarding their experiences and meanings following their child's first psychotic breakdown during young adulthood. Results Two types of major metaphors evolved from the interviews in farming the trauma of the first mental breakdown. The first mode - orientation (spatial) metaphors, reflect symbolic expression of the loss of a secure base, represented in the physical environment, e.g., describing hospitalization as "falling into an abyss." The second mode- ontological metaphors, reflect how parents and spouses present their traumatic experiences of hospitalization in terms of discrete, powerful and coherent entities, e.g., describing the first hospitalization as "swimming against the tide." The two metaphors modes reflect the embodiment of the unpredictability, being mired in distress, shock, intense pain and the experience the collapse of continuity on the life course and cuts off the experience of control. Conclusions Metaphors are important and powerful guide in assessing individuals and families’ phenomenological reality. As such, metaphors are useful for understanding and orientated therapeutic intervening, in the studies above, with the first psychiatric hospitalization experienced, as well as in others social workers’ interventions.

Keywords: first mental breakdown, metaphors, family perspective, qualitative research

Procedia PDF Downloads 68
5435 Mixed Integer Programming-Based One-Class Classification Method for Process Monitoring

Authors: Younghoon Kim, Seoung Bum Kim

Abstract:

One-class classification plays an important role in detecting outliers and abnormalities among normal observations. In previous research, several attempts were made to extend the scope of application of one-class classification techniques to statistical process control problems. For most previous approaches, such as the support vector data description (SVDD) control chart, the design of the control limits is commonly based on the assumption that the proportion of abnormal observations is approximately equal to an expected Type I error rate in the Phase I process. Because of the limitations of one-class classification techniques based on convex optimization, the proportion of abnormal observations cannot be made exactly equal to the expected Type I error rate: controlling the Type I error rate requires optimizing constraints with integer decision variables, which convex optimization cannot accommodate. This limitation is undesirable from both theoretical and practical perspectives when constructing effective control charts. In this work, to address the limitation of previous approaches, we propose a one-class classification algorithm based on mixed integer programming, which can solve problems formulated with both continuous and integer decision variables. The proposed method minimizes the radius of a spherically shaped boundary subject to the constraint that the number of enclosed normal observations equals a constant value specified by the user. By modifying this constant, users can exactly control the proportion of normal data described by the spherically shaped boundary; thus, the proportion of abnormal observations can be made theoretically equal to an expected Type I error rate in the Phase I process. Moreover, analogously to SVDD, the boundary can be made to describe complex structures by using kernel functions. A new multivariate control chart applying the algorithm is proposed. This chart uses a monitoring statistic that characterizes the degree to which a point is abnormal, as obtained through the proposed one-class classification, and its control limit is established by the radius of the boundary. The usefulness of the proposed method was demonstrated through experiments with simulated and real process data from a thin film transistor-liquid crystal display process.
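
The constraint structure described above suggests a formulation along the following lines. This is a sketch reconstructed from the abstract's description (spherical boundary, big-M indicator variables, a user-specified count of enclosed points), not the authors' exact model:

```latex
\begin{align*}
\min_{R^2,\ \mathbf{a},\ z}\quad & R^2 \\
\text{s.t.}\quad & \lVert \mathbf{x}_i - \mathbf{a} \rVert^2 \le R^2 + M(1 - z_i), \qquad i = 1,\dots,n, \\
& \textstyle\sum_{i=1}^{n} z_i = n_0, \\
& z_i \in \{0,1\}, \qquad i = 1,\dots,n,
\end{align*}
```

where z_i = 1 forces observation x_i inside the boundary of radius R centered at a, M is a large constant, and n_0 is the user-specified number of normal observations to enclose; choosing n_0 = ceil((1 - alpha) * n) fixes the Phase I Type I error rate at alpha.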

Keywords: control chart, mixed integer programming, one-class classification, support vector data description

Procedia PDF Downloads 170
5434 On the Optimization of a Decentralized Photovoltaic System

Authors: Zaouche Khelil, Talha Abdelaziz, Berkouk El Madjid

Abstract:

In this paper, we present a grid-tied photovoltaic system. The studied topology is structured around a seven-level inverter supplying a non-linear load. A three-stage step-up DC/DC converter ensures DC-link balancing. The presented system allows the extraction of all the available photovoltaic power. This extracted energy feeds the local load, and the surplus is injected into the electrical network. During poor weather conditions, when the photovoltaic panels cannot meet the energy needs of the load, the missing power is supplied by the electrical network. At the common connection point, the network current shows excellent spectral performance.

Keywords: seven-level inverter, multi-level DC/DC converter, photovoltaic, non-linear load

Procedia PDF Downloads 180
5433 Overview of Adaptive Spline Interpolation

Authors: Rongli Gai, Zhiyuan Chang

Abstract:

At this stage, in view of the various situations that arise in the interpolation process, most researchers use self-adaptation to adjust the interpolation process; this is one of the current and future research hotspots in the field of CNC machining. Based on an overview of spline-curve interpolation algorithms, an adaptive analysis is carried out of the factors affecting the interpolation process. Adaptive operation is reflected in various aspects, such as speed, parameters, errors, nodes, feed rates, random period, sensitive points, step size, curvature, adaptive segmentation, and adaptive optimization. This paper analyzes and summarizes research on adaptive interpolation with respect to the above factors.

Keywords: adaptive algorithm, CNC machining, interpolation constraints, spline curve interpolation

Procedia PDF Downloads 189
5432 Rhizobium leguminosarum: Selecting Strain and Exploring Delivery Systems for White Clover

Authors: Laura Villamizar, David Wright, Claudia Baena, Marie Foxwell, Maureen O'Callaghan

Abstract:

Leguminous crops can be self-sufficient for their nitrogen requirements when their roots are nodulated with an effective Rhizobium strain, and for this reason seed or soil inoculation is practiced worldwide to ensure nodulation and nitrogen fixation in grain and forage legumes. The most widely used method of applying commercially available inoculants is peat cultures coated onto seeds prior to sowing. In general, rhizobia survive well in peat, but some species die rapidly after inoculation onto seeds. The development of improved formulation methodology is essential to achieve extended persistence of rhizobia on seeds and improved efficacy. Formulations can be solid or liquid. The most popular solid formulations or delivery systems are wettable powders (WP), water-dispersible granules (WG), and granules (DG); liquid formulations are generally suspension concentrates (SC) or emulsifiable concentrates (EC). In New Zealand, R. leguminosarum bv. trifolii strain TA1 has been used as a commercial inoculant for white clover over wide areas for many years. Seed inoculation is carried out by mixing the seeds with inoculated peat, adherents, and lime, but rhizobial populations on stored seeds decline over several weeks due to a number of factors, including desiccation and antibacterial compounds produced by the seeds. In order to develop a more stable and suitable delivery system for introducing rhizobia into pastures, two strains of R. leguminosarum (TA1 and CC275e) and several formulations and processes were explored (peat granules, self-sticking peat for seed coating, emulsions, and a powder containing spray-dried microcapsules). Emulsions prepared with fresh broth of strain TA1 were very unstable in storage and after seed inoculation; formulations in which inoculated peat was used as the active ingredient were significantly more stable than those prepared with fresh broth. Strain CC275e was more tolerant of the stress conditions generated during formulation and seed storage. Peat granules and peat-inoculated seeds using strain CC275e maintained acceptable loadings of 10⁸ CFU/g of granules and 10⁵ CFU/g of seeds, respectively, during six months of storage at room temperature. Strain CC275e inoculated on peat was also microencapsulated with a natural biopolymer by spray drying; after operational conditions were optimized, microparticles containing 10⁷ CFU/g with a mean particle size between 10 and 30 micrometers were obtained. Survival of rhizobia during storage of the microcapsules is being assessed. The development of a stable product depends on selecting an active ingredient (microorganism) robust enough to tolerate the adverse conditions generated during formulation, storage, and commercialization, and after its use in the field. However, the design and development of an adequate formulation, using compatible ingredients, optimization of the formulation process, and selection of the appropriate delivery system are possibly the best tools to overcome the poor survival of rhizobia and provide farmers with better-quality inoculants.

Keywords: formulation, Rhizobium leguminosarum, storage stability, white clover

Procedia PDF Downloads 145
5431 Water Quality Trading with Equitable Total Maximum Daily Loads

Authors: S. Jamshidi, E. Feizi Ashtiani, M. Ardestani, A. Feizi Ashtiani

Abstract:

Waste load allocation (WLA) strategies usually aim to find economical policies for water resource management. Water quality trading (WQT) is an approach that uses a discharge permit market to reduce total environmental protection costs. This primarily requires assigning discharge limits known as total maximum daily loads (TMDLs), which are determined by monitoring organizations with respect to the receiving water quality and remediation capabilities. The purpose of this study is to compare two approaches to TMDL assignment for a WQT policy in the small catchment area of the Haraz River in northern Iran. First, TMDLs are assigned uniformly to all point sources to keep the concentrations of BOD and dissolved oxygen (DO) at the standard level at the checkpoint (terminus point); this was simulated and controlled with the Qual2kw software. In the second scenario, TMDLs are assigned using multi objective particle swarm optimization (MOPSO), in which environmental violations in the river basin and total treatment costs are minimized simultaneously. In both scenarios, the equity index and the WLA based on trading discharge permits (TDP) are calculated. The comparative results showed that using economically optimized TMDLs (second scenario) yields slightly more cost savings than the uniform TMDL approach (first scenario): the former costs about 1 M$ annually while the latter costs 1.15 M$. WQT can decrease these annual costs to 0.9 and 1.1 M$, respectively. In other words, these approaches may yield economic savings of 35% and 45% in comparison with a command-and-control policy. This means that using multi objective decision support systems (DSS) may find a more economical WLA, although the outcome is not necessarily significant in comparison with uniform TMDLs; this may be due to the similar impact factors of dischargers in small catchments. Conversely, using uniform TMDLs for WQT brings more equity, so stakeholders do not resent the difference between TMDL and WQT allocations; in addition, for this case, TMDLs determined uniformly would be much easier to monitor. Consequently, uniform TMDLs for a TDP market are recommended as a sustainable approach, while economical TMDLs can be used for larger watersheds.

Keywords: waste load allocation (WLA), water quality trading (WQT), total maximum daily loads (TMDLs), Haraz River, multi objective particle swarm optimization (MOPSO), equity

Procedia PDF Downloads 388
5430 Pedagogical Tools in the 21st Century

Authors: M. Aherrahrou

Abstract:

Moroccan education is currently facing many difficulties and problems due to traditional methods of teaching. Neuro-linguistic programming (NLP) appears to hold much potential for education at all levels. The major aim of this paper is to explore the effect of certain neuro-linguistic programming techniques in one educational institution in Morocco. Quantitative and qualitative methods are used. The findings demonstrate the effectiveness of this new approach in Moroccan education and show it to be a promising tool for improving the quality of learning.

Keywords: learning and teaching environment, neuro-linguistic programming, education, quality of learning

Procedia PDF Downloads 350
5429 Implementing Quality Function Deployment Tool for a Customer Driven New Product Development in a Kuwait SME

Authors: Asma AlQahtani, Jumana AlHadad, Maryam AlQallaf, Shoug AlHasan

Abstract:

New product development (NPD) is the complete process of bringing a new product to the customer by integrating two broad divisions: one involving idea generation, product design, and detail engineering; the other involving market research and marketing analysis. It is common practice for companies to undertake some of these tasks simultaneously (concurrent engineering) and also to treat them as an ongoing process (continuous development). The current study explores a framework and methodology for a new product development process utilizing the Quality Function Deployment (QFD) tool to bring the customer's opinion into the product development process. An elaborate customer survey with focus groups in the region was carried out to ensure that customer requirements are integrated into new products as early as the design stage, including identifying recognition of the need for the new product. A QFD matrix (House of Quality) was prepared that links customer requirements to product engineering requirements, and a feasibility study and risk assessment exercise were carried out for a small and medium enterprise (SME) in Kuwait for the development of the new product. SMEs in Kuwait, particularly in the manufacturing sector, are mainly focused on serving local demand, and lack of product quality often adversely affects their ability to compete on a regional or global basis. Further, a lack of focus on identifying customer requirements often deters SMEs from envisaging new product development. The current study therefore utilizes the QFD matrix from conceptual design through detail design and, to some extent, extends this link to the design of the manufacturing system. The outcome of the project was the development of a prototype for a new molded product that ensures consistency between the customer's requirements and the measurable characteristics of the product. Engineering economics and cost studies were also undertaken to analyse the viability of the new product, and their results were linked to the successful implementation of the initial QFD matrix.
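
In a House of Quality, each engineering characteristic's priority is typically the importance-weighted sum of its relationship strengths with the customer requirements. A minimal sketch with hypothetical requirements, weights, and relationships (not the Kuwait SME study's data) is shown below:

```python
import numpy as np

# Hypothetical customer requirements and their importance weights (1-5)
requirements = ["durable", "easy to clean", "low price"]
importance = np.array([5, 3, 4])

# Relationship matrix: rows = requirements, columns = engineering
# characteristics; entries use the usual 9/3/1/0 strength scale
characteristics = ["wall thickness", "surface finish", "material cost"]
relationships = np.array([
    [9, 1, 3],   # durable
    [1, 9, 0],   # easy to clean
    [3, 1, 9],   # low price
])

# Technical priority = importance-weighted column sums
priority = importance @ relationships
for name, score in sorted(zip(characteristics, priority),
                          key=lambda p: -p[1]):
    print(f"{name}: {score}")
```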

Keywords: Quality Function Deployment, QFD Matrix, new product development, NPD, Kuwait SMEs, prototype development

Procedia PDF Downloads 408
5428 Efficient Estimation of Maximum Theoretical Productivity from Batch Cultures via Dynamic Optimization of Flux Balance Models

Authors: Peter C. St. John, Michael F. Crowley, Yannick J. Bomble

Abstract:

Production of chemicals from engineered organisms in a batch culture typically involves a trade-off between productivity, yield, and titer. However, strategies for strain design typically involve designing mutations to achieve the highest yield possible while maintaining growth viability. Such approaches tend to follow the principle of designing static networks with minimum metabolic functionality to achieve desired yields. While these methods are computationally tractable, optimum productivity is likely achieved by a dynamic strategy in which intracellular fluxes change their distribution over time. One can use multi-stage fermentations to increase either productivity or yield; such strategies range from simple manipulations (an aerobic growth phase followed by an anaerobic production phase) to more complex genetic toggle switches, and computational methods can also be developed to aid in optimizing two-stage fermentation systems. One can assume an initial control strategy (i.e., a single reaction target) in maximizing productivity, but it is unclear how close this productivity would come to a global optimum. The calculation of maximum theoretical yield in metabolic engineering can help guide strain and pathway selection for static strain design efforts. Here, we present a method for the calculation of the maximum theoretical productivity of a batch culture system. This method follows the traditional assumptions of dynamic flux balance analysis: internal metabolite fluxes are governed by a pseudo-steady state, and external metabolite fluxes are represented by a dynamic system including Michaelis-Menten or Hill-type regulation. The productivity optimization is achieved via dynamic programming and accounts explicitly for an arbitrary number of fermentation stages and flux variable changes. We have applied our method to succinate production in two common microbial hosts: E. coli and A. succinogenes. The method can be further extended to calculate the complete productivity versus yield Pareto surface. Our results demonstrate that nearly optimal yields and productivities can indeed be achieved with only two discrete flux stages.
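
As a much-simplified illustration of the two-stage idea (not the authors' dynamic flux balance method), one can optimize the switch time between a growth phase and a production phase in a toy batch model and read off the resulting productivity; all rates and times below are hypothetical:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize_scalar

# Toy two-stage batch: growth phase (biomass grows, no product), then a
# production phase (growth stops, product forms). Hypothetical rates.
MU, QP, T_END, X0 = 0.4, 0.3, 24.0, 0.05  # 1/h, g/g/h, h, g/L

def final_titer(t_switch):
    def rhs(t, y):
        x, p = y
        growing = t < t_switch
        return [MU * x if growing else 0.0,
                0.0 if growing else QP * x]
    sol = solve_ivp(rhs, (0.0, T_END), [X0, 0.0], rtol=1e-8)
    return sol.y[1, -1]  # product concentration at batch end, g/L

# Productivity = titer / batch time; maximize over the switch time
res = minimize_scalar(lambda ts: -final_titer(ts) / T_END,
                      bounds=(0.1, T_END - 0.1), method="bounded")
print(f"Best switch at t = {res.x:.1f} h, "
      f"productivity = {-res.fun:.3f} g/L/h")
```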

Keywords: A. succinogenes, E. coli, metabolic engineering, metabolite fluxes, multi-stage fermentations, succinate

Procedia PDF Downloads 208
5427 Physicochemical Characterization of Mercerized Cellulose-Supported Nickel-Oxide

Authors: Sherif M. A. S. Keshk, Hisham S. M. Abd-Rabboh, Mohamed S. Hamdy, Ibrahim H. A. Badr

Abstract:

Microwave radiation was applied to synthesize nickel oxide nanoparticles supported on cellulose pretreated with metal acetate in the presence of NaOH. Optimization in terms of irradiation time and metal concentration was investigated. The FT-IR spectrum of the cellulose/NiO composite shows a band at 445 cm⁻¹ related to the Ni–O stretching vibration of NiO₆ octahedra in the cubic NiO structure. The cellulose/NiO composite showed an XRD pattern similar to that of cellulose I and exhibited a sharpened reflection peak at 2θ = 29.8°, corresponding to the (111) plane of NiO, with two weak broad peaks at 48.5° and 49.2°, representing (222) planes of NiO. The XPS spectrum of the mercerized cellulose/NiO composite did not show any peaks corresponding to Na ions.

Keywords: cellulose, mercerized cellulose, cellulose/zinc and nickel oxide composites, FTIR, XRD, XPS, SEM, Raman spectrum

Procedia PDF Downloads 436
5426 EcoTeka, an Open-Source Software for Urban Ecosystem Restoration through Technology

Authors: Manon Frédout, Laëtitia Bucari, Mathias Aloui, Gaëtan Duhamel, Olivier Rovellotti, Javier Blanco

Abstract:

Ecosystems must be resilient to ensure cleaner air, better water and soil quality, and thus healthier citizens. Technology can be an excellent tool to support urban ecosystem restoration projects, especially when based on open source and promoting open data. This is the goal of the ecoTeka application: a single digital tool for tree management that allows decision-makers to improve their urban forestry practices, enabling more responsible urban planning and climate change adaptation. EcoTeka provides city councils with three main functionalities tackling three of their challenges: easier biodiversity inventories, better green space management, and more efficient planning. To answer cities' need for reliable tree inventories, the application was first built with open data coming from the websites OpenStreetMap and OpenTrees, but it will also soon include the possibility of creating new data. To achieve this, a multi-source algorithm will be elaborated, based on the existing artificial intelligence Deep Forest, integrating open-source satellite images, 3D representations from LiDAR, and street views from Mapillary. This data processing will make it possible to identify each individual tree's position, height, crown diameter, and taxonomic genus. To support urban forestry management, ecoTeka offers a dashboard for monitoring the city's tree inventory and triggers alerts about upcoming interventions that are due. This tool was co-constructed with the green space departments of the French cities of Alès, Marseille, and Rouen. The third functionality of the application is a decision-making tool for urban planning, promoting biodiversity and landscape connectivity metrics to drive an ecosystem restoration roadmap. Based on landscape graph theory, we are currently experimenting with new methodological approaches for scaling regional ecological connectivity principles down to local biodiversity conservation and urban planning policies. This methodological framework will couple a graph-theoretic approach with biological data, mainly biodiversity occurrence (presence/absence) data available on international (e.g., GBIF), national (e.g., Système d'Information Nature et Paysage), and local (e.g., Atlas de la Biodiversité Communale) biodiversity data-sharing platforms, in order to inform new decisions for the conservation and restoration of ecological networks in urban areas. An experiment on this subject is currently ongoing with Montpellier Méditerranée Métropole. These projects and studies have shown that only 26% of tree inventory data is currently geo-localized in France; the rest is still kept on paper or in Excel sheets. It seems that technology is not yet used enough to enrich the knowledge city councils have about biodiversity in their cities, and that existing open biodiversity data (e.g., occurrence, telemetry, or genetic data), species distribution models, and landscape graph connectivity metrics are still underexploited in making rational decisions for landscape and urban planning projects. This is the goal of ecoTeka: to support easier inventories of urban biodiversity and better management of urban spaces through rational planning and decisions relying on open databases. Future studies and projects will focus on the development of tools for reducing soil artificialization, selecting plant species adapted to climate change, and highlighting the need for ecosystem and biodiversity services in cities.
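
Landscape-graph connectivity metrics of the kind mentioned score how well habitat patches are linked. A minimal sketch of one common metric, the Integral Index of Connectivity (IIC) of Pascual-Hortal and Saura (2006), computed over a toy patch graph with networkx, is shown below; the patch areas and links are illustrative, not ecoTeka's data:

```python
import networkx as nx

# Toy landscape graph: nodes are habitat patches (area in ha),
# edges join patches closer than the species' dispersal distance
G = nx.Graph()
areas = {"A": 12.0, "B": 5.0, "C": 8.0, "D": 3.0}
G.add_nodes_from(areas)
G.add_edges_from([("A", "B"), ("B", "C"), ("C", "D")])

AL = sum(areas.values())  # total habitat area, used here in place of
                          # total landscape area for simplicity

# IIC = sum_ij a_i * a_j / (1 + nl_ij) / AL^2, where nl_ij is the number
# of links on the shortest path between patches i and j; pairs in
# different components contribute nothing (distance = infinity)
lengths = dict(nx.all_pairs_shortest_path_length(G))
iic = sum(areas[i] * areas[j] / (1 + lengths[i].get(j, float("inf")))
          for i in areas for j in areas) / AL ** 2
print(f"IIC = {iic:.4f}")
```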

Keywords: digital software, ecological design of urban landscapes, sustainable urban development, urban ecological corridor, urban forestry, urban planning

Procedia PDF Downloads 65
5425 Study of the Stability of Underground Mines by Numerical Method: The Mine Chaabet El Hamra, Algeria

Authors: Nakache Radouane, M. Boukelloul, M. Fredj

Abstract:

Room and pillar sizes are key factors for safe mining and ore recovery in open-stope mining. The room-and-pillar method is advantageous due to its simplicity and the small amount of information required to use it. It is probably the most representative method among the total-load-approach methods, and it remains a safe design method. Using finite element software (PLAXIS 3D), analyses were carried out with an elasto-plastic model, and comparisons were made with methods based on the total load approach. The results are presented as an optimization that improves the ore recovery rate while maintaining a safe working environment.

Keywords: room and pillar, mining, total load approach, elasto-plastic

Procedia PDF Downloads 323
5424 The Use of Coronary Calcium Scanning for Cholesterol Assessment and Management

Authors: Eva Kirzner

Abstract:

Based on outcome studies published over the past two decades, in 2018 the ACC/AHA published new guidelines for the management of hypercholesterolemia that incorporate the use of coronary artery calcium (CAC) scanning as a decision tool for ascertaining which patients may benefit from statin therapy. This use is based on the recognition that the absence of calcium on CAC scanning (i.e., a CAC score of zero) usually signifies the absence of significant atherosclerotic deposits in the coronary arteries. Specifically, in patients with a high risk of atherosclerotic cardiovascular disease (ASCVD), initiation of statin therapy is generally recommended to decrease ASCVD risk, whereas among patients with intermediate ASCVD risk, the need for statin therapy is less certain. There is therefore a need for new outcome studies providing evidence that management of hypercholesterolemia based on these new ACC/AHA recommendations is safe for patients. Based on a PubMed and Google Scholar literature search, four relevant population-based or patient-based cohort studies, published between 2017 and 2021, that studied the relationship between CAC scanning, risk assessment or mortality, and statin therapy were identified (see references). In each of these studies, patients were assessed for their baseline ASCVD risk using the Pooled Cohort Equations (PCE), an ACC/AHA calculator that determines patient risk from age, gender, ethnicity, and coronary artery disease risk factors. The combined findings of these four studies provide concordant evidence that a zero CAC score defines patients who remain at low clinical risk despite the non-use of statin therapy. Thus, these new studies confirm the use of CAC scanning as a safe tool for reducing the potential overuse of statin therapy among patients with zero CAC scores. Incorporating these new data suggests the following best practice: (1) ascertain ASCVD risk according to the PCE in all patients; (2) following an initial attempt to lower ASCVD risk with an optimal diet in patients with elevated ASCVD risk, initiate statin therapy for patients who have a high ASCVD risk score; (3) if the ASCVD score is intermediate, refer patients for CAC scanning; and (4) if the CAC score is zero among intermediate-risk ASCVD patients, statin therapy can be safely withheld despite the presence of an elevated serum cholesterol level.

Keywords: cholesterol, cardiovascular disease, statin therapy, coronary calcium

Procedia PDF Downloads 105
5423 Online-Scaffolding-Learning Tools to Improve First-Year Undergraduate Engineering Students’ Self-Regulated Learning Abilities

Authors: Chen Wang, Gerard Rowe

Abstract:

The number of undergraduate engineering students enrolled in universities has been increasing rapidly in recent years, leading to challenges associated with increased student-instructor ratios and increased diversity in the academic preparedness of entrants. An increased student-instructor ratio makes interaction between teachers and students more difficult, and the resulting student 'anonymity' is a known risk to academic success. With increasing student numbers, there is also increasing diversity in the academic preparedness of students at entry to university. The conceptual understanding of entrants has been quantified via diagnostic testing, with the results for the first-year course in electrical engineering showing significant conceptual misunderstandings among the entry cohort. The solution is clearly multi-faceted, but part of it likely involves placing greater demands on students to be masters of their own learning. In consequence, it is highly desirable that instructors help students develop better self-regulated learning skills. A self-regulated learner is one who is capable of setting up their own learning goals, monitoring their study processes, adopting and adjusting learning strategies, and reflecting on their own study achievements. The methods by which instructors might cultivate students' self-regulated learning abilities are receiving increasing attention from instructors and researchers. The aim of this study was to help students fully understand their self-regulated learning skill levels and to provide targeted instruction to help them improve particular learning abilities in order to meet the curriculum requirements. As a survey tool, this research applied the Motivated Strategies for Learning Questionnaire (MSLQ) to collect first-year engineering students' self-reported data on their cognitive abilities, motivational orientations, and learning strategies. The MSLQ is a widely used questionnaire for assessing university students' self-regulated learning skills. The questionnaire was offered online as part of the online-scaffolding-learning tools, to develop students' understanding of self-regulated learning theories and learning strategies. The online tools, which have been under development since 2015, are designed to help first-year students understand their self-regulated learning skill levels by providing prompt feedback after they complete the questionnaire. In addition, the online tools supply corresponding learning strategies to students who want to improve specific learning skills. A total of 866 first-year engineering students enrolled in the first-year electrical engineering course were invited to participate in this research project. By the end of the course, 857 students had responded, and 738 of their questionnaires were considered valid. Analysis of these surveys showed that 66% of the students thought the online-scaffolding-learning tools helped significantly to improve their self-regulated learning abilities, and it was particularly pleasing that 16.4% of the respondents thought the tools were extremely effective. A current thrust of our research is to investigate the relationships between students' self-regulated learning abilities and their academic performance. Our results are being used by the course instructors as they revise the curriculum and pedagogy for this fundamental first-year engineering course, but the general principles we have identified are applicable to most first-year STEM courses.

Keywords: academic preparedness, online-scaffolding-learning tool, self-regulated learning, STEM education

Procedia PDF Downloads 106
5422 A Posterior Predictive Model-Based Control Chart for Monitoring Healthcare

Authors: Yi-Fan Lin, Peter P. Howley, Frank A. Tuyl

Abstract:

Quality measurement and reporting systems are used in healthcare internationally. In Australia, the Australian Council on Healthcare Standards records and reports hundreds of clinical indicators (CIs) nationally across the healthcare system. These CIs are measures of performance in the clinical setting and are used as a screening tool to help assess whether a standard of care is being met. Existing analysis and reporting of these CIs incorporate Bayesian methods to address sampling variation; however, such assessments are retrospective in nature, reporting upon the previous six or twelve months of data. The use of Bayesian methods within statistical process control for monitoring systems is an important pursuit to support more timely decision-making. Our research has developed and assessed a new graphical monitoring tool, similar to a control chart, based on the beta-binomial posterior predictive (BBPP) distribution to facilitate real-time assessment of healthcare organizational performance via CIs. The BBPP charts have been compared with the traditional Bernoulli CUSUM (BC) chart by simulation. The more traditional 'central' and 'highest posterior density' (HPD) interval approaches were each considered for defining the limits, and the charts were compared via in-control and out-of-control average run lengths (ARLs), assuming that the parameter representing the underlying CI rate (the proportion of cases with an event of interest) required estimation. Preliminary results indicate that the BBPP chart with HPD-based control limits provides better out-of-control run length performance than the central-interval-based and BC charts. Further, the BC chart's performance may be improved by using Bayesian estimation of the underlying CI rate.
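
With a Beta(a, b) posterior for the CI rate and n cases in the next monitoring period, the posterior predictive for the event count is beta-binomial. A minimal sketch of control limits from this distribution is shown below; the posterior parameters are hypothetical, and a central interval is used for brevity rather than the HPD interval the paper favours:

```python
from scipy.stats import betabinom

# Hypothetical posterior for the underlying CI rate after Phase I:
# Beta(a, b); n cases expected in the next monitoring period
a, b, n = 15, 285, 100           # posterior mean rate = 0.05
alpha = 0.0027                   # roughly 3-sigma false-alarm rate

pred = betabinom(n, a, b)        # beta-binomial posterior predictive
lcl = pred.ppf(alpha / 2)        # lower control limit (count scale)
ucl = pred.ppf(1 - alpha / 2)    # upper control limit

print(f"Plot the next period's event count against LCL={lcl:.0f}, UCL={ucl:.0f}")
```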

Keywords: average run length (ARL), Bernoulli CUSUM (BC) chart, beta-binomial posterior predictive (BBPP) distribution, clinical indicator (CI), healthcare organization (HCO), highest posterior density (HPD) interval

Procedia PDF Downloads 199
5421 I²C Master-Slave Integration

Authors: Rozita Borhan, Lam Kien Sieng

Abstract:

This paper describes an I²C slave implementation using an I²C master obtained from the OpenCores website, which provides free Verilog and VHDL code to users. The design implementation for the I²C slave is in Verilog and uses the ASIC design EDA tool ModelSim from Mentor Graphics for simulation and verification purposes. A common application of this I²C master-slave integration is also included, and the paper addresses the advantages and limitations of the design.

Keywords: I²C, master, OpenCores, slave, Verilog, verification

Procedia PDF Downloads 435
5420 Optimized Renewable Energy Mix for Energy Saving in Waste Water Treatment Plants

Authors: J. D. García Espinel, Paula Pérez Sánchez, Carlos Egea Ruiz, Carlos Lardín Mifsut, Andrés López-Aranguren Oliver

Abstract:

This paper briefly describes three main actions taken at a waste water treatment plant (WWTP) to reduce its energy consumption: optimization of the biological reactor in the aeration stage by including new control algorithms and introducing new efficient equipment; the installation of an innovative hybrid system with zero grid injection (formed by 100 kW of PV generation and 5 kW of mini-wind generation); and an intelligent management system for controlling load consumption and energy generation in the optimal way. This project, called RENEWAT and funded under the European Commission LIFE 2013 call, has the main objective of reducing energy consumption through different actions on the processes which take place in a WWTP and of introducing renewable energies in these treatment plants, with the purpose of promoting the use of treated waste water for irrigation and decreasing CO2 emissions. Treatment is always required before waste water can be reused for irrigation or discharged into water bodies. However, the energy demand of the treatment process is high enough to make the price of treated water exceed that of drinkable water. This makes it very difficult for any policy to encourage the reuse of treated water, with a great impact on the water cycle, particularly in areas suffering hydric stress or deficiency. The cost of treating waste water involves another climate-change-related burden: the energy necessary for the process is obtained mainly from the electricity network, which in most cases in Europe means energy obtained from the burning of fossil fuels. The innovative part of this project is based on the implementation, adaptation, and integration of solutions to this problem, together with a new concept of integrating energy input and operative energy demand. Moreover, there is an important qualitative jump between the technologies currently used and those proposed in the project, which gives it an innovative character, since there are no similar previous experiences of a WWTP including intelligent discrimination of energy sources, integrating renewable ones (PV and wind) and the grid.

Keywords: aeration system, biological reactor, CO2 emissions, energy efficiency, hybrid systems, LIFE 2013 call, process optimization, renewable energy sources, wasted water treatment plants

Procedia PDF Downloads 348
5419 Analysis of the Impacts of WeChat Mobile Payment on Chinese Teens' Online Purchasing Behaviors

Authors: Lok Yi Joyce Poon

Abstract:

China's mobile payment market has boomed in the past few years. WeChat (Chinese name: Weixin), owned by Tencent, is known as one of the fastest-growing all-in-one social messaging platforms. The company launched WeChat Pay in 2013, allowing users to link their credit card to their user account and make payments within the app's built-in digital wallet. WeChat Pay is a one-stop payment tool that provides a seamless online experience, letting shoppers transfer money between WeChat users (peer-to-peer) and make payments online by scanning a QR code, a prominent facilitator of transactions in WeChat, completing the payment within the app without being directed to external websites. The aims of this study are to examine the effectiveness of WeChat mobile payment in China as well as its impact on Chinese teens' online purchasing behavior since the establishment of WeChat Pay. The research was conducted through an online survey on Sojump, a popular online survey platform in China; a total of 120 respondents aged 18 to 25 completed the survey. Data sources included participants' responses to an end-of-session questionnaire encompassing multiple-choice and open-ended questions. For a more in-depth analysis, a face-to-face interview was conducted with a Chinese teen who is a frequent user of WeChat Pay. The main finding of the study is that the majority of teenagers frequently use the WeChat payment tool because of its convenience, user-friendliness, and the scenarios offered within the WeChat Wallet; respondents claimed that they settle the bills of their daily lives via WeChat Pay. However, respondents in the age group of 40 or above would not use WeChat Pay due to security concerns, and they do not see the app as a platform for commercial activities like online shopping. Based on the study, it is recommended that WeChat put more effort into security and improve its payment technology by adopting near-field communication terminals instead of requiring users to scan QR codes to complete transactions.

Keywords: digital wallet, mobile payment, online purchasing behavior, WeChat Pay

Procedia PDF Downloads 386
5418 An Approximation Technique to Automate Tron

Authors: P. Jayashree, S. Rajkumar

Abstract:

With virtual and augmented reality environments booming to provide life-like experiences, gaming is a major tool in supporting such learning environments. In this work, a variant of Voronoi heuristics employing supervised learning for the game TRON is proposed. The paper discusses the features that would be really useful when a machine-learning bot is used as an opponent against a human player. Various game scenarios, the nature of the bot, and experimental results are provided for the proposed variant to show that the approach outperforms those currently followed.
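
The classic Voronoi heuristic for TRON scores a position by how many free cells each player can reach first. A minimal sketch of that scoring function via two breadth-first searches is shown below; it stands in for the paper's learned variant, whose features are not specified, and the toy arena is illustrative:

```python
from collections import deque

def bfs_dist(grid, start):
    """Shortest path lengths from start over free cells ('.')."""
    dist = {start: 0}
    q = deque([start])
    while q:
        r, c = q.popleft()
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == "." and (nr, nc) not in dist):
                dist[(nr, nc)] = dist[(r, c)] + 1
                q.append((nr, nc))
    return dist

def voronoi_score(grid, me, opponent):
    """Cells I reach first minus cells the opponent reaches first."""
    mine, theirs = bfs_dist(grid, me), bfs_dist(grid, opponent)
    score = 0
    for cell in set(mine) | set(theirs):
        d_me = mine.get(cell, float("inf"))
        d_op = theirs.get(cell, float("inf"))
        score += (d_me < d_op) - (d_op < d_me)
    return score

# Toy 4x4 arena: '.' free, '#' wall; a higher score favours the bot
arena = ["....", ".#..", "....", "...."]
print(voronoi_score(arena, me=(0, 0), opponent=(3, 3)))
```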

Keywords: artificial Intelligence, automation, machine learning, TRON game, Voronoi heuristics

Procedia PDF Downloads 461