Search results for: heuristic procedures
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1967

1697 An Effort at Improving Reliability of Laboratory Data in Titrimetric Analysis for Zinc Sulphate Tablets Using Validated Spreadsheet Calculators

Authors: M. A. Okezue, K. L. Clase, S. R. Byrn

Abstract:

The requirement for maintaining data integrity in laboratory operations is critical for regulatory compliance. Automation of procedures reduces the incidence of human error. Quality control laboratories located in low-income economies may face barriers in attempts to automate their processes. Since data from quality control tests on pharmaceutical products are used in making regulatory decisions, it is important that laboratory reports are accurate and reliable. Zinc Sulphate (ZnSO4) tablets are used in the treatment of diarrhea in the pediatric population and as adjunct therapy in COVID-19 regimens. Unfortunately, the zinc content in these formulations is determined titrimetrically, a manual analytical procedure. The assay for ZnSO4 tablets involves time-consuming steps that contain mathematical formulae prone to calculation errors. To achieve consistency, save costs, and improve data integrity, validated spreadsheets were developed to simplify the two critical steps in the analysis of ZnSO4 tablets: standardization of the 0.1 M Sodium Edetate (EDTA) solution, and the complexometric titration assay procedure. The assay method in the United States Pharmacopoeia was used to create a process flow for ZnSO4 tablets. For each step in the process, different formulae were input into two spreadsheets to automate the calculations. Further checks were built into the automated system to ensure the validity of replicate analyses in titrimetric procedures. Validations were conducted using five data sets of manually computed assay results. The acceptance criteria set for the protocol were met. Significant p-values (p < 0.05, α = 0.05, at a 95% Confidence Interval) were obtained from Student's t-test evaluation of the mean values for manually calculated and spreadsheet results at all levels of the analysis flow. Right-first-time analysis and the principles of data integrity were enhanced by the use of the validated spreadsheet calculators in titrimetric evaluations of ZnSO4 tablets.
Human error in calculations was minimized when procedures were automated in quality control laboratories. The assay procedure for the formulation was completed in a time-efficient manner with a greater level of accuracy. This project is expected to promote cost savings for laboratory business models.
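The two automated steps reduce to a handful of formulas. As a hedged sketch of the kind of calculation such a spreadsheet encodes (the molar masses below are standard values for zinc and zinc sulphate heptahydrate, but the 2% RSD limit and the sample handling are illustrative placeholders, not the validated protocol's parameters):

```python
import statistics

# Illustrative constants; the acceptance limit is an assumption, not the
# validated protocol's value.
ZN_MW = 65.38            # g/mol, zinc
ZNSO4_7H2O_MW = 287.56   # g/mol, zinc sulphate heptahydrate

def edta_molarity(zn_mass_g, titre_ml):
    """Standardization: Zn and EDTA complex 1:1, so M(EDTA) = mol Zn / L titrant."""
    return (zn_mass_g / ZN_MW) / (titre_ml / 1000.0)

def assay_percent(titre_ml, edta_m, sample_fraction, label_mg):
    """Complexometric assay: mg ZnSO4.7H2O found vs. the label claim per tablet."""
    mg_found = (titre_ml / 1000.0) * edta_m * ZNSO4_7H2O_MW * 1000.0
    return 100.0 * (mg_found / sample_fraction) / label_mg

def replicates_valid(values, max_rsd_percent=2.0):
    """Replicate check: reject the run if the relative standard deviation is high."""
    rsd = 100.0 * statistics.stdev(values) / statistics.mean(values)
    return rsd <= max_rsd_percent
```

For example, 0.6538 g of zinc consuming 100.0 mL of titrant gives exactly 0.1 M EDTA, and a replicate set of 99.0/99.5/100.1% passes the 2% RSD gate while 90/100/110% fails it.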

Keywords: data integrity, spreadsheets, titrimetry, validation, zinc sulphate tablets

Procedia PDF Downloads 151
1696 OptiBaha: Design of a Web Based Analytical Tool for Enhancing Quality of Education at AlBaha University

Authors: Nadeem Hassan, Farooq Ahmad

Abstract:

The quality of education has a direct impact on the individual, the family, society, the economy, and mankind as a whole. Because of this, thousands of research papers and articles have been written on the quality of education, and billions of dollars have been and continue to be spent on research and on enhancing the quality of education. Academic program accreditation agencies define the various criteria for quality of education; academic institutions obtain accreditation from these agencies to ensure that the degree programs offered at their institution meet international standards. This R&D aims to build a web-based analytical tool (OptiBaha) that finds the gaps in the AlBaha University education system by taking input from stakeholders, including students, faculty, staff, and management. The input/online data collected by this tool will be analyzed on core areas of education as proposed by accreditation agencies, CAC of ABET and NCAAA of KSA, including student background, language, culture, motivation, curriculum, teaching methodology, assessment and evaluation, performance and progress, facilities, availability of teaching materials, faculty qualification, monitoring, policies and procedures, and more. Based on different analytical reports, gaps will be highlighted and remedial actions proposed. If the tool is implemented and made available through a continuous process, the quality of education at AlBaha University can be enhanced; it will also help in fulfilling the criteria of accreditation agencies. The tool will be generic in nature and ultimately can be used by any academic institution.

Keywords: academic quality, accreditation agencies, higher education, policies and procedures

Procedia PDF Downloads 279
1695 Evaluation of Cardiac Rhythm Patterns after Open Surgical Maze-Procedures from Three Years' Experiences in a Single Heart Center

Authors: J. Yan, B. Pieper, B. Bucsky, H. H. Sievers, B. Nasseri, S. A. Mohamed

Abstract:

In order to optimize the efficacy of medications, regular follow-up with long-term continuous monitoring of heart rhythm patterns has been facilitated since the clinical introduction of cardiac implantable electronic monitoring devices (CIMD). Extensive analysis of rhythmic circadian properties can disclose the distribution of arrhythmic events, which may support appropriate medication according to a rate- or rhythm-control strategy and minimize consequent afflictions. 348 patients (69 ± 0.5 ys, male 61.8%) with predisposed atrial fibrillation (AF), undergoing primary ablation therapies combined with coronary or valve operations and secondary implantation of CIMDs, were involved and divided into 3 groups: PAAF (paroxysmal AF) (n=99, male 68.7%), PEAF (persistent AF) (n=94, male 62.8%), and LSPEAF (long-standing persistent AF) (n=155, male 56.8%). All patients participated in a three-year ambulant follow-up (3, 6, 9, 12, 18, 24, 30, and 36 months). The burden of atrial fibrillation recurrence was assessed using the cardiac monitoring devices, whereby attack frequencies and their circadian patterns were systematically analyzed. Anticoagulants and regular anti-arrhythmic medications were evaluated, the latter classified into rate-control and rhythm-control regimens. Patients in the PEAF group showed the least AF burden after the surgical ablation procedures compared with the other two subtypes (p < 0.05). The recurrent AF attacks were predominantly shorter than one hour, mostly within 10 minutes (p < 0.05), regardless of AF subtype. Concerning the circadian distribution of the recurrence attacks, frequent AF attacks were mostly recorded in the morning in the PAAF group (p < 0.05), while patients with predisposed PEAF reported fewer attack-induced discomforts in the latter half of the night, and those with LSPEAF only when they were not physically active after the primary surgical ablation.
Different AF subtypes presented distinct therapeutic efficacies after appropriate surgical ablation procedures, and distinct recurrence properties in terms of circadian distribution. Optimizing the medical regimen and drug dosages to maintain therapeutic success requires closer attention to detailed assessment in the long-term follow-up. A rate-control strategy plays a much more important role than rhythm-control in the ongoing follow-up examinations.

Keywords: atrial fibrillation, CIMD, MAZE, rate-control, rhythm-control, rhythm patterns

Procedia PDF Downloads 133
1694 From Bureaucracy to Organizational Learning Model: An Organizational Change Process Study

Authors: Vania Helena Tonussi Vidal, Ester Eliane Jeunon

Abstract:

This article aims to analyze the process of change from a bureaucratic management model to a learning organization model. The theoretical framework was based on the Beer and Nohria (2001) model, known as Theory E and Theory O. Based on this theory, empirical research was conducted on six key dimensions: goals, leadership, focus, process, reward systems, and consulting. We used a case study of an educational institution located in Barbacena, Minas Gerais. This traditional center of technical knowledge had long adopted a bureaucratic style of management. After many changes in the business model, such as the creation of graduate and undergraduate courses, it decided to make a deep change in its management model, which is our research focus. The data were collected through semi-structured interviews with the director, managers, and course supervisors. The analyses were processed using the Collective Subject Discourse (CSD) method, developed by Lefèvre & Lefèvre (2000). Results showed the incremental evolution of the management model toward a learning organization. Many impacts could be seen. Negative factors included: employee resistance; poor information about the planning and implementation process; old politics persisting inside the new model; and so on. Positive impacts included: new procedures in human resources, mainly related to managerial skills and empowerment; structural downsizing; open discussion channels; and an integrated information system. The process is still under construction, and strong emphasis is now placed on manager and employee commitment to the process.

Keywords: bureaucracy, organizational learning, organizational change, E and O theory

Procedia PDF Downloads 410
1693 Quality Approaches for Mass-Produced Fashion: A Study in Malaysian Garment Manufacturing

Authors: N. J. M. Yusof, T. Sabir, J. McLoughlin

Abstract:

The garment manufacturing industry involves sequential processes that are subject to uncontrollable variations. The industry depends on the skill of labour in handling a variety of fabrics and accessories, machines, and a complicated sewing operation. For these reasons, garment manufacturers have created systems to monitor and control product quality regularly, conducting quality approaches to minimize variation. The aims of this research were to ascertain the quality approaches deployed by Malaysian garment manufacturers in three key areas: quality systems and tools; quality control and types of inspection; and sampling procedures chosen for garment inspection. This research also aimed to distinguish the quality approaches used by companies that supplied finished garments to domestic and international markets. Feedback from each company's representative was obtained using an online survey, which comprised five sections and 44 questions on the organizational profile and the quality approaches used in the garment industry. The results revealed that almost all companies had established their own mechanism of process control by conducting a series of quality inspections of daily production, whether formally set up or not. Quality inspection was the predominant quality control activity in garment manufacturing, and the level of complexity of these activities was substantially dictated by the customers. AQL-based sampling was utilized by companies dealing with the export market, whilst almost all the companies that concentrated only on the domestic market were comfortable using their own sampling procedures for garment inspection. This research provides an insight into the implementation of quality approaches that were perceived as important and useful in the garment manufacturing sector, which is truly labour-intensive.
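The accept/reject logic behind AQL-based sampling can be sketched briefly; in real plans the sample size and acceptance number come from standard tables (e.g. ANSI/ASQ Z1.4), so the numbers in the usage example are placeholders:

```python
from math import comb

def accept_lot(defects_found, acceptance_number):
    """Single-sampling plan decision: accept the lot if defects <= Ac."""
    return defects_found <= acceptance_number

def prob_accept(sample_size, defect_rate, acceptance_number):
    """Operating-characteristic (OC) value: binomial probability that a lot
    with the given defect rate is accepted by the plan."""
    return sum(
        comb(sample_size, k) * defect_rate**k * (1 - defect_rate) ** (sample_size - k)
        for k in range(acceptance_number + 1)
    )
```

A plan with an acceptance number Ac = 3, say, accepts a sample containing 2 defective garments and rejects one containing 4; the OC function shows how that plan discriminates between good and bad lots.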

Keywords: garment manufacturing, quality approaches, quality control, inspection, Acceptance Quality Limit (AQL), sampling

Procedia PDF Downloads 418
1692 HLB Disease Detection in Omani Lime Trees using Hyperspectral Imaging Based Techniques

Authors: Jacintha Menezes, Ramalingam Dharmalingam, Palaiahnakote Shivakumara

Abstract:

In recent years, Omani acid lime cultivation and production has been affected by citrus greening, or Huanglongbing (HLB), disease. HLB is one of the most destructive diseases of citrus, with no remedies or countermeasures to stop it. The currently used polymerase chain reaction (PCR) and enzyme-linked immunosorbent assay (ELISA) HLB detection tests require lengthy and labor-intensive laboratory procedures. Furthermore, the equipment and staff needed to carry out these procedures are frequently specialized, making them a less optimal solution for detecting the disease. The current research uses hyperspectral imaging technology for the automatic detection of citrus trees with HLB disease. Omani citrus tree leaf images were captured with a portable Specim IQ hyperspectral camera. The research considered healthy, nutrition-deficient, and HLB-infected leaf samples, labeled on the basis of the PCR test. The high-resolution image samples were sliced into sub-cubes. The sub-cubes were further processed to obtain RGB images with spatial features. Similarly, RGB spectral slices were obtained through a moving window along the wavelength axis. The resized spectral-spatial RGB images were given to a convolutional neural network for deep feature extraction. The approach classified a given sample into the appropriate class with 92.86% accuracy, indicating the effectiveness of the proposed techniques. The significant bands that differ across the three types of leaves were found to be 560 nm, 678 nm, 726 nm, and 750 nm.
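The moving-window slicing described above is straightforward to express. A minimal sketch, with plain nested lists standing in for the hypercube and a window width of 3 so each slice maps onto pseudo-RGB channels (the function and parameter names are ours, not the authors'):

```python
def spectral_rgb_slices(cube, window=3, stride=1):
    """cube is a nested list indexed [row][col][band].

    Slide a `window`-band window along the wavelength axis; each step yields
    one pseudo-RGB slice (window=3 -> three channels per pixel), the kind of
    spectral slice that can then be fed to a CNN.
    """
    bands = len(cube[0][0])
    slices = []
    for start in range(0, bands - window + 1, stride):
        sl = [[px[start:start + window] for px in row] for row in cube]
        slices.append(sl)
    return slices
```

For a 2x2 image with 5 bands and a window of 3, this yields three 2x2 slices with 3 channels each.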

Keywords: huanglongbing (HLB), hyperspectral imaging (HSI), Omani citrus, CNN

Procedia PDF Downloads 52
1691 Particle Swarm Optimization Based Method for Minimum Initial Marking in Labeled Petri Nets

Authors: Hichem Kmimech, Achref Jabeur Telmoudi, Lotfi Nabli

Abstract:

The estimation of the minimum initial marking (MIM) is a crucial problem in labeled Petri nets. In the case of multiple choices, the search for the initial marking leads to an optimization problem: minimizing the allocation of resources under two constraints. The first requires that the firing sequence be legal on the initial marking with respect to the firing vector. The second requires that the total number of tokens be minimal. In this article, the MIM problem is solved by the meta-heuristic particle swarm optimization (PSO). The proposed approach exploits the advantages of PSO to satisfy the two previous constraints and to find all possible combinations of minimum initial markings in the best computing time. This method, more efficient than conventional ones, has an excellent impact on the resolution of the MIM problem. We prove the effectiveness of our approach through a set of definitions, lemmas, and examples.
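The paper's MIM-specific encoding is not reproduced here, but the underlying optimizer is standard. A minimal continuous PSO of the kind the authors adapt to the token-allocation search might look like:

```python
import random

def pso(objective, dim, bounds, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Vanilla PSO: each particle tracks its personal best, the swarm tracks
    a global best, and velocities blend inertia with both attractions."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

On the 2-D sphere function the swarm collapses toward the origin within a few dozen iterations; for the MIM problem the objective would instead score a candidate token allocation against the two constraints above.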

Keywords: marking, production system, labeled Petri nets, particle swarm optimization

Procedia PDF Downloads 151
1690 Occurrence of Foreign Matter in Food: Applied Identification Method - Association of Official Agricultural Chemists (AOAC) and Food and Drug Administration (FDA)

Authors: E. C. Mattos, V. S. M. G. Daros, R. Dal Col, A. L. Nascimento

Abstract:

The aim of this study is to present the results of a retrospective survey on foreign matter found in foods analyzed at the Adolfo Lutz Institute from July 2001 to July 2015. All the analyses were conducted according to the official methods described by the Association of Official Agricultural Chemists (AOAC) for the microanalytical procedures and by the Food and Drug Administration (FDA) for the macroanalytical procedures. The results showed that flours, cereals, and derivatives such as baking and pasta products were the types of food in which foreign matter was found most frequently, followed by condiments and teas. Fragments of stored-grain insects, their larvae, webbing, and excrement, dead mites, and rodent excrement were the foreign matter most often found in food. Foreign matter that can pose a physical risk to the consumer's health, such as metal, stones, glass, and wood, was found only rarely. Miscellaneous matter (shell, sand, dirt, and seeds) was also reported. Many extraneous materials are considered unavoidable since they are inherent to the product itself, such as insect fragments in grains. In contrast, avoidable extraneous materials are less tolerated because they are preventable with Good Manufacturing Practice. The conclusion of this work is that, although most extraneous materials found in food are considered unavoidable, it is necessary to maintain Good Manufacturing Practice throughout food processing, as well as constant surveillance of the production process, in order to avoid accidents that may lead to the occurrence of these extraneous materials in food.

Keywords: extraneous materials, food contamination, foreign matter, surveillance

Procedia PDF Downloads 337
1689 The Professional Rehabilitation of Workers Affected by Chronic Low Back Pain in 'Baixada Santista' Region, Brazil

Authors: Maria Do Carmo Baracho De Alencar

Abstract:

Back pain is considered a worldwide public health problem; it has led to numerous work absences and to public spending on rehabilitation, as well as to difficulties in the process of professional rehabilitation and return to work. The rehabilitation of workers is also one of the great current challenges for the field of workers' health in Brazil. Aim: To investigate the procedures related to the professional rehabilitation of insured workers affected by chronic low back pain, based on the perceptions of professional counselors. Methods: A list of professional counselors, and the cities in which they worked, was obtained from the Professional Rehabilitation Coordination of the Baixada Santista (SP) region and from the Social Security National Institute of Brazil. Semi-structured individual interviews were scheduled, based on a pre-elaborated script containing questions about procedures, experiences at work, and feelings. The interviews were recorded and transcribed in full for content analysis. Results: Ten (10) professional counselors of both genders, from nine (9) cities of the Baixada Santista region, participated in the study. They were aged between 31 and 64 years, with time in service between 4 and 38 years. Only one of the professionals had a degree in Psychology. Among the testimonies emerged the high work demand, the lack of interest of companies, medical authority, the social helplessness after the rehabilitation process, the difficulty of assessing invisible pain, and suffering, anguish, and frustration at work, among others. Conclusion: The study contributes to reflections on the importance of interdisciplinary actions, and of Psychology, in the processes of professional rehabilitation and readaptation in the return to work.

Keywords: low back pain, rehabilitation, work, occupational health

Procedia PDF Downloads 111
1688 A New Tool for Global Optimization Problems: Cuttlefish Algorithm

Authors: Adel Sabry Eesa, Adnan Mohsin Abdulazeez Brifcani, Zeynep Orman

Abstract:

This paper presents a new meta-heuristic bio-inspired optimization algorithm called the Cuttlefish Algorithm (CFA). The algorithm mimics the mechanism of the cuttlefish's color-changing behavior to solve numerical global optimization problems. The colors and patterns of the cuttlefish are produced by light reflected from three different layers of cells. The proposed algorithm considers mainly two processes: reflection and visibility. The reflection process simulates the light reflection mechanism used by these layers, while the visibility process simulates the visibility of matching patterns of the cuttlefish. To show the effectiveness of the algorithm, it is compared with other popular bio-inspired optimization algorithms previously proposed in the literature, such as Genetic Algorithms (GA), Particle Swarm Optimization (PSO), and the Bees Algorithm (BA). Simulations and the obtained results indicate that the proposed CFA is superior when compared with these algorithms.
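As a hedged sketch of the two processes, the following is a simplification of the published CFA that keeps only a reflection degree R and a visibility degree V with greedy acceptance, run on a continuous test function; it is not the authors' full multi-group scheme:

```python
import random

def cuttlefish(objective, dim, bounds, n=30, iters=200,
               r=(-1.0, 1.0), v=(-1.0, 1.0), seed=0):
    """Simplified CFA sketch: each candidate mixes a reflection of the current
    solution (R * x) with visibility of the best pattern (V * (best - x));
    improvements are kept greedily."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    best = min(pop, key=objective)[:]
    best_val = objective(best)
    for _ in range(iters):
        for i in range(n):
            R = rng.uniform(*r)   # reflection degree
            V = rng.uniform(*v)   # visibility degree
            cand = [min(hi, max(lo, R * pop[i][d] + V * (best[d] - pop[i][d])))
                    for d in range(dim)]
            if objective(cand) < objective(pop[i]):
                pop[i] = cand
            if objective(pop[i]) < best_val:
                best, best_val = pop[i][:], objective(pop[i])
    return best, best_val
```

On the 2-D sphere function this contracts rapidly toward the optimum; the published algorithm additionally partitions the population into groups with different R/V update cases.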

Keywords: Cuttlefish Algorithm, bio-inspired algorithms, optimization, global optimization problems

Procedia PDF Downloads 539
1687 Digital Transformation in Fashion System Design: Tools and Opportunities

Authors: Margherita Tufarelli, Leonardo Giliberti, Elena Pucci

Abstract:

The fashion industry's interest in virtuality is linked, on the one hand, to the emotional and immersive possibilities of digital resources and the resulting languages and, on the other, to the greater efficiency that can be achieved throughout the value chain. The interaction between digital innovation and deep-rooted manufacturing traditions today translates into a paradigm shift for the entire fashion industry where, for example, the traditional values of industrial secrecy and know-how give way to experimentation in an open and participatory way, and to the complete emancipation of virtual reality from actual 'reality'. The contribution aims to investigate the theme of digitisation in the Italian fashion industry, analysing its opportunities and the criticalities that have hindered its diffusion. There are two reasons why the most common approach in the fashion sector is still analogue: (i) the fashion product lives in close contact with the human body, so the sensory perception of materials plays a central role in both the use and the design of the product, but current technology is not able to restore the sense of touch; (ii) volumes are obtained by stitching flat surfaces that, once assembled, given the flexibility of the material, can assume almost infinite configurations. Managing the fit and styling of virtual garments involves a wide range of factors, including mechanical simulation, collision detection, and user interface techniques for garment creation. After briefly reviewing some of the salient historical milestones in the resolution of problems related to the digital simulation of deformable materials and the user interfaces for realising the clothing system, the paper describes the operation and possibilities offered today by the latest generation of specialised software: parametric avatars and a digital sartorial approach; drawing tools optimised for pattern making; materials simulated in terms of both physical behaviour and aesthetic performance; tools for checking wearability; renderings; and tools and procedures useful to companies both for dialogue with prototyping software and machinery and for managing the archive and the variants to be made. The article demonstrates how developments in technology and digital procedures now make it possible to intervene in different stages of design in the fashion industry: an integrated and additive process in which the constructed 3D models are usable both in the prototyping and communication of physical products and in possible, exclusively digital uses of 3D models in the new generation of virtual spaces. Mastering such tools requires the acquisition of specific digital skills alongside traditional skills for the design of the clothing system, but the benefits are manifold and applicable to different business dimensions. We are only at the beginning of the global digital transformation: the emergence of new professional figures and design dynamics leaves room for imagination, but in addition to applying digital tools to traditional procedures, traditional fashion know-how needs to be transferred into emerging digital practices to ensure the continuity of the technical-cultural heritage beyond the transformation.

Keywords: digital fashion, digital technology and couture, digital fashion communication, 3D garment simulation

Procedia PDF Downloads 48
1686 Discretization of Cuckoo Optimization Algorithm for Solving Quadratic Assignment Problems

Authors: Elham Kazemi

Abstract:

The Quadratic Assignment Problem (QAP) is one of the most widely researched combinatorial optimization problems; it concerns the allocation of a set of facilities to a set of locations. The issue of particular importance in this process is the cost of the allocation, and the aim is to minimize this group of costs. Since the QAP is NP-hard, it cannot be solved by exact solution methods except for small instances. The Cuckoo Optimization Algorithm is a meta-heuristic method with a high capability to find global optima, but it was originally designed to search a continuous space. The Quadratic Assignment Problem, however, is solved in a discrete space, so the standard arithmetic operators of the Cuckoo Optimization Algorithm need to be redefined on the discrete space in order to apply the algorithm to a discrete search space. This paper presents a way of discretizing the Cuckoo Optimization Algorithm for solving the Quadratic Assignment Problem.
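Two ingredients of such a discretization can be sketched: the QAP objective over permutations, and a random-key decoding that turns a continuous position vector into a permutation. Random keys are one common discretization device for continuous meta-heuristics; whether the paper uses exactly this mapping is not stated in the abstract:

```python
def qap_cost(perm, flow, dist):
    """QAP objective: facility i sits at location perm[i]; cost sums
    flow(i, j) * distance(perm[i], perm[j]) over all facility pairs."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def random_key_decode(keys):
    """Random-key decoding: rank the components of a continuous position
    vector to obtain a permutation, letting a continuous search operate
    on the discrete assignment space."""
    return sorted(range(len(keys)), key=lambda i: keys[i])
```

For instance, the continuous position [0.3, 0.1, 0.9] decodes to the permutation [1, 0, 2], which can then be scored with `qap_cost`.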

Keywords: Quadratic Assignment Problem (QAP), Discrete Cuckoo Optimization Algorithm (DCOA), meta-heuristic algorithms, optimization algorithms

Procedia PDF Downloads 488
1685 Optimizing Network Latency with Fast Path Assignment for Incoming Flows

Authors: Qing Lyu, Hang Zhu

Abstract:

Various flows in a network must pass through different types of middlebox. Improper placement of network middleboxes and improper path assignment for flows can greatly increase network latency and decrease network performance. Minimizing the total end-to-end latency of all the flows requires assigning a path to each incoming flow. In this paper, the flow path assignment problem with regard to the placement of various kinds of middlebox is studied. The flow path assignment problem is first formulated as a linear programming problem, which is very time-consuming to solve. A naive greedy algorithm is also studied, which is very fast but incurs much higher latency than the linear programming solution. Finally, the paper presents a heuristic algorithm named FPA, which takes bottleneck link information and estimated bandwidth occupancy into consideration and achieves near-optimal latency in much less time. Evaluation results validate the effectiveness of the proposed algorithm.
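The flavor of a load-aware path assignment can be sketched with Dijkstra over link latencies inflated by utilization; the M/M/1-style penalty below is an illustrative stand-in for congestion sensitivity, not the published FPA cost model:

```python
import heapq

def shortest_path(graph, src, dst, load, capacity):
    """Dijkstra where each link's base latency is inflated as its estimated
    utilization rises, a crude proxy for bottleneck/occupancy awareness.
    graph: node -> [(neighbor, base_latency)]; assumes dst is reachable."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    visited = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        if u == dst:
            break
        for v, base in graph.get(u, []):
            util = load.get((u, v), 0.0) / capacity.get((u, v), 1.0)
            w = base / max(1e-9, 1.0 - min(util, 0.99))  # M/M/1-style inflation
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dst]
```

Assigning flows one at a time with this routine (updating `load` after each assignment) gives the greedy baseline the paper compares against; FPA's contribution is the smarter bottleneck-aware ordering.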

Keywords: flow path, latency, middlebox, network

Procedia PDF Downloads 183
1684 Core Number Optimization Based Scheduler to Order/Map Simulink Applications

Authors: Asma Rebaya, Imen Amari, Kaouther Gasmi, Salem Hasnaoui

Abstract:

Over the last few years, the number of cores in digital signal and general-purpose processors has increased spectacularly. Concurrently, significant research has been done to benefit from this high degree of parallelism. Indeed, this research focuses on providing efficient scheduling of hardware/software systems onto multicore architectures. The scheduling process consists of statically choosing one core to execute each task and specifying an execution order for the application tasks. In this paper, we describe an efficient scheduler that calculates the optimal number of cores required to schedule an application, gives a heuristic scheduling solution, and evaluates its cost. Our results are evaluated and compared with the Preesm scheduler's results, and we show that ours allows better scheduling in terms of latency, computation time, and number of cores.
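The core idea of such a scheduler can be sketched as greedy list scheduling: walk the task graph in dependency order and place each task on the core that yields the earliest finish time. This is a generic illustration of the technique family, not the paper's exact heuristic or its core-count optimization:

```python
def list_schedule(costs, deps, n_cores):
    """costs: task -> execution time; deps: task -> prerequisite tasks (a DAG).
    Greedily maps each ready task to the core with the earliest start time.
    Returns (makespan, task -> core assignment)."""
    core_free = [0.0] * n_cores
    finish = {}
    assignment = {}
    done = set()
    while len(done) < len(costs):
        ready = [t for t in costs
                 if t not in done and all(p in done for p in deps.get(t, []))]
        ready.sort(key=lambda t: -costs[t])  # simple priority: longest task first
        for t in ready:
            earliest_dep = max((finish[p] for p in deps.get(t, [])), default=0.0)
            c = min(range(n_cores), key=lambda i: max(core_free[i], earliest_dep))
            start = max(core_free[c], earliest_dep)
            finish[t] = start + costs[t]
            core_free[c] = finish[t]
            assignment[t] = c
            done.add(t)
    return max(finish.values()), assignment
```

Sweeping `n_cores` and recording the makespan is one simple way to estimate the smallest core count beyond which latency no longer improves.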

Keywords: computation time, hardware/software system, latency, optimization, multi-cores platform, scheduling

Procedia PDF Downloads 257
1683 A Comprehensive Study on Quality Assurance in Game Development

Authors: Maria Komal, Zaineb Khalil, Mehreen Sirshar

Abstract:

Due to recent technological advancements, games have become one of the most demanding applications. The gaming industry is growing rapidly, and the key to success in this industry is the development of good-quality games, which is a highly competitive issue. The ultimate goal of game developers is to provide player satisfaction by developing high-quality games. This research is a comprehensive survey of the techniques followed by game industries to ensure game quality. After analysis of various techniques, it has been found that quality simulation according to ISO standards and play-test methods are used to ensure game quality. Because game development requires a cross-disciplinary team, an increasing trend towards distributed game development has been observed. This paper evaluates the strengths and weaknesses of current methodologies used in the game industry and draws conclusions. We have also proposed quality parameters that can be used as a heuristic framework to identify the attributes with the highest testing priorities.

Keywords: game development, computer games, video games, gaming industry, quality assurance, playability, user experience

Procedia PDF Downloads 504
1682 Speedup Breadth-First Search by Graph Ordering

Authors: Qiuyi Lyu, Bin Gong

Abstract:

Breadth-First Search (BFS) is a core graph algorithm that is widely used for graph analysis. As it is frequently used in many graph applications, improving BFS performance is essential. In this paper, we present a graph ordering method that reorders the graph nodes to achieve better data locality and thus improve BFS performance. Our method is based on the observation that sibling relationships dominate the cache access pattern during BFS traversal. Therefore, we propose a frequency-based model to construct the graph order. First, we optimize the graph order according to the nodes' visit frequency: nodes with high visit frequency are processed with priority. Second, we try to maximize the overlap of child nodes layer by layer. As this is proved to be NP-hard, we propose a heuristic method that greatly reduces the preprocessing overhead. We conduct extensive experiments on 16 real-world datasets. The results show that our method achieves performance comparable to the state-of-the-art methods while its graph ordering overhead is only about 1/15 of theirs.
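A toy version of frequency-driven ordering uses in-degree as a proxy for visit frequency and relabels nodes so that hot nodes receive small, cache-adjacent ids; this is a simplified stand-in for the paper's frequency model and overlap maximization:

```python
from collections import deque

def degree_order(adj):
    """Relabel nodes so that frequently visited (high in-degree) nodes get the
    smallest ids, packing hot data together for better cache locality."""
    indeg = {u: 0 for u in adj}
    for u, nbrs in adj.items():
        for v in nbrs:
            indeg[v] = indeg.get(v, 0) + 1
    order = sorted(indeg, key=lambda u: -indeg[u])
    return {u: new_id for new_id, u in enumerate(order)}

def bfs(adj, src):
    """Standard BFS returning the visit order (the traversal being optimized)."""
    seen = {src}
    q = deque([src])
    visit_order = []
    while q:
        u = q.popleft()
        visit_order.append(u)
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                q.append(v)
    return visit_order
```

Applying the mapping from `degree_order` to relabel the adjacency arrays before running `bfs` is the preprocessing step whose overhead the paper's heuristic keeps small.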

Keywords: breadth-first search, BFS, graph ordering, graph algorithm

Procedia PDF Downloads 112
1681 An Analytical Approach of Computational Complexity for the Method of Multifluid Modelling

Authors: A. K. Borah, A. K. Singh

Abstract:

In this paper, we deal with the building blocks of the computer simulation of multiphase flows. The whole simulation procedure can be viewed as two super-procedures: the implementation of the VOF method and the solution of the Navier-Stokes equations. Moreover, a sequential code for a Navier-Stokes solver has been studied.
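Of the methods named in the keywords, Bi-CGSTAB is the concrete linear-algebra kernel of such a solver. A textbook, unpreconditioned sketch on dense nested-list matrices (the paper's solver adds an ILUT preconditioner, omitted here):

```python
def bicgstab(A, b, tol=1e-10, max_iter=200):
    """Unpreconditioned Bi-CGSTAB (van der Vorst) for a dense system Ax = b."""
    n = len(b)
    matvec = lambda M, x_: [sum(M[i][j] * x_[j] for j in range(n)) for i in range(n)]
    dot = lambda u, w: sum(ui * wi for ui, wi in zip(u, w))
    x = [0.0] * n
    r = [bi - Axi for bi, Axi in zip(b, matvec(A, x))]
    rhat = r[:]                      # shadow residual
    rho = alpha = omega = 1.0
    v = [0.0] * n
    p = [0.0] * n
    for _ in range(max_iter):
        rho_new = dot(rhat, r)
        beta = (rho_new / rho) * (alpha / omega)
        p = [ri + beta * (pi - omega * vi) for ri, pi, vi in zip(r, p, v)]
        v = matvec(A, p)
        alpha = rho_new / dot(rhat, v)
        s = [ri - alpha * vi for ri, vi in zip(r, v)]
        if dot(s, s) ** 0.5 < tol:   # early exit: half-step already converged
            x = [xi + alpha * pi for xi, pi in zip(x, p)]
            break
        t = matvec(A, s)
        omega = dot(t, s) / dot(t, t)
        x = [xi + alpha * pi + omega * si for xi, pi, si in zip(x, p, s)]
        r = [si - omega * ti for si, ti in zip(s, t)]
        rho = rho_new
        if dot(r, r) ** 0.5 < tol:
            break
    return x
```

In a full pressure-correction (SIMPLE-type) loop, this solver would be called on the sparse pressure and momentum systems at every time step, which is why preconditioning matters in practice.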

Keywords: Bi-conjugate gradient stabilized (Bi-CGSTAB), ILUT function, Krylov subspace, multifluid flows, preconditioner, SIMPLE algorithm

Procedia PDF Downloads 504
1680 Preference Aggregation and Mechanism Design in the Smart Grid

Authors: Zaid Jamal Saeed Almahmoud

Abstract:

The Smart Grid is the vision of a future power system that combines advanced monitoring and communication technologies to provide energy in a smart, efficient, and user-friendly manner. This proposal considers a demand response model in the Smart Grid based on utility maximization. Given a set of consumers with conflicting preferences in terms of consumption, and a utility company that aims to minimize the peak demand and match demand to supply, we study the problem of aggregating these preferences while modelling the problem as a game. We also investigate whether an equilibrium can be reached that maximizes the social benefit. Based on such an equilibrium, we propose a dynamic pricing heuristic that computes the equilibrium and sets the prices accordingly. The developed approach was analysed theoretically and evaluated experimentally using real appliance data. The results show that our proposed approach achieves a substantial reduction in overall energy consumption.
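The abstract does not specify the pricing heuristic, but a classic tatonnement-style adjustment illustrates how a dynamic price can be driven toward the demand/supply equilibrium the model seeks:

```python
def dynamic_price(demand_fn, supply, price=1.0, step=0.05, iters=200, tol=1e-3):
    """Tatonnement sketch: raise the price while aggregate demand exceeds
    supply, lower it otherwise, stopping near the market-clearing price.
    demand_fn maps a price to the aggregated consumer demand."""
    for _ in range(iters):
        excess = demand_fn(price) - supply
        if abs(excess) < tol:
            break
        price = max(0.0, price + step * excess)
    return price
```

With a linear aggregate demand of 10 - p and a supply of 4, the iteration settles near the clearing price of 6; in the game-theoretic model, `demand_fn` would be the aggregation of the consumers' best responses.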

Keywords: heuristics, smart grid, aggregation, mechanism design, equilibrium

Procedia PDF Downloads 84
1679 The Application of Bayesian Heuristic for Scheduling in Real-Time Private Clouds

Authors: Sahar Sohrabi

Abstract:

The emergence of Cloud data centers has revolutionized the IT industry. Private Clouds, in particular, provide Cloud services for a certain group of customers or businesses. In a real-time private Cloud, each task given to the system has a deadline that desirably should not be violated. Scheduling tasks in a real-time private Cloud determines how the available resources in the system are shared among incoming tasks. The aim of the scheduling policy is to optimize the system outcome, which for a real-time private Cloud can include energy consumption, deadline violations, execution time, and the number of host switches. Different scheduling policies can be used, each leading to a sub-optimal outcome in certain settings of the system. A Bayesian scheduling strategy is proposed to further improve the system outcome. The Bayesian strategy was shown to outperform all the selected policies. It also has the flexibility to deal with complex patterns of incoming tasks and the ability to adapt.
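One hedged way to realize a Bayesian choice among competing scheduling policies is Thompson sampling over each policy's observed record (e.g. deadlines met vs. violated); this illustrates the strategy family, not the paper's exact model:

```python
import random

def thompson_select(stats, rng):
    """stats: policy -> (successes, failures), e.g. deadlines met vs. violated.
    Sample each policy's Beta posterior and dispatch the next task batch to
    the policy with the highest draw, balancing exploration and exploitation."""
    samples = {p: rng.betavariate(1 + s, 1 + f) for p, (s, f) in stats.items()}
    return max(samples, key=samples.get)
```

A policy with a strong record is chosen almost always, but a rarely tried policy still gets occasional draws, which is what lets the strategy adapt to shifting task patterns.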

Keywords: cloud computing, scheduling, real-time private cloud, bayesian

Procedia PDF Downloads 336
1678 Implementation and Design of Fuzzy Controller for High Performance DC-DC Boost Converters

Authors: A. Mansouri, F. Krim

Abstract:

This paper discusses the design and implementation of both linear PI and fuzzy controllers for DC-DC boost converters. Design of the PI controller is based on the temporal response of the closed-loop converter, while the fuzzy controller design is based on heuristic knowledge of boost converters. Linear controller implementation is quite straightforward, relying on mathematical models, while fuzzy controller implementation employs one or more artificial intelligence techniques. The two boost controllers are compared in terms of design. Experimental results show that the proposed fuzzy controller is robust against input voltage and load resistance changes and with respect to start-up transients. The results indicate that the fuzzy controller achieves the best control performance concerning faster transient response, good steady-state response, stability, and accuracy under different operating conditions, making it more suitable for controlling boost converters.

Keywords: boost DC-DC converter, fuzzy, PI controllers, power electronics and control system

Procedia PDF Downloads 445
1677 Use of Triclosan-Coated Sutures Led to Cost Saving in Public and Private Setting in India across Five Surgical Categories: An Economic Model Assessment

Authors: Anish Desai, Reshmi Pillai, Nilesh Mahajan, Hitesh Chopra, Vishal Mahajan, Ajay Grover, Ashish Kohli

Abstract:

Surgical Site Infection (SSI) is a hospital-acquired infection of growing concern. This study presents the efficacy and cost-effectiveness of triclosan-coated sutures in reducing the burden of SSI in India. Methodology: A systematic literature search was conducted for the economic burden of SSI (1998-2018) and the efficacy of triclosan-coated sutures (TCS) vs. non-coated sutures (NCS) (2000-2018). PubMed Medline and EMBASE indexed articles were searched using MeSH terms or Emtree. Decision tree analysis was used to calculate the cost difference between TCS and NCS at private and public hospitals, respectively, for seven surgical procedures. Results: The SSI ranges (low to high) for Caesarean section (C-section), laparoscopic hysterectomy (L-hysterectomy), open hernia (O-Hernia), laparoscopic cholecystectomy (L-Cholecystectomy), coronary artery bypass graft (CABG), total knee replacement (TKR), and mastectomy were 3.77 to 24.2%, 2.28 to 11.7%, 1.75 to 60%, 1.71 to 25.58%, 1.6 to 18.86%, 1.74 to 12.5%, and 5.56 to 25%, respectively. The incremental cost of TCS ranged from 0.1% to 0.01% at private hospitals and from 0.9% to 0.09% at public hospitals across all surgical procedures. Cost savings at median efficacy and SSI risk were 6.52%, 5.07%, 11.39%, 9.63%, 3.62%, 2.71%, and 9.41% for C-section, L-hysterectomy, O-Hernia, L-Cholecystectomy, CABG, TKR, and mastectomy in private hospitals and 8.79%, 4.99%, 12.67%, 10.58%, 3.32%, 2.35%, and 11.83% in public hospitals, respectively. One-way sensitivity analysis showed that TCS efficacy and SSI incidence in a given surgical procedure were the important determinants of cost savings. Conclusion: TCS led to cost savings across all seven surgeries in both private and public hospitals in India.
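Per surgical category, the decision-tree comparison reduces to an expected-cost calculation; a minimal sketch with purely hypothetical figures (the costs, SSI rate, and relative risk below are illustrative, not the study's Indian data):

```python
def expected_cost(suture_cost, ssi_rate, ssi_treatment_cost):
    # Expected cost per patient = suture cost + P(SSI) * cost of treating an SSI
    return suture_cost + ssi_rate * ssi_treatment_cost

def savings_with_tcs(ncs_cost, tcs_cost, ssi_rate, relative_risk, ssi_treatment_cost):
    # Relative risk < 1 means TCS reduces the SSI incidence
    cost_ncs = expected_cost(ncs_cost, ssi_rate, ssi_treatment_cost)
    cost_tcs = expected_cost(tcs_cost, ssi_rate * relative_risk, ssi_treatment_cost)
    return (cost_ncs - cost_tcs) / cost_ncs * 100  # percent saving vs. NCS

# Hypothetical inputs: TCS costs slightly more but cuts SSI risk by ~28%
pct = savings_with_tcs(ncs_cost=1000, tcs_cost=1050, ssi_rate=0.10,
                       relative_risk=0.72, ssi_treatment_cost=50000)
```

This makes the abstract's sensitivity finding concrete: the saving scales with both the baseline SSI incidence and the TCS efficacy, while the incremental suture cost only subtracts a small fixed amount.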

Keywords: cost Savings, non-coated sutures, surgical site infection, triclosan-coated sutures

Procedia PDF Downloads 373
1676 X̄ and S Control Charts Based on Weighted Standard Deviation Method

Authors: Derya Karagöz

Abstract:

A Shewhart chart based on the normality assumption is not appropriate for skewed distributions, since its Type-I error rate is inflated. This study presents X̄ and S control charts for monitoring process variability under skewed distributions. We propose Weighted Standard Deviation (WSD) X̄ and S control charts. In these charts, the standard deviation estimator is used to estimate the process standard deviation, as it is simple and easy to compute. Unlike the Shewhart control chart, the proposed charts provide asymmetric upper and lower limits in accordance with the direction and degree of skewness. The performances of the proposed charts are compared with other heuristic charts for skewed distributions by means of a simulation study. The simulations show that the proposed control charts have good properties for skewed distributions and large sample sizes.
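For the X̄ chart, one common weighted-standard-deviation construction (after Chang and Bai; the paper's exact limits may differ in detail) skews the 3-sigma arms by the weight P̂ = P(X ≤ X̄), so that a right-skewed process gets a wider upper arm:

```python
import numpy as np

def wsd_xbar_limits(data, n):
    """Asymmetric X-bar limits via the weighted standard deviation method.
    data: pooled process observations; n: subgroup size."""
    mu = data.mean()
    sigma = data.std(ddof=1)
    p = (data <= mu).mean()          # skewness weight: fraction below the mean
    ucl = mu + 3 * sigma * 2 * p / np.sqrt(n)        # upper arm scaled by 2*p
    lcl = mu - 3 * sigma * 2 * (1 - p) / np.sqrt(n)  # lower arm by 2*(1-p)
    return lcl, mu, ucl

rng = np.random.default_rng(1)
skewed = rng.exponential(scale=2.0, size=5000)   # right-skewed process data
lcl, center, ucl = wsd_xbar_limits(skewed, n=5)
```

For symmetric data p ≈ 0.5 and the limits collapse back to the usual Shewhart 3-sigma limits, which is why the WSD charts generalize rather than replace them.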

Keywords: weighted standard deviation, MAD, skewed distributions, S control charts

Procedia PDF Downloads 373
1675 A Case-Study Analysis on the Necessity of Testing for Cyber Risk Mitigation on Maritime Transport

Authors: Polychronis Kapalidis

Abstract:

In recent years, researchers have turned their attention to cyber security and maritime security independently, neglecting, in most cases, to examine the areas where these two critical issues are intertwined. The impact of cybersecurity issues on the maritime economy is emerging dramatically. Maritime transport and all related activities are conducted by technology-intensive platforms, which today rely heavily on information systems. The paper's argument is that, since no defense is completely effective against cyber attacks, it is vital to test responses to the inevitable incursions. Hence, preparedness, in the form of testing the existing cybersecurity structure with different tools for potential attacks, is vital for minimizing risks. Traditional criminal activities may further be facilitated and evolved through the misuse of cyberspace. Kidnapping, piracy, fraud, theft of cargo, and the imposition of ransomware are the major such activities, mainly targeting the industry's most valuable asset: the ship. Adopting a case-study analysis based on stakeholder consultation and secondary data analysis, namely policy and strategy-related documentation, the paper presents the importance of holistic testing in the sector. Arguing that poor understanding of the issue leads to the adoption of ineffective policies, the paper presents the level of awareness within the industry and assesses the risks and vulnerabilities of ships to these cybercriminal activities. It concludes by suggesting that testing procedures must focus on three main pillars within the maritime transport sector: the human factor, the infrastructure, and the procedures.

Keywords: cybercrime, cybersecurity, organized crime, risk mitigation

Procedia PDF Downloads 134
1674 Distribution Planning with Renewable Energy Units Based on Improved Honey Bee Mating Optimization

Authors: Noradin Ghadimi, Nima Amjady, Oveis Abedinia, Roza Poursoleiman

Abstract:

This paper proposes an Improved Honey Bee Mating Optimization (IHBMO) algorithm for a network-upgrade planning paradigm. The proposed technique is a new meta-heuristic inspired by the mating behavior of honey bees. The paradigm is able to select, among several equi-cost choices, the one assuring the optimum in terms of voltage profile, considering various scenarios of DG penetration and load demand. Distributed generation (DG) has created both a challenge and an opportunity for developing various novel technologies in power generation. DG provides a multitude of services to utilities and consumers, including standby generation, peak-shaving capability, and base-load generation. The proposed algorithm is applied to a 28-bus, 30-line power system. The achieved results demonstrate the good efficiency of DG planning using the proposed technique in different scenarios.

Keywords: distributed generation, IHBMO, renewable energy units, network upgrade

Procedia PDF Downloads 466
1673 Air Cargo Overbooking Model under Stochastic Weight and Volume Cancellation

Authors: Naragain Phumchusri, Krisada Roekdethawesab, Manoj Lohatepanont

Abstract:

Overbooking is the practice of selling more goods or services than the available capacity because sellers anticipate that some buyers will not show up or may cancel their bookings. At present, many airlines deploy an overbooking strategy to deal with the uncertainty of their customers. In particular, some airlines sell more cargo capacity than they have available to freight forwarders, in the belief that some of them will cancel later. In this paper, we propose methods to find the optimal overbooking levels of volume and weight for air cargo in order to minimize the total cost, comprising the cost of spoilage and the cost of offloading. Cancellations of volume and weight are jointly random variables with a known joint distribution. Heuristic approaches that treat weight and volume as independent are considered to find an appropriate answer to the full problem. Computational experiments are used to explore the performance of the approaches presented in this paper, compared to a naïve method under different scenarios.
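The independence heuristic can be sketched one dimension at a time: for each candidate booking level, estimate the expected spoilage-plus-offload cost by Monte Carlo and grid-search the minimizer (the cancellation distribution, cost rates, and capacity below are hypothetical, not the paper's data):

```python
import numpy as np

def expected_cost(booked, capacity, cancel_samples, c_spoil, c_offload):
    # Monte Carlo expected cost of one candidate overbooking level (one dimension)
    shows = booked - cancel_samples * booked          # cancellations as a fraction
    spoil = np.maximum(capacity - shows, 0.0)         # unused (spoiled) capacity
    offload = np.maximum(shows - capacity, 0.0)       # bookings that must be offloaded
    return (c_spoil * spoil + c_offload * offload).mean()

def best_level(capacity, cancel_samples, c_spoil, c_offload, grid):
    costs = [expected_cost(b, capacity, cancel_samples, c_spoil, c_offload)
             for b in grid]
    return grid[int(np.argmin(costs))]

rng = np.random.default_rng(7)
cancels = rng.beta(2, 8, size=20000)   # hypothetical cancellation rates, mean ~0.2
grid = np.arange(100, 151)
b_star = best_level(capacity=100.0, cancel_samples=cancels,
                    c_spoil=1.0, c_offload=5.0, grid=grid)
```

Because offloading here costs five times more than spoilage, the optimal level sits well below the naive 1/(1 - mean cancellation rate) rule of thumb, which is exactly the trade-off the full two-dimensional model has to balance.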

Keywords: air cargo overbooking, offloading capacity, optimal overbooking level, revenue management, spoilage capacity

Procedia PDF Downloads 303
1672 A Comparison of Sequential Quadratic Programming, Genetic Algorithm, Simulated Annealing, and Particle Swarm Optimization for the Design and Optimization of a Beam Column

Authors: Nima Khosravi

Abstract:

This paper describes an integrated optimization study with concurrent use of sequential quadratic programming (SQP), a genetic algorithm, simulated annealing, and particle swarm optimization for the design and optimization of a beam column. The four optimization methods are compared, and it is found that all of them meet the required constraints; the lowest value of the objective function is achieved by SQP, which is also the fastest optimizer. SQP is a gradient-based optimizer, hence its results are usually the same after every run; the only thing that affects them is the initial conditions given. The initial conditions supplied in the various test runs differed widely, and hence the solutions converged to different points. The remaining methods are heuristics, which produce different values on different runs even when every parameter is kept constant.
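A minimal SQP run on a toy beam-column sizing problem illustrates the gradient-based, deterministic behaviour described above (the section formula is the standard rectangular section modulus b*h²/6; the moment, stress limit, and bounds are hypothetical, not the paper's problem):

```python
import numpy as np
from scipy.optimize import minimize

# Toy beam-column sizing: minimize cross-section area b*h subject to
# a bending-stress limit and simple geometric bounds (illustrative numbers).
M, sigma_allow = 2.0e4, 1.2e7   # bending moment [N*m], allowable stress [Pa]

def area(x):
    b, h = x
    return b * h

def stress_margin(x):
    # Nonnegative when bending stress 6M/(b*h^2) stays within the allowable
    b, h = x
    return sigma_allow - 6 * M / (b * h ** 2)

res = minimize(area, x0=[0.2, 0.4], method="SLSQP",
               bounds=[(0.05, 0.5), (0.05, 0.8)],
               constraints=[{"type": "ineq", "fun": stress_margin}])
```

Restarting from the same x0 reproduces the same point every time; the stochastic methods in the comparison (GA, SA, PSO) would scatter around it run to run.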

Keywords: beam column, genetic algorithm, particle swarm optimization, sequential quadratic programming, simulated annealing

Procedia PDF Downloads 365
1671 Joint Optimization of Carsharing Stations with Vehicle Relocation and Demand Selection

Authors: Jiayuan Wu, Lu Hu

Abstract:

With the development of the sharing economy and mobile technology, carsharing has become more popular. In this paper, we focus on the joint optimization of one-way station-based carsharing systems. We model the problem as an integer linear program with six elements: station locations, station capacities, fleet size, initial vehicle allocation, vehicle relocation, and demand selection. A greedy-based heuristic is proposed to address the model. First, an initialization based on relaxing the location variables is carried out with the Gurobi solver. Then, according to the profit margin and demand satisfaction of each station, the number of stations is downsized iteratively. The method is applied to real taxi data from Chengdu, Sichuan, and proves efficient when dealing with a large number of candidate stations. The results show that, with vehicle relocation and demand selection, both the profit and the demand satisfaction of carsharing systems increase.
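The iterative downsizing step can be sketched as a greedy loop that drops loss-making stations while a demand-satisfaction floor holds (a toy sketch of the heuristic's shape; the station names, margins, and demands are hypothetical):

```python
def greedy_downsize(stations, min_satisfaction):
    """Iteratively drop the station with the worst profit margin, as long as
    the retained set still serves enough of the total demand.
    stations: name -> (profit_margin, demand_served)."""
    active = dict(stations)
    total_demand = sum(d for _, d in stations.values())
    while len(active) > 1:
        served = sum(d for _, d in active.values())
        worst = min(active, key=lambda s: active[s][0])
        margin, demand = active[worst]
        # Remove only if the station loses money and satisfaction stays above floor
        if margin < 0 and (served - demand) / total_demand >= min_satisfaction:
            del active[worst]
        else:
            break
    return set(active)

stations = {"A": (-0.3, 10), "B": (0.5, 40), "C": (-0.1, 5), "D": (0.8, 45)}
kept = greedy_downsize(stations, min_satisfaction=0.8)
```

In the full heuristic the margins and satisfaction figures would be re-evaluated by re-solving the relaxed program after each removal, rather than held fixed as here.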

Keywords: one-way carsharing, location, vehicle relocation, demand selection, greedy algorithm

Procedia PDF Downloads 110
1670 Enunciation on Complexities of Selected Tree Searching Algorithms

Authors: Parag Bhalchandra, S. D. Khamitkar

Abstract:

Searching trees is a most interesting application of Artificial Intelligence. Over time, many innovative methods have evolved to search trees more efficiently with respect to computational complexity. Tree searches are difficult to understand because of the exponential growth of possibilities as the number of nodes or levels in the tree increases. Search is usually understood as traversing down the tree, to ever greater depth, in search of a solution or goal. In reality, however, explicit enumeration is not a very efficient method, and there are many algorithmic speedups that find the optimal solution without the burden of evaluating all possible trees. A common question among researchers is which algorithm will yield the best and fastest result. The intention of this paper is twofold: first, to review selected tree search algorithms and search strategies that can be applied to a problem space; and second, to survey recent developments in the complexity behavior of search strategies. The algorithms discussed here apply in general to both brute-force and heuristic searches.
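The exponential growth the abstract refers to is easy to demonstrate on an implicit uniform tree: depth-first search reaches the leftmost leaf after only d+1 expansions, while breadth-first search must first expand every shallower node, on the order of b^d (a small self-contained demo, not code from the paper):

```python
from collections import deque

def children(node, b, d):
    # Implicit uniform tree: a node is its path from the root as a tuple;
    # branching factor b, maximum depth d (leaves have no children)
    if len(node) >= d:
        return []
    return [node + (i,) for i in range(b)]

def bfs(goal, b, d):
    frontier, expanded = deque([()]), 0   # FIFO frontier: shallowest first
    while frontier:
        node = frontier.popleft()
        expanded += 1
        if node == goal:
            return expanded
        frontier.extend(children(node, b, d))

def dfs(goal, b, d):
    frontier, expanded = [()], 0          # LIFO frontier: deepest first
    while frontier:
        node = frontier.pop()
        expanded += 1
        if node == goal:
            return expanded
        frontier.extend(reversed(children(node, b, d)))

b, d = 3, 8
goal = (0,) * d                           # leftmost leaf at depth d
n_bfs, n_dfs = bfs(goal, b, d), dfs(goal, b, d)
total = sum(b ** k for k in range(d + 1))  # (b^(d+1)-1)/(b-1) nodes in the tree
```

Both expand O(b^d) nodes in the worst case; the asymmetry here only shows how strongly the node count depends on where the goal sits relative to the traversal order.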

Keywords: tree search, asymptotic complexity, brute force, heuristic algorithms

Procedia PDF Downloads 287
1669 Heat Transfer and Diffusion Modelling

Authors: R. Whalley

Abstract:

The heat transfer modelling for a diffusion process will be considered. Difficulties in computing the time-distance dynamics of the representation will be addressed. Incomplete and irrational Laplace functions will be identified as the computational issue. Alternative approaches to the response evaluation process will be provided. An illustrative application problem will be presented. Graphical results confirming the theoretical procedures employed will be provided.
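One standard time-domain alternative to inverting irrational Laplace transforms is direct numerical integration; a minimal sketch using the explicit FTCS finite-difference scheme for the 1-D diffusion equation (the grid sizes, diffusivity, and initial profile are illustrative, not taken from the paper):

```python
import numpy as np

# Explicit (FTCS) finite-difference scheme for u_t = alpha * u_xx on [0, 1]
alpha, nx, nt = 1.0, 51, 2000
dx = 1.0 / (nx - 1)
dt = 0.4 * dx ** 2 / alpha          # r = alpha*dt/dx^2 = 0.4 <= 0.5 for stability
r = alpha * dt / dx ** 2

x = np.linspace(0.0, 1.0, nx)
u = np.sin(np.pi * x)               # initial temperature profile, ends held at 0
for _ in range(nt):
    u[1:-1] = u[1:-1] + r * (u[2:] - 2 * u[1:-1] + u[:-2])
    u[0] = u[-1] = 0.0              # Dirichlet boundary conditions

# For this single mode the exact solution decays as exp(-pi^2 * alpha * t)
t_final = nt * dt
exact = np.sin(np.pi * x) * np.exp(-np.pi ** 2 * alpha * t_final)
```

For this smooth single-mode problem the scheme tracks the analytical decay closely; the stability bound r ≤ 1/2 is the price paid for avoiding the Laplace-domain machinery.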

Keywords: heat, transfer, diffusion, modelling, computation

Procedia PDF Downloads 531
1668 Qualitative Measurement of Literacy

Authors: Indrajit Ghosh, Jaydip Roy

Abstract:

The literacy rate is an important indicator in the measurement of human development, but it is not a good one for capturing the qualitative dimension of the educational attainment of an individual or a society. The overall educational level of an area is an important issue beyond the literacy rate. It can be thought of as an outcome of the educational levels of individuals, yet there is no well-defined algorithm or mathematical model available to measure it. A heuristic approach based on the accumulated experience of experts is an effective one. It is evident that fuzzy logic offers a natural and convenient framework for modelling various concepts in the social science domain. This work suggests the implementation of fuzzy logic to develop a mathematical model for measuring the educational attainment of an area in terms of an Education Index. The contribution of the study is twofold: conceptualization of the "Education Profile" and a new mathematical model to measure educational attainment in terms of an "Education Index".
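A minimal sketch of how such a fuzzy Education Index might be computed from an education profile (the attainment levels, weights, membership functions, and sample profiles are all hypothetical illustrations, not the authors' model):

```python
def tri(x, a, b, c):
    # Triangular membership function supported on [a, c], peaking at b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def education_index(profile):
    """Sugeno-style fuzzy index from an 'education profile': population shares
    at each attainment level (hypothetical levels and scores)."""
    # Crisp composite attainment score in [0, 1]: weight levels by rank
    weights = {"illiterate": 0.0, "primary": 0.35, "secondary": 0.7, "tertiary": 1.0}
    score = sum(profile[k] * w for k, w in weights.items())
    # Fuzzy linguistic grades over the score, each with a crisp consequent
    rules = [(tri(score, -0.4, 0.0, 0.4), 0.2),   # "low"    -> index 0.2
             (tri(score, 0.1, 0.5, 0.9), 0.5),    # "medium" -> index 0.5
             (tri(score, 0.6, 1.0, 1.4), 0.9)]    # "high"   -> index 0.9
    # Weighted-average defuzzification of the fired rules
    num = sum(mu * out for mu, out in rules)
    den = sum(mu for mu, _ in rules)
    return num / den

idx_rural = education_index({"illiterate": 0.4, "primary": 0.4,
                             "secondary": 0.15, "tertiary": 0.05})
idx_urban = education_index({"illiterate": 0.1, "primary": 0.3,
                             "secondary": 0.35, "tertiary": 0.25})
```

Unlike the raw literacy rate, the index responds to the full attainment distribution: two areas with equal literacy but different shares of secondary and tertiary education receive different scores.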

Keywords: education index, education profile, fuzzy logic, literacy

Procedia PDF Downloads 296