Search results for: fractional programming
440 Application of Transportation Linear Programming Algorithms to Cost Reduction in Nigeria Soft Drinks Industry
Authors: Salami Akeem Olanrewaju
Abstract:
The transportation model or problem is primarily concerned with the optimal (best possible) way in which a product produced at different factories or plants (called supply origins) can be transported to a number of warehouses or customers (called demand destinations). The objective in a transportation problem is to fully satisfy the destination requirements within the operating production capacity constraints at the minimum possible cost. The objective of this study is to determine ways of minimizing transportation cost in order to maximize profit. Data were gathered from the records of the Distribution Department of 7-Up Bottling Company Plc. Ilorin, Kwara State, Nigeria. The data were analyzed using SPSS (Statistical Package for the Social Sciences) while applying the three methods of solving a transportation problem. The three methods produced the same results; therefore, any of the methods can be adopted by the company in transporting its final products to the wholesale dealers in order to minimize total production cost.
Keywords: cost minimization, resources utilization, distribution system, allocation problem
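As an illustration of the kind of model the study solves, the sketch below sets up a small balanced transportation problem and minimizes total shipping cost with SciPy's linear-programming solver; the plants, dealers, supplies, demands, and unit costs are hypothetical, and the solver stands in for the manual methods (North-West Corner, Least Cost, Vogel's Approximation) compared in the paper.

```python
# Minimal sketch of a balanced transportation problem solved as a linear program.
# All numbers are hypothetical, not the company's actual data.
import numpy as np
from scipy.optimize import linprog

supply = np.array([300, 400, 500])          # crates available at each plant
demand = np.array([250, 350, 400, 200])     # crates required by each dealer
cost = np.array([[4, 6, 8, 5],              # unit transport cost plant -> dealer
                 [5, 4, 7, 6],
                 [6, 5, 3, 4]])

m, n = cost.shape
c = cost.flatten()                          # decision variables x[i, j], flattened row-wise

# Supply constraints: sum_j x[i, j] == supply[i]
A_supply = np.zeros((m, m * n))
for i in range(m):
    A_supply[i, i * n:(i + 1) * n] = 1
# Demand constraints: sum_i x[i, j] == demand[j]
A_demand = np.zeros((n, m * n))
for j in range(n):
    A_demand[j, j::n] = 1

res = linprog(c,
              A_eq=np.vstack([A_supply, A_demand]),
              b_eq=np.concatenate([supply, demand]),
              bounds=[(0, None)] * (m * n),
              method="highs")
print("minimum transport cost:", res.fun)
print("shipping plan:\n", res.x.reshape(m, n))
```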
Procedia PDF Downloads 258
439 Model of Production and Marketing Strategies in Alignment with Business Strategy using QFD Approach
Authors: Hamed Saremi, Suzan Taghavy, Shahla Saremi
Abstract:
In today's competitive world, organizations are expected to surpass their competitors and make the best use of their resources. Therefore, the need for organizations to improve their current performance is felt more than ever, and this requires identifying the optimal organizational strategies and considering all strategies simultaneously. In this study, to enhance competitive advantage and meet customer requirements, alignment between business, production, and marketing strategies is pursued using the House of Quality (QFD) approach, and a zero-one linear programming model is studied. First, to align production and marketing strategies with the business strategy, the independent weights of these strategies are calculated. Then, using the QFD approach, the aligned weights of the optimal strategies in each production and marketing field are obtained, and finally the aligned marketing strategies are selected for the purpose of allocating budget and specialist human resources to marketing functions, leading to increased competitive advantage and benefit.
Keywords: strategy alignment, house of quality deployment, production strategy, marketing strategy, business strategy
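To make the zero-one selection step concrete, here is a minimal sketch in which strategies are chosen to maximize their QFD-aligned weights under a budget and a specialist-staff limit; the weights, costs, limits, and the use of the PuLP solver are all assumptions for illustration, not the paper's actual model data.

```python
# Hedged sketch of a zero-one strategy-selection model in the spirit of the study.
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary

strategies = ["S1", "S2", "S3", "S4", "S5"]
aligned_weight = {"S1": 0.32, "S2": 0.21, "S3": 0.18, "S4": 0.17, "S5": 0.12}  # hypothetical
budget_cost   = {"S1": 40, "S2": 25, "S3": 20, "S4": 30, "S5": 10}             # monetary units
staff_needed  = {"S1": 3,  "S2": 2,  "S3": 1,  "S4": 2,  "S5": 1}              # specialists

x = {s: LpVariable(f"x_{s}", cat=LpBinary) for s in strategies}

model = LpProblem("aligned_strategy_selection", LpMaximize)
model += lpSum(aligned_weight[s] * x[s] for s in strategies)        # total aligned weight
model += lpSum(budget_cost[s] * x[s] for s in strategies) <= 70     # budget limit
model += lpSum(staff_needed[s] * x[s] for s in strategies) <= 5     # specialist limit
model.solve()

chosen = [s for s in strategies if x[s].value() == 1]
print("selected strategies:", chosen)
```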
Procedia PDF Downloads 435
438 Petrogenesis of the Neoproterozoic Rocks of Megele Area, Asosa, Western Ethiopia
Authors: Temesgen Oljira, Olugbenga Akindeji Okunlola, Akinade Shadrach Olatunji, Dereje Ayalew, Bekele Ayele Bedada
Abstract:
The Western Ethiopian Shield (WES) is underlain by volcano-sedimentary terranes, gneissic terranes, and ophiolitic rocks intruded by different granitoid bodies. For the past few years, Neoproterozoic rocks of the Megele area in the western part of the WES have been explored. Understanding the geology of the area and assessing the mineralized area's economic potential requires petrological, geochemical, and geological characterization of the Neoproterozoic granitoids and associated metavolcanic rocks. Thus, the geological, geochemical, and petrogenetic features of the Neoproterozoic granitoids and associated metavolcanic rocks were elucidated using a combination of field mapping, petrological study, and geochemical study. The Megele area is part of a low-grade volcano-sedimentary zone that has been intruded by mafic (dolerite dyke) and granitoid intrusions (granodiorite, diorite, granite gneiss). The granodiorite, associated diorite, and granite gneiss are calc-alkaline, peraluminous to slightly metaluminous, S-type granitoids formed in a volcanic arc (VAG) to syn-collisional (syn-COLG) tectonic setting by fractionation of LREE-enriched, HREE-depleted basaltic magma with considerable crustal input. The metabasalt, in contrast, comprises sub-alkaline (tholeiitic), metaluminous bodies generated in a mid-oceanic ridge tectonic setting by partial melting of HREE-depleted, LREE-enriched basaltic magma. The reworking of sediment-loaded crustal blocks at depth in a subduction zone resulted in the production of the S-type granitoids. This basaltic magma was supplied from an LREE-enriched, HREE-depleted mantle.
Keywords: fractional crystallization, geochemistry, Megele, petrogenesis, S-type granite
Procedia PDF Downloads 131
437 Purification of Zr from Zr-Hf Resources Using Crystallization in HF-HCl Solvent Mixture
Authors: Kenichi Hirota, Jifeng Wang, Sadao Araki, Koji Endo, Hideki Yamamoto
Abstract:
Zirconium (Zr) has been used as a fuel cladding tube for nuclear reactors because of its excellent corrosion resistance and low neutron absorption. Generally speaking, natural Zr resources often contain Hf, which has similar properties; the Hf content of Zr resources is about 2-4 wt%. For industrial use, the Hf content of Zr resources should be lower than 100 ppm. However, the separation of Zr and Hf is not easy because of their similar chemical and physical properties, such as melting point and boiling point. The solvent extraction method has been applied for the separation of Zr and Hf from natural Zr resources. This method can separate Hf with high efficiency (Hf < 100 ppm); however, it requires a large amount of organic solvent, and the cost of its disposal treatment is high. Therefore, we turned our attention to fractional crystallization. This separation method depends on the solubility difference of Zr and Hf in the solvent. In this work, potassium hexafluorozirconate (hafnate), K2Zr(Hf)F6, was used as a model compound. The solubility of K2ZrF6 in water is lower than that of K2HfF6. By repeating this treatment, it is possible to purify Zr in practice; in this case, 16-18 recrystallization stages were needed for high purification. The improvement of the crystallization process was carried out in this work. Water, hydrofluoric acid (HF), and a hydrofluoric acid (HF) + hydrochloric acid (HCl) mixture were chosen as solvents for the dissolution of Zr and Hf. In the experiment, 10 g of K2ZrF6 was added to 100 mL of each solvent. Each solution was heated for 1 hour at 353 K. After 1 h of this operation, the solutions were cooled down to 293 K and were held for 5 hours at 273 K. The concentration of Zr or Hf was measured using ICP analysis. It was found that Hf was separated from the Zr-Hf mixed compound with high efficiency when the HF-HCl solution was used as the solvent for crystallization. From the comparison of the particle size of each crystal by SEM, it was confirmed that the particle diameter of the crystals decreased with decreasing Hf content. This paper is concerned with the purification of Zr from a Zr-Hf mixture using a crystallization method.
Keywords: crystallization, zirconium, hafnium, separation
Procedia PDF Downloads 438
436 The Museum of Museums: A Mobile Augmented Reality Application
Authors: Qian Jin
Abstract:
Museums have been using interactive technology to spark visitor interest and improve understanding. These technologies can play a crucial role in helping visitors understand more about an exhibition site by using multimedia to provide information. Google Arts and Culture and Smartify are two very successful digital heritage products. They used mobile augmented reality to visualise museums' 3D models and heritage images but did not include 3D models of the collections or audio information. In this research, a service-oriented mobile augmented reality application was developed for users to access collections from multiple museums (including the V and A, the British Museum, and the British Library). Third-party APIs (Application Programming Interfaces) are requested to collect metadata (including images, 3D models, videos, and text) of the three museums' collections. The acquired content is then visualized in AR environments. This product will help users who cannot visit the museum in person for various reasons (inconvenience of transportation, physical disability, time schedule).
Keywords: digital heritage, augmented reality, museum, Flutter, ARCore
Procedia PDF Downloads 79
435 Adaption Model for Building Agile Pronunciation Dictionaries Using Phonemic Distance Measurements
Authors: Akella Amarendra Babu, Rama Devi Yellasiri, Natukula Sainath
Abstract:
While human beings can easily learn and adopt pronunciation variations, machines need training before being put into use. Humans also keep a minimal vocabulary, and its pronunciation variations are stored at the front end of their memory for ready reference, whereas machines keep the entire pronunciation dictionary for ready reference. Supervised methods used for the preparation of pronunciation dictionaries take large amounts of manual effort, cost, and time and are not suitable for real-time use. This paper presents an unsupervised adaptation model for building agile and dynamic pronunciation dictionaries online. These methods mimic the human approach to learning new pronunciations in real time. A new algorithm for measuring sound distances, called Dynamic Phone Warping, is presented and tested. Performance of the system is measured using an adaptation model, and the precision metric is found to be better than 86 percent.
Keywords: pronunciation variations, dynamic programming, machine learning, natural language processing
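The Dynamic Phone Warping algorithm itself is the authors'; as a hedged illustration of the dynamic-programming idea behind such phonemic distances, the sketch below aligns two phone sequences with insertion, deletion, and substitution costs, where the substitution-cost rule is a made-up placeholder.

```python
# Generic dynamic-programming alignment cost between two phone sequences.
# The cost table is illustrative only, not the paper's phonemic distances.
def phone_distance(seq_a, seq_b, sub_cost=None, ins_del_cost=1.0):
    """Minimum alignment cost between two phoneme sequences."""
    if sub_cost is None:
        vowels = set("aeiou")
        # hypothetical rule: identical phones are free, vowel-vowel or
        # consonant-consonant swaps are cheaper than mixed swaps
        def sub_cost(p, q):
            if p == q:
                return 0.0
            return 0.5 if (p in vowels) == (q in vowels) else 1.0

    n, m = len(seq_a), len(seq_b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * ins_del_cost
    for j in range(1, m + 1):
        d[0][j] = j * ins_del_cost
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(d[i - 1][j] + ins_del_cost,                       # deletion
                          d[i][j - 1] + ins_del_cost,                       # insertion
                          d[i - 1][j - 1] + sub_cost(seq_a[i - 1], seq_b[j - 1]))
    return d[n][m]

# toy example: letters stand in for phone symbols of two pronunciation variants
print(phone_distance(list("tomato"), list("tomeyto")))
```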
Procedia PDF Downloads 177
434 Implementing of Indoor Air Quality Index in Hong Kong
Authors: Kwok W. Mui, Ling T. Wong, Tsz W. Tsang
Abstract:
Many Hong Kong people nowadays spend most of their lifetime working indoors. Since poor Indoor Air Quality (IAQ) potentially leads to discomfort, ill health, low productivity, and even absenteeism in workplaces, a call for establishing statutory IAQ control to safeguard the well-being of residents is urgently required. Although policies, strategies, and guidelines for workplace IAQ diagnosis have been developed elsewhere and followed with remedial works, some of those workplaces or buildings are at a relatively late stage of their IAQ problems when the investigation or remedial work starts. Screening for IAQ problems should be initiated, as it provides the minimum IAQ baseline information requisite to the resolution of the problems. It is not practical to sample all air pollutants that exist. Nevertheless, for statutory control, reliable, rapid screening is essential in accordance with a compromise strategy which balances costs against detection of key pollutants. This study investigates the feasibility of using an IAQ index as a parameter of IAQ control in Hong Kong. The index is a screening parameter to identify unsatisfactory workplace IAQ and will highlight where fully effective IAQ monitoring and assessment are needed for an intensive diagnosis. A number of representative common indoor pollutants have already been identified from extensive IAQ assessments. The selection of pollutants is a surrogate for IAQ control, which consists of dilution, mitigation, and emission control. The IAQ Index and assessment will look at high fractional quantities of these common measurement parameters. With the support of the existing comprehensive regional IAQ database and the IAQ Index developed by the research team as the pre-assessment probability, and the unsatisfactory IAQ prevalence from this study as the post-assessment probability, thresholds for maintaining the current measures and for performing a further IAQ test or IAQ remedial measures will be proposed. With justified resources, the proposed IAQ Index and assessment protocol might be a useful tool for setting up a practical public IAQ surveillance programme and policy in Hong Kong.
Keywords: assessment, index, indoor air quality, surveillance programme
Procedia PDF Downloads 268
433 Competence-Based Human Resources Selection and Training: Making Decisions
Authors: O. Starineca, I. Voronchuk
Abstract:
Human Resources (HR) selection and training have various implementation possibilities depending on an organization’s abilities and peculiarities. We propose to base HR selection and training decisions on a competence-based approach. HR selection and training of employees are topical, as there is room for improvement in this field; therefore, the aim of the research is to propose rational decision-making approaches for an organization's HR selection and training choices. Our proposals are based on the training development and competence-based selection approaches created within previous research, i.e., the Analytic Hierarchy Process (AHP) and Linear Programming. A literature review on non-formal education, competence-based selection, and AHP forms our theoretical background. Some educational service providers in Latvia offer employee training, e.g., in motivation, computer skills, accounting, law, ethics, and stress management, which are topical for Public Administration. A competence-based approach is a rational basis for rational decision-making in both HR selection and HR training.
Keywords: competence-based selection, human resource, training, decision-making
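Since the proposals build on AHP, the following sketch shows the standard AHP step of deriving priority weights from a pairwise comparison matrix and checking the consistency ratio; the comparison values for three competences are hypothetical, not the authors' actual hierarchy.

```python
# Minimal AHP sketch: priority vector from a pairwise comparison matrix.
import numpy as np

# pairwise comparisons of three competences (Saaty's 1-9 scale, hypothetical)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                     # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                        # normalised priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)            # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]             # Saaty's random index
print("priorities:", np.round(weights, 3))
print("consistency ratio:", round(ci / ri, 3))  # should be below 0.1
```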
Procedia PDF Downloads 338
432 Multi-Criteria Decision Making Network Optimization for Green Supply Chains
Authors: Bandar A. Alkhayyal
Abstract:
Modern supply chains are typically linear, transforming virgin raw materials into products for end consumers, who then discard them after use to landfills or incinerators. Nowadays, there are major efforts underway to create a circular economy to reduce non-renewable resource use and waste. One important aspect of these efforts is the development of Green Supply Chain (GSC) systems, which enable a reverse flow of used products from consumers back to manufacturers, where they can be refurbished or remanufactured, to both economic and environmental benefit. This paper develops novel multi-objective optimization models to inform GSC system design at multiple levels: (1) strategic planning of facility location and transportation logistics; (2) tactical planning of optimal pricing; and (3) policy planning to account for potential valuation of GSC emissions. First, physical linear programming was applied to evaluate GSC facility placement by determining the quantities of end-of-life products for transport from candidate collection centers to remanufacturing facilities while satisfying cost and capacity criteria. Second, disassembly and remanufacturing processes have received little attention in the industrial engineering and process cost modeling literature. The increasing scale of remanufacturing operations, worth nearly $50 billion annually in the United States alone, has made GSC pricing an important subject of research. A non-linear physical programming model for optimization of the pricing policy for remanufactured products, which maximizes total profit and minimizes product recovery costs, was examined and solved. Finally, a deterministic equilibrium model was used to determine the effects of internalizing a cost of GSC greenhouse gas (GHG) emissions into the optimization models. Changes in optimal facility use, transportation logistics, and pricing/profit margins were all investigated against a variable cost of carbon, using a case study system created from actual data from sites in the Boston area. As carbon costs increase, the optimal GSC system undergoes several distinct shifts in topology as it seeks new cost-minimal configurations. A comprehensive study of quantitative evaluation and performance of the model has been done using orthogonal arrays. Results were compared to top-down estimates from economic input-output life cycle assessment (EIO-LCA) models to contrast remanufacturing GHG emission quantities with those from original equipment manufacturing operations. Introducing a carbon cost of $40/t CO2e increases modeled remanufacturing costs by 2.7% but also increases original equipment costs by 2.3%. The assembled work advances the theoretical modeling of optimal GSC systems and presents a rare case study of remanufactured appliances.
Keywords: circular economy, extended producer responsibility, greenhouse gas emissions, industrial ecology, low carbon logistics, green supply chains
Procedia PDF Downloads 160
431 A M/E/c Queuing Hub Maximal Covering Location Model with Fuzzy Parameter
Authors: M. H. Fazel Zarandi, N. Moshahedi
Abstract:
The hub location problem appears in a variety of applications such as medical centers, firefighting facilities, cargo delivery systems, and telecommunication network design. The location of service centers has a strong influence on the congestion at each of them and, consequently, on the quality of service. This paper presents a fuzzy maximal hub covering location problem (FMCHLP) in which the travel cost between any pair of nodes is considered as a fuzzy variable. In order to consider the quality of service, we model each hub as a queue. The arrival rate follows a Poisson distribution, and the service rate follows an Erlang distribution. In this paper, at first, a nonlinear mathematical programming model is presented. Then, we convert it to a linear one. We solved the linear model using GAMS software for up to 25 nodes; for large sizes, due to the complexity of hub covering location problems, a simulated annealing algorithm is developed to solve and test the model. Also, we used the possibilistic c-means clustering method in order to find an initial solution.
Keywords: fuzzy modeling, location, possibilistic clustering, queuing
Procedia PDF Downloads 396
430 Mondoc: Informal Lightweight Ontology for Faceted Semantic Classification of Hypernymy
Authors: M. Regina Carreira-Lopez
Abstract:
Lightweight ontologies seek to make concrete the union relationships between a parent node and a secondary node, also called a "child node". This logic relation (L) can be formally defined as a triple ontological relation (LO) equivalent to LO in ⟨LN, LE, LC⟩, where LN represents a finite set of nodes (N); LE is a set of entities (E), each of which represents a relationship between nodes to form a rooted tree of ⟨LN, LE⟩; and LC is a finite set of concepts (C), encoded in a formal language (FL). Mondoc enables more refined searches on semantic and classified facets for retrieving specialized knowledge about Atlantic migrations, from the Declaration of Independence of the United States of America (1776) to the end of the Spanish Civil War (1939). The model aims to increase documentary relevance by applying an inverse frequency of co-occurrent hypernymy phenomena to a concrete dataset of textual corpora, with the RMySQL package. Mondoc profiles archival utilities implementing SQL programming code and allows data export to XML schemas for achieving semantic and faceted analysis of speech by analyzing keywords in context (KWIC). The methodology applies random and unrestricted sampling techniques with RMySQL to verify the resonance phenomena of inverse documentary relevance between the numbers of co-occurrences of the same term (t) in more than two documents of a set of texts (D). Secondly, the research also evidences that co-associations between (t) and its corresponding synonyms and antonyms (synsets) are also inverse. The results from grouping facets or polysemic words with synsets in more than two textual corpora within their syntagmatic context (nouns, verbs, adjectives, etc.) state how to proceed with the semantic indexing of hypernymy phenomena for subject-heading lists and for authority lists for documentary and archival purposes. Mondoc contributes to the development of web directories and seems to achieve a proper and more selective search of e-documents (classification ontology). It can also foster the production of online catalogs for semantic authorities, or concepts, through XML schemas, because its applications could be used for implementing data models by a prior adaptation of the base ontology to structured meta-languages, such as OWL and RDF (descriptive ontology). Mondoc serves the classification of concepts and applies a semantic indexing approach with facets. It enables information retrieval as well as quantitative and qualitative data interpretation. The model reproduces a triple tuple ⟨LN, LE, LT, LCF L, BKF⟩ where LN is a set of entities that connect with other nodes to form a rooted tree in ⟨LN, LE⟩, LT specifies a set of terms, and LCF acts as a finite set of concepts, encoded in a formal language, L. Mondoc only resolves partial problems of linguistic ambiguity (in the case of synonymy and antonymy), but neither the pragmatic dimension of natural language nor the cognitive perspective is addressed. To achieve this goal, forthcoming programming developments should target oriented meta-languages with structured documents in XML.
Keywords: hypernymy, information retrieval, lightweight ontology, resonance
Procedia PDF Downloads 126
429 Multiobjective Economic Dispatch Using Optimal Weighting Method
Authors: Mandeep Kaur, Fatehgarh Sahib
Abstract:
The purpose of economic load dispatch is to allocate the required load demand among the available generating units such that the cost of operation is minimized. It is an optimization problem that finds the most economical schedule of the generating units while satisfying the load demand and operational constraints. A multiobjective optimization problem is one in which the engineer’s goal is to maximize or minimize not a single objective function but several objective functions simultaneously. The purpose of multiobjective problems in the mathematical programming framework is to optimize the different objective functions. Many approaches and methods have been proposed in recent years to solve multiobjective optimization problems. The weighting method has been applied to convert the multiobjective optimization problem into a scalar optimization problem. MATLAB 7.10 has been used to write the code for the complete algorithm with the help of a genetic algorithm (GA). The validity of the proposed method has been demonstrated on a three-unit power system.
Keywords: economic load dispatch, genetic algorithm, generating units, multiobjective optimization, weighting method
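To illustrate the weighting method on a three-unit system of the kind used for validation, the sketch below combines a fuel-cost objective and an emission objective into a single weighted objective and sweeps the weight; the coefficients and demand are hypothetical, and a gradient-based solver is used here in place of the paper's genetic algorithm.

```python
# Weighted-sum scalarization of a two-objective, three-unit dispatch problem.
import numpy as np
from scipy.optimize import minimize

a = np.array([0.008, 0.009, 0.007])     # fuel cost, $/MW^2 (hypothetical)
b = np.array([7.0, 6.3, 6.8])           # $/MW
c = np.array([200.0, 180.0, 140.0])     # $
alpha = np.array([0.020, 0.015, 0.018]) # emission coefficients, kg/MW^2 (hypothetical)
demand = 450.0                          # MW
p_min, p_max = 50.0, 250.0

def fuel_cost(p):
    return np.sum(a * p**2 + b * p + c)

def emission(p):
    return np.sum(alpha * p**2)

def dispatch(w):
    objective = lambda p: w * fuel_cost(p) + (1 - w) * emission(p)
    power_balance = {"type": "eq", "fun": lambda p: np.sum(p) - demand}
    res = minimize(objective, x0=np.full(3, demand / 3), method="SLSQP",
                   bounds=[(p_min, p_max)] * 3, constraints=[power_balance])
    return res.x

for w in (1.0, 0.5, 0.0):               # sweep the weight to trace the trade-off
    p = dispatch(w)
    print(f"w={w:.1f}  P={np.round(p, 1)}  cost={fuel_cost(p):.1f}  emission={emission(p):.1f}")
```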
Procedia PDF Downloads 150
428 Dynamic Thermal Modelling of a PEMFC-Type Fuel Cell
Authors: Marco Avila Lopez, Hasnae Ait-Douchi, Silvia De Los Santos, Badr Eddine Lebrouhi, Pamela Ramírez Vidal
Abstract:
In the context of the energy transition, fuel cell technology has emerged as a solution for harnessing hydrogen energy and mitigating greenhouse gas emissions. An in-depth study was conducted on a PEMFC-type fuel cell, beginning with an analysis of its operational principles and constituent components. Subsequently, the fuel cell was modelled using the Python programming language, encompassing both steady-state and transient regimes. For the steady-state regime, the physical and electrochemical phenomena occurring within the fuel cell were modelled under the assumption of uniform temperature throughout all cell compartments. Parametric identification was carried out, resulting in a remarkable mean error of only 1.62% when the model results were compared to experimental data documented in the literature. The dynamic model that was developed enabled scrutiny of the fuel cell's response in terms of temperature and voltage under varying current conditions.
Keywords: fuel cell, modelling, dynamic, thermal model, PEMFC
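As a hedged sketch of the steady-state electrochemical part of such a model, the snippet below evaluates a textbook polarization curve (open-circuit voltage minus activation, ohmic, and concentration losses); all parameter values are illustrative placeholders rather than the identified parameters of the paper.

```python
# Steady-state PEMFC polarization sketch with hypothetical parameters.
import numpy as np

E_oc    = 1.03      # open-circuit voltage (V)
A_tafel = 0.06      # Tafel slope (V)
i0      = 1e-4      # exchange current density (A/cm^2)
R_ohm   = 0.15      # area-specific resistance (ohm.cm^2)
B_conc  = 0.05      # concentration-loss constant (V)
i_lim   = 1.4       # limiting current density (A/cm^2)

def cell_voltage(i):
    """Cell voltage (V) at current density i (A/cm^2)."""
    v_act  = A_tafel * np.log(i / i0)            # activation loss
    v_ohm  = R_ohm * i                           # ohmic loss
    v_conc = -B_conc * np.log(1.0 - i / i_lim)   # concentration loss
    return E_oc - v_act - v_ohm - v_conc

for i in (0.05, 0.2, 0.5, 0.8, 1.1):
    print(f"i = {i:4.2f} A/cm^2  ->  V = {cell_voltage(i):.3f} V")
```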
Procedia PDF Downloads 81
427 Triassic Magmatism in Southern Beishan Orogen, Northwest China: Zircon U–Pb Geochronology, Petrogenesis and Tectonic Implications
Authors: Zengda Li
Abstract:
The tectonic evolution of the Beishan orogen, which forms part of the Central Asian Orogenic Belt, remains debated. This study reports the identification of three Triassic granitic plutons representing two distinct stages of magmatism in the southern Beishan orogen. Zircon U–Pb dating constrains the early stage to 238–237 Ma and the late stage to 229–227 Ma. The granitoids belong to the high-K calc-alkaline and shoshonitic series, exhibit alkalic-calcic and calc-alkalic features, and are weakly peraluminous rocks. Most of these granitoids are highly fractionated I-type and A-type granites. They have relatively high Isr values (0.7049–0.7086) and weakly negative εNd(t) values of −1.5 to −2.1, with young Nd model ages of 1.04–0.91 Ga, indicating a crustal contribution. They also show markedly positive zircon εHf(t) values (+3.4 to +11.8) and two-stage Hf model ages of 1.06–0.69 Ga, indicating a mixture of mantle and crustal components. The lithospheric mantle beneath this region, incorporating older subducted materials, was metasomatized by fluids or melts. Partial melting of the metasomatized lithospheric mantle resulted in underplated magmas, which provided the heat and material input to generate the granitoids. The Middle Triassic granitic plutons show moderate negative Eu anomalies, enrichment in LILEs, and depletion in Nb, Ta, and Ti, suggesting partial melting of crustal components in response to the underplated mantle-derived magmas, probably linked to lithospheric delamination and asthenospheric upwelling. The Late Triassic granitic plutons show characteristics of post-orogenic granites, with strong negative anomalies of Eu, Ba, Nb, Sr, P, and Ti, indicating fractional crystallization and crustal contamination during the emplacement process.
Keywords: Triassic, magmatism, geochronology, petrogenesis, Beishan orogen
Procedia PDF Downloads 156
426 Proposal of a Model Supporting Decision-Making Based on Multi-Objective Optimization Analysis on Information Security Risk Treatment
Authors: Ritsuko Kawasaki (Aiba), Takeshi Hiromatsu
Abstract:
Management is required to understand all information security risks within an organization and to decide which information security risks should be treated, at what level, and with how much cost allocated. However, such decision-making is not usually easy, because various measures for risk treatment must be selected with suitable application levels. In addition, some measures may have objectives that conflict with each other, which also makes the selection difficult. Moreover, risks generally have trends, and this should also be considered in risk treatment. Therefore, this paper provides an extension of the model proposed in the previous study. The original model supports the selection of measures by applying a combination of the weighted average method and the goal programming method for multi-objective analysis to find an optimal solution. The extended model includes the notion of weights for the risks, where a larger weight means a higher priority for the risk.
Keywords: information security risk treatment, selection of risk measures, risk acceptance, multi-objective optimization
Procedia PDF Downloads 462
425 Low-Cost IoT System for Monitoring Ground Propagation Waves due to Construction and Traffic Activities to Nearby Construction
Authors: Lan Nguyen, Kien Le Tan, Bao Nguyen Pham Gia
Abstract:
Due to their high cost, specialized dynamic measurement devices for industrial sites are difficult for many colleges to acquire for hands-on teaching. This study connects a dynamic measurement sensor and receiver using an inexpensive Raspberry Pi 4 board, 24-bit ADC circuits, a geophone vibration sensor, and embedded open-source Python programming to gather and analyze signals for dynamic measurement, ground vibration monitoring, and structural vibration monitoring. The system can wirelessly communicate data to a computer and is set up as a network of communication nodes, enabling real-time monitoring of background vibrations at various locations. The device can be utilized for a variety of dynamic measurement and monitoring tasks, including monitoring earthquake vibrations, ground vibrations from construction operations and traffic, and vibrations of building structures.
Keywords: sensors, FFT, signal processing, real-time data monitoring, ground propagation wave, Python, Raspberry Pi 4
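A hedged sketch of one acquisition node follows: it samples the geophone through the ADC, derives peak particle velocity and dominant frequency with NumPy, and streams the result to a monitoring PC over UDP. The read_adc_sample() helper, the sensitivity value, and the network address are placeholders, since the real calls depend on the ADC chip and its Python driver.

```python
# Acquisition-node sketch: sample, analyse, and stream ground-vibration data.
import json
import socket
import time
import numpy as np

FS = 500                      # sampling rate (Hz)
BLOCK = 1024                  # samples per analysis block
SENSITIVITY = 28.8            # geophone sensitivity (V per m/s), hypothetical
MONITOR_ADDR = ("192.168.1.10", 5005)   # monitoring PC, hypothetical address

def read_adc_sample():
    """Placeholder for one 24-bit ADC reading in volts.

    Replace with the driver call for your ADC board; here it returns
    simulated noise so the sketch runs end to end.
    """
    return np.random.normal(0.0, 1e-4)

def acquire_block():
    samples = np.empty(BLOCK)
    for i in range(BLOCK):
        samples[i] = read_adc_sample()
        time.sleep(1.0 / FS)          # crude pacing; a hardware timer is better
    return samples / SENSITIVITY      # volts -> particle velocity (m/s)

def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        v = acquire_block()
        spectrum = np.abs(np.fft.rfft(v - v.mean())) / BLOCK
        freqs = np.fft.rfftfreq(BLOCK, d=1.0 / FS)
        report = {
            "peak_particle_velocity_mm_s": float(1e3 * np.max(np.abs(v))),
            "dominant_frequency_hz": float(freqs[np.argmax(spectrum[1:]) + 1]),
        }
        sock.sendto(json.dumps(report).encode(), MONITOR_ADDR)

if __name__ == "__main__":
    main()
```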
Procedia PDF Downloads 103
424 Development of a Serial Signal Monitoring Program for Educational Purposes
Authors: Jungho Moon, Lae-Jeong Park
Abstract:
This paper introduces a signal monitoring program developed with a view to helping electrical engineering students get familiar with sensors that have digital outputs. Because the output of digital sensors cannot simply be monitored with a measuring instrument such as an oscilloscope, students tend to have a hard time dealing with digital sensors. The monitoring program runs on a PC and communicates with an MCU that reads the output of digital sensors via an asynchronous communication interface. Receiving the sensor data from the MCU, the monitoring program shows time- and/or frequency-domain plots of the data in real time. In addition, the monitoring program provides a serial terminal that enables the user to exchange text information with the MCU while the received data are plotted. The user can easily observe the output of digital sensors and configure the digital sensors in real time, which helps students who do not have enough experience with digital sensors. Though the monitoring program was written in the MATLAB programming language, it runs without MATLAB since it was compiled as a standalone executable.
Keywords: digital sensor, MATLAB, MCU, signal monitoring program
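The monitoring program itself is a compiled MATLAB application; the sketch below shows the same idea in Python for readers without MATLAB: read sensor values sent by the MCU over an asynchronous serial port, keep a rolling buffer, and compute a frequency-domain view. The port name, baud rate, and the one-value-per-line message format are assumptions.

```python
# Minimal Python analogue of the serial monitoring idea (pyserial + NumPy).
import collections
import numpy as np
import serial   # pyserial

PORT, BAUD, FS = "/dev/ttyUSB0", 115200, 200     # FS = sensor sample rate (Hz), assumed
buffer = collections.deque(maxlen=512)

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    ser.write(b"START\n")                        # example of the serial-terminal side
    while len(buffer) < buffer.maxlen:
        line = ser.readline().decode(errors="ignore").strip()
        if not line:
            continue
        try:
            buffer.append(float(line))           # assume MCU sends one reading per line
        except ValueError:
            print("MCU says:", line)             # pass non-numeric text to the user

data = np.array(buffer)
spectrum = np.abs(np.fft.rfft(data - data.mean()))
freqs = np.fft.rfftfreq(len(data), d=1.0 / FS)
print("dominant frequency: %.1f Hz" % freqs[np.argmax(spectrum[1:]) + 1])
```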
Procedia PDF Downloads 497
423 Cooperative Jamming for Implantable Medical Device Security
Authors: Kim Lytle, Tim Talty, Alan Michaels, Jeff Reed
Abstract:
Implantable medical devices (IMDs) are medically necessary devices embedded in the human body that monitor chronic disorders or automatically deliver therapies. Most IMDs have wireless capabilities that allow them to share data with an offboard programming device to help medical providers monitor the patient’s health while giving the patient more insight into their condition. However, serious security concerns have arisen as researchers demonstrated these devices could be hacked to obtain sensitive information or harm the patient. Cooperative jamming can be used to prevent privileged information leaks by maintaining an adequate signal-to-noise ratio at the intended receiver while minimizing signal power elsewhere. This paper uses ray tracing to demonstrate how a low number of friendly nodes abiding by Bluetooth Low Energy (BLE) transmission regulations can enhance IMD communication security in an office environment, which in turn may inform how companies and individuals can protect their proprietary and personal information.
Keywords: implantable biomedical devices, communication system security, array signal processing, ray tracing
Procedia PDF Downloads 114
422 A Scalable Media Job Framework for an Open Source Search Engine
Authors: Pooja Mishra, Chris Pollett
Abstract:
This paper explores efficient ways to implement various media-updating features like news aggregation, video conversion, and bulk email handling. All of these jobs share the property that they are periodic in nature, and they all benefit from being handled in a distributed fashion. The data for these jobs also often come from a social or collaborative source. We isolate the class of periodic, one-round map reduce jobs as a useful setting to describe and handle media-updating tasks. As such tasks are simpler than general map reduce jobs, programming them in a general map reduce platform could easily become tedious. This paper presents a MediaUpdater module of the Yioop Open Source Search Engine Web Portal designed to handle such jobs via an extension of a PHP class. We describe how to implement various media-updating tasks in our system as well as experiments carried out using these implementations on an Amazon Web Services cluster.
Keywords: distributed jobs framework, news aggregation, video conversion, email
Procedia PDF Downloads 299
421 A Mathematical Model for a Two-Stage Assembly Flow-Shop Scheduling Problem with Batch Delivery System
Authors: Saeedeh Ahmadi Basir, Mohammad Mahdavi Mazdeh, Mohammad Namakshenas
Abstract:
Manufacturers often dispatch jobs in batches to reduce delivery costs. However, sending several jobs in batches can have a negative effect on other scheduling-related objective functions, such as minimizing the number of tardy jobs, which is often used to rate managers’ performance in many manufacturing environments. This paper aims to minimize the weighted number of tardy jobs and the sum of delivery costs in a two-stage assembly flow-shop problem with a batch delivery system. We present a mixed-integer linear programming (MILP) model to solve the problem. As this is an MILP model, the commercial solver (the CPLEX solver) is not guaranteed to find the optimal solution for large-size problems in a reasonable amount of time. We present several numerical examples to confirm the accuracy of the model.
Keywords: scheduling, two-stage assembly flow-shop, tardy jobs, batched delivery system
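As a much-simplified, hedged illustration of the MILP style used (not the paper's two-stage assembly model with batch delivery), the sketch below minimizes the weighted number of tardy jobs on a single machine with disjunctive big-M sequencing constraints; the job data are hypothetical and PuLP's default CBC solver stands in for CPLEX.

```python
# Simplified MILP: weighted number of tardy jobs on one machine.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

p = {1: 4, 2: 3, 3: 6, 4: 2}        # processing times (hypothetical)
d = {1: 6, 2: 5, 3: 14, 4: 7}       # due dates
w = {1: 3, 2: 1, 3: 2, 4: 2}        # tardiness weights
jobs = list(p)
M = sum(p.values()) + max(d.values())   # big-M constant

m = LpProblem("weighted_tardy_jobs", LpMinimize)
C = {j: LpVariable(f"C_{j}", lowBound=p[j]) for j in jobs}          # completion times
U = {j: LpVariable(f"U_{j}", cat=LpBinary) for j in jobs}           # 1 if job j is tardy
y = {(i, j): LpVariable(f"y_{i}_{j}", cat=LpBinary)                 # 1 if i precedes j
     for i in jobs for j in jobs if i < j}

m += lpSum(w[j] * U[j] for j in jobs)                               # objective
for i in jobs:
    for j in jobs:
        if i < j:
            m += C[i] + p[j] <= C[j] + M * (1 - y[i, j])            # i before j
            m += C[j] + p[i] <= C[i] + M * y[i, j]                  # j before i
for j in jobs:
    m += C[j] - d[j] <= M * U[j]                                    # tardiness indicator

m.solve()
print("weighted number of tardy jobs:", int(m.objective.value()))
for j in jobs:
    print(f"job {j}: completes at {C[j].value():.0f}, tardy={int(U[j].value())}")
```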
Procedia PDF Downloads 461
420 A Method for Reduction of Association Rules in Data Mining
Authors: Diego De Castro Rodrigues, Marcelo Lisboa Rocha, Daniela M. De Q. Trevisan, Marcos Dias Da Conceicao, Gabriel Rosa, Rommel M. Barbosa
Abstract:
The use of association rule algorithms within data mining is recognized as being of great value for knowledge discovery in databases. Very often, the number of rules generated is high, sometimes even in databases with small volumes, so the success of the analysis of results can be hampered by this quantity. The purpose of this research is to present a method for reducing the quantity of rules generated with association algorithms. Therefore, a computational algorithm was developed with the use of the Weka Application Programming Interface, which allows the execution of the method on different types of databases. After the development, tests were carried out on three types of databases: synthetic, model, and real. Efficient results were obtained in reducing the number of rules, where the worst case presented a gain of more than 50%, considering the concepts of support, confidence, and lift as measures. This study concluded that the proposed model is feasible and quite interesting, contributing to the analysis of the results of association rules generated from the use of algorithms.
Keywords: data mining, association rules, rules reduction, artificial intelligence
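The paper's reduction method drives the Weka API; as a hedged stand-in that makes the support/confidence/lift filtering step concrete, the sketch below generates rules with the mlxtend package on a tiny hypothetical transaction set and then prunes them with thresholds on the three measures.

```python
# Rule generation followed by reduction on support, confidence, and lift.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

transactions = [["milk", "bread", "butter"],        # hypothetical transactions
                ["milk", "bread"],
                ["bread", "butter"],
                ["milk", "butter"],
                ["milk", "bread", "butter", "jam"]]

te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(transactions).transform(transactions), columns=te.columns_)

frequent = apriori(onehot, min_support=0.4, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.6)
print("rules generated:", len(rules))

# reduction step: keep only rules that clear all three thresholds
reduced = rules[(rules["support"] >= 0.4) &
                (rules["confidence"] >= 0.8) &
                (rules["lift"] > 1.0)]
print("rules kept after reduction:", len(reduced))
print(reduced[["antecedents", "consequents", "support", "confidence", "lift"]])
```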
Procedia PDF Downloads 162
419 Transformer Design Optimization Using Artificial Intelligence Techniques
Authors: Zakir Husain
Abstract:
The main objective of a power transformer design optimization problem is to minimize the total overall cost and/or mass of the winding and core material while satisfying all constraints imposed by the standards and the transformer user's requirements. The constraints include appropriate limits on winding fill factor, temperature rise, efficiency, no-load current, and voltage regulation. The design optimization task is to find a constrained minimum-cost and/or minimum-mass solution by optimally setting the parameters, geometry, and required magnetic properties of the transformer. In this paper, the above design problems have been formulated by using a genetic algorithm (GA) and simulated annealing (SA) on the MATLAB platform. The importance of the presented approach stems from two main features. First, the proposed technique provides a reliable and efficient solution to the problem of design optimization with several variables. Second, the obtained solution is guaranteed to be a global optimum. This paper includes a demonstration of the application of the genetic programming (GP) technique to transformer design.
Keywords: optimization, power transformer, genetic algorithm (GA), simulated annealing technique (SA)
Procedia PDF Downloads 584
418 Modeling and Simulation Frameworks for Cloud Computing Environment: A Critical Evaluation
Authors: Abul Bashar
Abstract:
The recent surge in the adoption of cloud computing systems by various organizations has brought forth the challenge of evaluating their performance. One of the major issues faced by cloud service providers and customers is assessing the ability of cloud computing systems to provide the desired services in accordance with the QoS and SLA constraints. To this end, an opportunity exists to develop means to ensure that the desired performance levels of such systems are met under simulated environments. This will eventually minimize service disruptions and performance degradation issues during the commissioning and operational phases of cloud computing infrastructure. It is observed that several simulators and modelers are available for simulating cloud computing systems. Therefore, this paper presents a critical evaluation of the state-of-the-art modeling and simulation frameworks applicable to cloud computing systems. It compares the prominent simulation frameworks in terms of API features, programming flexibility, operating system requirements, supported services, licensing needs, and popularity. Subsequently, it provides recommendations regarding the choice of the most appropriate framework for researchers, administrators, and managers of cloud computing systems.
Keywords: cloud computing, modeling framework, performance evaluation, simulation tools
Procedia PDF Downloads 503
417 Generative AI: A Comparison of Conditional Tabular Generative Adversarial Networks and Conditional Tabular Generative Adversarial Networks with Gaussian Copula in Generating Synthetic Data with Synthetic Data Vault
Authors: Lakshmi Prayaga, Chandra Prayaga. Aaron Wade, Gopi Shankar Mallu, Harsha Satya Pola
Abstract:
Synthetic data generated by Generative Adversarial Networks and Autoencoders is becoming more common as a way to combat the problem of insufficient data for research purposes. However, generating synthetic data is a tedious task requiring extensive mathematical and programming background. Open-source platforms such as the Synthetic Data Vault (SDV) and Mostly AI offer platforms that are user-friendly and accessible to non-technical professionals for generating synthetic data to augment existing data for further analysis. The SDV also provides additions to the generic GAN, such as the Gaussian copula. We present the results from two synthetic data sets (CTGAN data and CTGAN with Gaussian copula) generated by the SDV and report the findings. The results indicate that the ROC and AUC curves for the data generated by adding the layer of Gaussian copula are much higher than those for the data generated by the CTGAN alone.
Keywords: synthetic data generation, generative adversarial networks, conditional tabular GAN, Gaussian copula
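A minimal sketch of generating the two synthetic data sets compared in the study with the SDV library follows: plain CTGAN and the CTGAN variant that applies a Gaussian copula transform (CopulaGANSynthesizer). The class and method names follow recent SDV (1.x) releases and may differ in other versions; the CSV file name and epoch count are placeholders.

```python
# Hedged SDV sketch: two synthesizers fitted to the same real table.
import pandas as pd
from sdv.metadata import SingleTableMetadata
from sdv.single_table import CTGANSynthesizer, CopulaGANSynthesizer

real = pd.read_csv("patients.csv")            # placeholder for the real data set

metadata = SingleTableMetadata()
metadata.detect_from_dataframe(data=real)

synthetic = {}
for name, cls in [("ctgan", CTGANSynthesizer),
                  ("ctgan_gaussian_copula", CopulaGANSynthesizer)]:
    synth = cls(metadata, epochs=300)
    synth.fit(real)
    synthetic[name] = synth.sample(num_rows=len(real))

# the two generated tables can then be fed to the downstream classifier whose
# ROC/AUC curves are compared in the paper
for name, df in synthetic.items():
    df.to_csv(f"synthetic_{name}.csv", index=False)
```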
Procedia PDF Downloads 84
416 Blending Effects on Crude Oil Stability: An Experimental Study
Authors: Muheddin Hamza, Entisar Etter
Abstract:
This study is part of an investigation of the possibility of blending two crude oils obtained from Libyan oil fields, namely crude oil (A) and crude oil (B), at different ratios. Prior to blending, the crude oils have to be compatible in order to avoid phase separation and precipitation of asphaltene from the bulk of the crude. The physical properties of both crudes, such as density, viscosity, pour point, and sulphur content, were measured according to ASTM methods. To examine the stability of both crudes and their blends, the oil compatibility model (using microscopy), the colloidal instability index (CII) (using SARA analysis), and the asphaltene stabilization test (using the Turbiscan) were conducted in the Libyan Petroleum Institute laboratories. Compatibility tests were carried out with both crude oils, and the insolubility number (IN) and the solubility blending number (SBN) were calculated for both crude oils and their blends. The criterion for compatibility of any blend is that the volume-average solubility blending number (SBN) is greater than the insolubility number (IN) of any component in the blend; the results indicated that both crudes were compatible. To support the results of the compatibility tests, SARA analysis was done for the fractional determination of saturates, aromatics, resins, and asphaltenes content. From this result, the colloidal instability index (CII) and the resin-to-asphaltene ratio (R/A) were calculated for the crudes and their blends. The results show that crude oil (B), which has a higher R/A and lower CII, is more stable than crude oil (A), and as the ratio of crude (B) in the blend increases, the CII and R/A improve and the blends become more stable. The asphaltene stabilization test was also conducted for the crudes and their blends using the Turbiscan MA200 according to the standard test method ASTM D7061-04; the Turbiscan shows that crude (B) is more stable than crude (A), which shows a fair tendency. The CII and R/A were compared with the solubility blending number (SBN) for each crude and the blends, along with the Turbiscan results. The solubility blending numbers (SBN) of the crudes and their blends show that the crudes are compatible; also, by comparing the R/A and SBN values of the blends, it can be seen that they complement each other. All the experimental results show that the blends of both crudes are stable.
Keywords: asphaltene, crude oil, compatibility, oil blends, resin, SARA
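A small sketch of the two stability indicators computed from SARA fractions in the study follows: the colloidal instability index CII = (saturates + asphaltenes) / (aromatics + resins) and the resin-to-asphaltene ratio R/A. The SARA values below are hypothetical, not the measured compositions of crudes A and B.

```python
# Stability indicators from SARA fractions (illustrative values only).
def stability_indicators(saturates, aromatics, resins, asphaltenes):
    cii = (saturates + asphaltenes) / (aromatics + resins)   # colloidal instability index
    r_a = resins / asphaltenes                               # resin-to-asphaltene ratio
    return cii, r_a

samples = {                     # wt% saturates, aromatics, resins, asphaltenes
    "crude A":     (58.0, 27.0, 11.0, 4.0),
    "crude B":     (52.0, 33.0, 13.0, 2.0),
    "blend 50:50": (55.0, 30.0, 12.0, 3.0),
}
for name, sara in samples.items():
    cii, r_a = stability_indicators(*sara)
    print(f"{name}: CII = {cii:.2f}, R/A = {r_a:.2f}")
```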
Procedia PDF Downloads 513
415 Resting-State Functional Connectivity Analysis Using an Independent Component Approach
Authors: Eric Jacob Bacon, Chaoyang Jin, Dianning He, Shuaishuai Hu, Lanbo Wang, Han Li, Shouliang Qi
Abstract:
Objective: Refractory epilepsy is a complicated type of epilepsy that can be difficult to diagnose. Recent technological advancements have made resting-state functional magnetic resonance imaging (rsfMRI) a vital technique for studying brain activity. However, there is still much to learn about rsfMRI. Investigating rsfMRI connectivity may aid in the detection of abnormal activities. In this paper, we propose studying the functional connectivity of rsfMRI candidates to diagnose epilepsy. Methods: 45 rsfMRI candidates, comprising 26 with refractory epilepsy and 19 healthy controls, were enrolled in this study. A data-driven approach known as independent component analysis (ICA) was used to achieve our goal. First, rsfMRI data from both patients and healthy controls were analyzed using group ICA. The components that were obtained were then spatially sorted to find and select meaningful ones. A two-sample t-test was also used to identify abnormal networks in patients and healthy controls. Finally, based on the fractional amplitude of low-frequency fluctuations (fALFF), a chi-square statistic test was used to distinguish the network properties of the patient and healthy control groups. Results: The two-sample t-test analysis yielded abnormalities in the default mode network, including the left superior temporal lobe and the left supramarginal gyrus. The right precuneus was found to be abnormal in the dorsal attention network. In addition, the frontal cortex showed an abnormal cluster in the medial temporal gyrus, while the temporal cortex showed an abnormal cluster in the right middle temporal gyrus and the right fronto-operculum gyrus. Finally, the chi-square statistic test was significant, producing a p-value of 0.001 for the analysis. Conclusion: This study offers evidence that investigating rsfMRI connectivity provides an excellent diagnostic option for refractory epilepsy.
Keywords: ICA, RSN, refractory epilepsy, rsfMRI
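A simplified, hedged sketch of the temporal-concatenation group ICA step follows: every subject's preprocessed time series (time x voxels) is stacked and unmixed with FastICA, giving group-level spatial maps and time courses. The random arrays are placeholders for real masked rsfMRI data, and this is not the exact pipeline or software used in the study.

```python
# Group ICA sketch via temporal concatenation and FastICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_subjects, n_timepoints, n_voxels, n_components = 45, 120, 2000, 20

# placeholder for preprocessed, masked rsfMRI data of every candidate
subject_data = [rng.standard_normal((n_timepoints, n_voxels))
                for _ in range(n_subjects)]

stacked = np.vstack(subject_data)              # (subjects*timepoints, voxels)

ica = FastICA(n_components=n_components, max_iter=500, random_state=0)
time_courses = ica.fit_transform(stacked)      # (subjects*timepoints, components)
spatial_maps = ica.components_                 # (components, voxels)

# the spatial maps would next be sorted against network templates (default
# mode, dorsal attention, ...) and compared between patients and controls
# with two-sample t-tests, as described in the abstract
print(spatial_maps.shape)
```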
Procedia PDF Downloads 78
414 A Parallel Implementation of Artificial Bee Colony Algorithm within CUDA Architecture
Authors: Selcuk Aslan, Dervis Karaboga, Celal Ozturk
Abstract:
The Artificial Bee Colony (ABC) algorithm is one of the most successful swarm intelligence based metaheuristics. It has been applied to a number of constrained or unconstrained numerical and combinatorial optimization problems. In this paper, we present a parallelized version of the ABC algorithm by adapting the employed and onlooker bee phases to the Compute Unified Device Architecture (CUDA) platform, which is a graphics processing unit (GPU) programming environment by NVIDIA. The execution speed and obtained results of the proposed approach and the sequential version of the ABC algorithm are compared on functions that are typically used as benchmarks for optimization algorithms. Tests on standard benchmark functions with different colony sizes and numbers of parameters showed that the proposed parallelization approach for the ABC algorithm decreases the total execution time consumed by the employed and onlooker bee phases and achieves similar or better quality of results compared to the standard sequential implementation of the ABC algorithm.
Keywords: Artificial Bee Colony algorithm, GPU computing, swarm intelligence, parallelization
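For readers unfamiliar with the baseline being parallelized, here is a hedged, sequential sketch of the ABC algorithm (employed, onlooker, and scout phases) minimizing a sphere benchmark in plain Python; the CUDA kernel organization of the paper is not reproduced, and all parameter settings are illustrative.

```python
# Sequential ABC sketch for continuous minimization.
import numpy as np

def sphere(x):
    return np.sum(x**2)

def abc_minimize(f, dim=10, colony_size=20, limit=50, max_cycles=200,
                 bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    n_food = colony_size // 2                      # one food source per employed bee
    foods = rng.uniform(lo, hi, size=(n_food, dim))
    fits = np.array([f(x) for x in foods])
    trials = np.zeros(n_food, dtype=int)

    def try_neighbour(i):
        # perturb one dimension toward a random partner, keep if better
        k = rng.integers(n_food)
        while k == i:
            k = rng.integers(n_food)
        j = rng.integers(dim)
        phi = rng.uniform(-1, 1)
        cand = foods[i].copy()
        cand[j] = np.clip(cand[j] + phi * (cand[j] - foods[k][j]), lo, hi)
        fc = f(cand)
        if fc < fits[i]:
            foods[i], fits[i], trials[i] = cand, fc, 0
        else:
            trials[i] += 1

    for _ in range(max_cycles):
        for i in range(n_food):                    # employed bee phase
            try_neighbour(i)
        quality = 1.0 / (1.0 + fits)               # selection probabilities
        probs = quality / quality.sum()
        for _ in range(n_food):                    # onlooker bee phase
            try_neighbour(rng.choice(n_food, p=probs))
        worn = np.argmax(trials)                   # scout bee phase
        if trials[worn] > limit:
            foods[worn] = rng.uniform(lo, hi, dim)
            fits[worn] = f(foods[worn])
            trials[worn] = 0
    best = np.argmin(fits)
    return foods[best], fits[best]

if __name__ == "__main__":
    x_best, f_best = abc_minimize(sphere)
    print("best value:", f_best)
```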
Procedia PDF Downloads 379
413 Exact Energy Spectrum and Expectation Values of the Inverse Square Root Potential Model
Authors: Benedict Ita, Peter Okoi
Abstract:
In this work, the concept of the extended Nikiforov-Uvarov technique is discussed and employed to obtain the exact bound-state energy eigenvalues and the corresponding normalized eigenfunctions of the inverse square root potential. With expressions for the exact energy eigenvalues and corresponding eigenfunctions, expressions for the expectation values of the inverse separation squared, the kinetic energy, and the momentum squared of the potential are presented using the Hellmann-Feynman theorem. For visualization, algorithms written and implemented in the Python language are used to generate tables and plots of the energy eigenvalues and some expectation values for l-states. The results obtained here may find suitable applications in areas like atomic and molecular physics, chemical physics, nuclear physics, and solid-state physics.
Keywords: Schrodinger equation, Nikiforov-Uvarov method, inverse square root potential, diatomic molecules, Python programming, Hellmann-Feynman theorem, second order differential equation, matrix algebra
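The paper's spectrum is obtained analytically with the extended Nikiforov-Uvarov method; as a hedged numerical companion, the sketch below approximates bound-state energies of V(r) = -α/√r by discretizing the radial Schrödinger equation (ħ = m = 1) on a finite grid. This is only an independent cross-check idea with illustrative grid settings, not the paper's derivation.

```python
# Finite-difference bound states of the inverse-square-root potential.
import numpy as np
from scipy.linalg import eigh_tridiagonal

alpha = 1.0          # potential strength (illustrative)
l = 0                # angular momentum quantum number
r_max, n = 200.0, 4000
h = r_max / (n + 1)
r = h * np.arange(1, n + 1)

# -u''/2 + [l(l+1)/(2 r^2) - alpha/sqrt(r)] u = E u, with u(0) = u(r_max) = 0
diag = 1.0 / h**2 + l * (l + 1) / (2.0 * r**2) - alpha / np.sqrt(r)
off_diag = np.full(n - 1, -0.5 / h**2)

energies = eigh_tridiagonal(diag, off_diag, eigvals_only=True,
                            select="i", select_range=(0, 4))
print("lowest bound-state energies (l=0):", np.round(energies, 5))
```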
Procedia PDF Downloads 24
412 Non-Steroidal Anti-inflammatory Drugs, Plant Extracts, and Characterized Microparticles to Modulate Antimicrobial Resistance of Epidemic Meca Positive S. Aureus of Dairy Origin
Authors: Amjad I. Aqib, Shanza R. Khan, Tanveer Ahmad, Syed A. R. Shah, Muhammad A. Naseer, Muhammad Shoaib, Iqra Sarwar, Muhammad F. A. Kulyar, Zeeshan A. Bhutta, Mumtaz A. Khan, Mahboob Ali, Khadija Yasmeen
Abstract:
The current study focused on the modulation of resistance of dairy-linked epidemic mecA-positive S. aureus by plant extracts (Eucalyptus globulus, Calotropis procera), NSAIDs, and star-like microparticles. Zinc oxide (ZnO) and zinc hydroxide (Zn(OH)₂) microparticles were synthesized by the solvothermal method and characterized by calcination, X-ray diffraction (XRD), and scanning electron microscopy (SEM). Plant extracts were prepared by the Soxhlet extraction method. The study found 34% of subclinical samples (n=200) from dairy milk positive for S. aureus, with a significant (p < 0.05) association of the assumed risk factors with the pathogen. The antimicrobial assay showed 55, 42, 41, and 41% of S. aureus resistant to oxacillin, ciprofloxacin, streptomycin, and enoxacin, respectively. Amoxicillin showed the highest percentage increase in zone of inhibition (ZOI) at 100 mg of Calotropis procera extract (31.29%), followed by 1 mg/mL (28.91%) and 10 mg/mL (21.68%) of Eucalyptus globulus. Amoxicillin increased the ZOI by 42.85, 37.32, 29.05, and 22.78% in combination with 500 µg/mL of each of diclofenac, aspirin, ibuprofen, and meloxicam, respectively. Fractional inhibitory concentration indices (FICIs) showed synergism of amoxicillin with diclofenac and aspirin and indifference with ibuprofen and meloxicam. The preliminary in vitro findings for the combination of microparticles with amoxicillin proved synergistic, giving rise to 26.74% and 14.85% increases in the ZOI of amoxicillin in combination with zinc oxide and zinc hydroxide, respectively. The modulation of antimicrobial resistance achieved by NSAIDs, plant extracts, and microparticles against pathogenic S. aureus invites immediate attention to probe alternative antimicrobial sources.
Keywords: antimicrobial resistance, dairy milk, nanoparticles, NSAIDs, plant extracts, resistance modulation, S. aureus
Procedia PDF Downloads 214
411 Perspectives and Outcomes of a Long and Shorter Community Mental Health Program
Authors: Danielle Klassen, Reiko Yeap, Margo Schmitt-Boshnick, Scott Oddie
Abstract:
The development of the 7-week Alberta Happiness Basics program was initiated in 2010 in response to the need for community mental health programming. This province-wide program aims to increase overall happiness and reduce negative thoughts and feelings through a positive psychology intervention. While the 7-week program has proven effective, a shortened 4-week program has additionally been developed to address client needs. In this study, participants were interviewed to determine if the 4- and 7-week programs had similar success in producing lasting behavior change at 3, 6, and 9 months post-program. A health quality of life (HQOL) measure was also used to compare the two programs and examine patient outcomes. Quantitative and qualitative analysis showed significant improvements in HQOL and sustainable behavior change for both programs. Findings indicate that the shorter, patient-centered program was effective in increasing happiness and reducing negative thoughts and feelings.
Keywords: primary care, mental health, depression, short duration
Procedia PDF Downloads 271