Search results for: cost analysis.
9473 Statistical Texture Analysis
Authors: G. N. Srinivasan, G. Shobha
Abstract:
This paper presents an overview of the methodologies and algorithms for statistical texture analysis of 2D images. Methods for digital-image texture analysis are reviewed based on the available literature and on research work either carried out or supervised by the authors.
Keywords: Image texture, texture analysis, statistical approaches, structural approaches, spectral approaches, morphological approaches, fractals, Fourier transforms, Gabor filters, wavelet transforms.
9472 The Documentary Analysis of Meta-Analysis Research in Violence of Media
Authors: Proud Arunrangsiwed
Abstract:
The “future direction” sections in the findings of meta-analyses can provide valuable guidance for conducting future studies. This study, “The Documentary Analysis of Meta-Analysis Research in Violence of Media”, draws the “future directions” from 10 meta-analysis papers. The purpose of this research is to identify an appropriate research design and methodology for future research on media violence. Further research should use longitudinal and experimental designs and should carefully consider age effects, time-spent effects, enjoyment effects and the ordinary lifestyle of each media consumer.
Keywords: Aggressive, future direction, meta-analysis, media, violence.
9471 Design Optimisation of Compound Parabolic Concentrator (CPC) for Improved Performance
Authors: M. M. Isa, R. Abd-Rahman, H. H. Goh
Abstract:
A compound parabolic concentrator (CPC) is a well-known non-imaging concentrator that concentrates solar radiation onto a receiver (PV cell). One disadvantage of the CPC is that it is tall and narrow compared with its entry aperture diameter. For economic reasons, the concentrator is therefore truncated by removing part of the top of the full-height CPC; this reduces the concentration ratio, but the loss is negligible. In this paper, the flux distributions of untruncated and truncated 2-D hollow compound parabolic trough concentrator (hCPTC) designs are presented. The untruncated design has an initial height H = 193.4 mm with a concentration ratio C_(2-D) = 4. This paper presents the optical simulation of the compound parabolic trough concentrator using the ray-tracing software TracePro. Results show that, after truncation, the height of the CPC is reduced by 45% from the initial height while the geometrical concentration ratio decreases by only 10%. Thus, the cost of reflector and dielectric material can be reduced, especially at the manufacturing site.
Keywords: Compound parabolic trough concentrator, optical modelling, ray-tracing analysis.
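For context, the geometric concentration ratio quoted above follows the standard 2-D CPC relation (a textbook identity, not a formula stated in the abstract), which ties the ratio to the acceptance half-angle:

```latex
% Standard 2-D CPC relation between geometric concentration ratio and
% acceptance half-angle \theta_a (not reproduced from the paper itself).
C_{2\text{-}D} = \frac{1}{\sin\theta_a}
\quad\Rightarrow\quad
C_{2\text{-}D} = 4 \;\Rightarrow\; \theta_a = \arcsin\!\left(\tfrac{1}{4}\right) \approx 14.5^{\circ}
```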
9470 A Study of the Costs and Benefits of Smart City Projects Including the Scenario of Public-Private Partnerships
Authors: Patrick T. I. Lam, Wenjing Yang
Abstract:
A smart city project embraces benefits and costs, which can be classified into direct and indirect categories. Externalities also come into the picture, but they are often difficult to quantify. Despite this barrier, policy makers need to carry out cost-benefit analysis to justify the huge investments needed to make a city smart. The recent trend is towards engaging the private sector to utilize its resources and expertise, especially in Information and Communication Technology (ICT), where innovations blossom. This study focuses on identifying the costs (on a life-cycle basis) and benefits associated with smart city project developments, based on a comprehensive literature review and case studies; where public-private partnerships warrant consideration, the related costs and benefits are highlighted. The findings will be useful for city policy makers.
Keywords: Costs and benefits, identification, public-private partnerships, smart city projects.
9469 Bridge Health Monitoring: A Review
Authors: Mohammad Bakhshandeh
Abstract:
Structural Health Monitoring (SHM) is a crucial and necessary practice that plays a vital role in ensuring the safety and integrity of critical structures, and in particular, bridges. The continuous monitoring of bridges for signs of damage or degradation through Bridge Health Monitoring (BHM) enables early detection of potential problems, allowing for prompt corrective action to be taken before significant damage occurs. Although all monitoring techniques aim to provide accurate and decisive information regarding the remaining useful life, safety, integrity, and serviceability of bridges, understanding the development and propagation of damage is vital for maintaining uninterrupted bridge operation. Over the years, extensive research has been conducted on BHM methods, and experts in the field have increasingly adopted new methodologies. In this article, we provide a comprehensive exploration of the various BHM approaches, including sensor-based, non-destructive testing (NDT), model-based, and artificial intelligence (AI)-based methods. We also discuss the challenges associated with BHM, including sensor placement and data acquisition, data analysis and interpretation, cost and complexity, and environmental effects, through an extensive review of relevant literature and research studies. Additionally, we examine potential solutions to these challenges and propose future research ideas to address critical gaps in BHM.
Keywords: Structural health monitoring, bridge health monitoring, sensor-based methods, machine-learning algorithms, model-based techniques, sensor placement, data acquisition, data analysis.
9468 Big Brain: A Single Database System for a Federated Data Warehouse Architecture
Authors: X. Gumara Rigol, I. Martínez de Apellaniz Anzuola, A. Garcia Serrano, A. Franzi Cros, O. Vidal Calbet, A. Al Maruf
Abstract:
Traditional federated architectures for data warehousing work well when corporations have existing regional data warehouses and there is a need to aggregate data at a global level. Schibsted Media Group has been maturing from a decentralised organisation into a more globalised one and needed to build some of the regional data warehouses for certain brands at the same time as the global one. In this paper, we present the architectural alternatives studied and why a custom federated approach was the recommendation for implementation. Although the data warehouses are logically federated, the implementation uses a single database system, which presented many advantages: cost reduction, improved data access for global users, a common data model for detailed analysis across different geographies, and a flexible layer for local specific needs in the same place.
Keywords: Data integration, data warehousing, federated architecture, online analytical processing.
9467 Proposal for an Ultra Low Voltage NAND Gate to Withstand Power Analysis Attacks
Authors: Omid Mirmotahari, Yngvar Berg
Abstract:
In this paper, we promote the Ultra Low Voltage (ULV) NAND gate as a replacement, either partly or entirely, for the encryption block of a design so that it can withstand power analysis attacks.
Keywords: Differential Power Analysis (DPA), Low Voltage (LV), Ultra Low Voltage (ULV), Floating-Gate (FG), supply current analysis.
9466 Heuristic Methods for the Capacitated Location-Allocation Problem with Stochastic Demand
Authors: Salinee Thumronglaohapun
Abstract:
The proper number and appropriate locations of service centers can save cost, raise revenue and increase customer satisfaction. Establishing service centers is costly, and the centers are difficult to relocate. Over a long-term planning horizon, several factors may affect the service; one of the most critical is uncertain customer demand. The opened service centers need to be capable of serving customers and making a profit even though demand changes from period to period. In this work, the capacitated location-allocation problem with stochastic demand is considered. A mathematical model is formulated to determine suitable locations of service centers and the allocation of customers to them so as to maximize total profit over multiple planning periods. Two heuristic methods, a local search and a genetic algorithm, are used to solve this problem. For the local search, five different probabilities of choosing each type of move are applied. For the genetic algorithm, three different replacement strategies are considered. The results of applying each method to numerical examples are compared. Both methods reach the same best-found solution in most examples, but the genetic algorithm provides better solutions in some cases.
Keywords: Location-allocation problem, stochastic demand, local search, genetic algorithm.
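As a rough illustration of the local-search heuristic described above, the Python sketch below toggles one candidate site open or closed per iteration and greedily re-allocates demand. The data structures and the single move type are simplifying assumptions made here; they do not reproduce the authors' five-move formulation or their stochastic-demand model.

```python
import random

def greedy_profit(open_sites, demand, capacity, revenue, open_cost):
    """Greedily allocate customer demand to open sites and return total profit."""
    remaining = {s: capacity[s] for s in open_sites}
    profit = -sum(open_cost[s] for s in open_sites)
    for c, d in demand.items():
        # Serve each customer from the open site with the largest unit revenue first.
        for s in sorted(open_sites, key=lambda s: revenue[c][s], reverse=True):
            served = min(d, remaining[s])
            profit += served * revenue[c][s]
            remaining[s] -= served
            d -= served
            if d == 0:
                break
    return profit

def local_search(sites, demand, capacity, revenue, open_cost, iters=1000, seed=0):
    """Toggle one site open/closed per iteration; keep the move if profit improves."""
    rng = random.Random(seed)
    current = set(rng.sample(sites, k=max(1, len(sites) // 2)))
    best = greedy_profit(current, demand, capacity, revenue, open_cost)
    for _ in range(iters):
        s = rng.choice(sites)
        candidate = current ^ {s}          # open it if closed, close it if open
        if not candidate:
            continue
        value = greedy_profit(candidate, demand, capacity, revenue, open_cost)
        if value > best:
            current, best = candidate, value
    return current, best

# Tiny illustrative instance (2 candidate sites, 3 customers).
sites = ["s1", "s2"]
capacity = {"s1": 10, "s2": 8}
open_cost = {"s1": 5.0, "s2": 4.0}
demand = {"c1": 6, "c2": 5, "c3": 4}
revenue = {c: {"s1": 2.0, "s2": 1.5} for c in demand}
print(local_search(sites, demand, capacity, revenue, open_cost))
```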
9465 On-line Testing of Software Components for Diagnosis of Embedded Systems
Authors: Thi-Quynh Bui, Oum-El-Kheir Aktouf
Abstract:
This paper studies the dependability of component-based applications, especially embedded ones, from the diagnosis point of view. The principle of the diagnosis technique is to implement inter-component tests in order to detect and locate faulty components without redundancy. The proposed approach for diagnosing faulty components consists of two main aspects. The first concerns the execution of the inter-component tests, which requires integrating test functionality within a component; this is the subject of this paper. The second is the diagnosis process itself, which consists of analysing inter-component test results to determine the fault state of the whole system. The advantages of this diagnosis method compared with classical redundancy-based fault-tolerant techniques are application autonomy, cost-effectiveness and better usage of system resources. These advantages are very important for many systems, and especially for embedded ones.
Keywords: Dependability, diagnosis, middleware, embedded systems, fault tolerance, inter-component testing.
9464 General Process Control for Intelligent Systems
Authors: Radovan Holubek, Matus Vlasek, Peter Kostal
Abstract:
The development of the intelligent assembly cell concept includes a new kind of solution for creating the structures of automated and flexible assembly systems. The current trend of increasing final product quality is supported by time analysis of the entire manufacturing process. The primary requirement of manufacturing is to produce as many products as possible, as quickly as possible and at the lowest possible cost, but of course with the highest quality. Such requirements can be satisfied only if all the elements entering and affecting the production cycle are in a fully functional condition. These elements consist of sensory equipment and intelligent control elements that are essential for building intelligent manufacturing systems. The intelligent behavior of the system as a control system relies on monitoring the important parameters of the system in real time. The intelligent manufacturing system itself should be able to respond flexibly to changes at the inputs and outputs of the process, in interaction with its surroundings.
Keywords: Control system, intelligent manufacturing/assembly systems, manufacturing, monitoring process.
9463 Antibiotic Prescribing in Acute Care in Iraq
Authors: Ola A. Nassr, Ali M. Abd Alridha, Rua A. Naser, Rasha S. Abbas
Abstract:
Background: Excessive and inappropriate use of antimicrobial agents among hospitalized patients remains an important patient safety and public health issue worldwide. Not only does this behavior incur unnecessary cost, but it is also associated with increased morbidity and mortality. The objective of this study is to obtain an insight into the prescribing patterns of antibiotics in surgical and medical wards, to help identify scope for improvement in service delivery. Method: A simple point prevalence survey included a convenience sample of 200 patients admitted to medical and surgical wards in a government teaching hospital in Baghdad between October 2017 and April 2018. Data were collected by a trained pharmacy intern using a standardized form. Patients' demographics and details of the prescribed antibiotics, including dose, frequency of dosing and route of administration, were reported. Patients were included if they had been admitted at least 24 hours before the survey. Patients under 18 years of age, having a diagnosis of cancer or shock, or being admitted to the intensive care unit, were excluded. Data were checked and entered by the authors into Excel and were subjected to frequency analysis, which was carried out on anonymized data to protect patient confidentiality. Results: Overall, 88.5% of patients (n=177) received 293 antibiotics during their hospital admission, with a small variation between wards (80%-97%). The average number of antibiotics prescribed per patient was 1.65, ranging from 1.3 for medical patients to 1.95 for surgical patients. Parenteral third-generation cephalosporins were the most commonly prescribed, at a rate of 54.3% (n=159), followed by nitroimidazoles 29.4% (n=86), quinolones 7.5% (n=22) and macrolides 4.4% (n=13), while carbapenems and aminoglycosides were the least prescribed, together accounting for only 4.4% (n=13). The intravenous route was the most common route of administration, used for 96.6% of patients (n=171). Indications were reported in only 63.8% of cases. Culture to identify pathogenic organisms was employed in only 0.5% of cases. Conclusion: Broad-spectrum antibiotics are prescribed at an alarming rate. This practice may promote antibiotic resistance and adversely affect patient outcomes. Implementation of an antibiotic stewardship program is warranted to enhance the efficacy, safety and cost-effectiveness of antimicrobial agents.
Keywords: Acute care, antibiotic misuse, Iraq, prescribing.
9462 A Survey of the Applications of Sentiment Analysis
Authors: Pingping Lin, Xudong Luo
Abstract:
Natural language often conveys the emotions of speakers. Therefore, sentiment analysis of what people say is prevalent in the field of natural language processing and has great application value in many practical problems. To help people understand this application value, in this paper we survey various applications of sentiment analysis, including those in online and offline business as well as other types of applications. In particular, we give some application examples in intelligent customer service systems in China. Besides, we compare the applications of sentiment analysis on Twitter, Weibo, Taobao and Facebook, and discuss some challenges. Finally, we point out the challenges faced in the applications of sentiment analysis and the work that is worth studying in the future.
Keywords: Natural language processing, sentiment analysis, application, online comments.
9461 Fine-Grained Sentiment Analysis: Recent Progress
Authors: Jie Liu, Xudong Luo, Pingping Lin, Yifan Fan
Abstract:
Facebook, Twitter, Weibo, and other social media, as well as significant e-commerce sites, generate a massive amount of online text, which can be used to analyse people's opinions or sentiments for better decision-making. So, sentiment analysis, especially fine-grained sentiment analysis, is a very active research topic. In this paper, we survey various methods for fine-grained sentiment analysis, including traditional sentiment lexicon-based methods, machine learning-based methods, and deep learning-based methods in aspect/target/attribute-based sentiment analysis tasks. Besides, we discuss their advantages and the problems worthy of careful study in the future.
Keywords: Sentiment analysis, fine-grained, machine learning, deep learning.
9460 Design of Multiple Clouds Based Global Performance Evaluation Service Broker System
Authors: Dong-Jae Kang, Nam-Woo Kim, Duk-Joo Son, Sung-In Jung
Abstract:
With the dramatic growth of internet services, easy and prompt service deployment has become important for internet service providers to successfully maintain time-to-market. Before global service deployment, they have to pay a large cost for service evaluation in order to decide on the proper system location, system scale, service delay and so on. However, intra-lab evaluation tends to show big gaps between the measured data and the realistic situation, because it is very difficult to accurately predict the local service environment, network congestion, service delay, network bandwidth and other factors. Therefore, to resolve or ease these problems, we propose a multiple-cloud-based GPES Broker system and a use case that helps internet service providers to alleviate the above problems in the beta release phase and to make a prompt decision about their service launch. By supporting more realistic and reliable evaluation information, the proposed GPES Broker system saves the service release cost and enables internet service providers to make a prompt decision about launching their service in various remote regions.
Keywords: GPES Broker system, cloud service broker, multiple cloud, global performance evaluation service (GPES), service provisioning.
9459 Dynamic Admission Control Based on Effective Demand for Next Generation Wireless Networks
Authors: Somenath Mukherjee, Rajdeep Ray, Raj Kumar Samanta, Mofazzal H. Khondekar, Gautam Sanyal
Abstract:
In next generation wireless networks (i.e., 4G and beyond), one of the main objectives is to ensure the highest level of customer satisfaction in terms of data transfer speed, reduced cost and delay, non-rejection and non-dropping of calls, availability of ‘always-on’ connectivity and services, continuity of connected services, and hassle-free roaming, in addition to the convenience of using network services from anywhere and at any time. To meet these requirements effectively, internet service providers (ISPs) and network planners have to undertake major capacity enhancement of network resources, and at the same time these resources have to be used effectively and efficiently to reduce cost and increase revenue. In this work, the effective bandwidth available in a Mobile Switching Center (MSC) of a wireless network providing multi-class multimedia services is analyzed. The bandwidth requirement of the users for a customized Quality of Service (QoS) is estimated. The findings of the QoS estimation are applied to the capacity planning and admission control of the multi-class traffic flows coming into the MSC.
Keywords: Next generation wireless network, mobile switching center, multi-class traffic, quality of service, admission control, effective bandwidth.
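The effective-bandwidth notion the abstract builds on is commonly formalised as below (the standard definition; the authors' multi-class MSC model may use a more specific form). A flow's effective bandwidth lies between its mean and peak rates and is additive across flows, which is what makes it usable for admission control:

```latex
% Effective bandwidth of a source with cumulative arrivals X[0,t]
% (standard definition; s and t are space and time parameters set by the QoS
% target and the multiplexing context). Admission rule sketched on the right.
\alpha(s,t) = \frac{1}{st}\,\log \mathbb{E}\!\left[e^{\,s X[0,t]}\right],
\qquad
\text{admit a new flow only if } \sum_{i} \alpha_i(s,t) \le C_{\text{link}}.
```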
9458 Neuromarketing: Discovering the Somatic Marker in the Consumer's Brain
Authors: Mikel Alonso López, María Francisca Blasco López, Víctor Molero Ayala
Abstract:
The present study explains Antonio Damasio's somatic marker theory, which indicates that when making a decision, images of stored or possible future scenarios (future memory) allow people to feel for a moment what would happen if they made a particular choice, and that this experience is emotionally marked. This process can be conscious or unconscious. The development of new neuromarketing techniques such as functional magnetic resonance imaging (fMRI) brings a greater understanding of how the brain functions and of consumer behavior. The evidence observed in different fMRI studies suggests that the somatic marker and future memories influence the decision-making process, adding a positive or negative emotional component to the options. This would mean that all decisions involve a present emotional component, with a rational cost-benefit analysis that can be performed later.
Keywords: Emotions, decision making, somatic marker, consumer's brain.
9457 A Modified Run Length Coding Technique for Test Data Compression Based on Multi-Level Selective Huffman Coding
Authors: C. Kalamani, K. Paramasivam
Abstract:
Test data compression is an efficient method for reducing the test application cost. The problem of reducing test data has been addressed by researchers from three different angles: test data compression, Built-In Self-Test (BIST) and test set compaction. The latter two methods can enhance fault coverage at the cost of hardware overhead. The drawback of the conventional methods is that they can reduce test storage and test power, but when the test data contain redundant lengths of runs, no additional compression is applied. This paper presents a modified Run Length Coding (RLC) technique combined with a Multi-Level Selective Huffman Coding (MLSHC) technique to reduce test data volume, test pattern delivery time and power dissipation in scan test applications; where a redundant length of runs is encountered, the preceding run symbol is replaced with a tiny codeword. Experimental results show that the presented method not only improves the test data compression but also reduces the overall test data volume compared to recent schemes. Experiments on the six largest ISCAS-98 benchmarks show that our method outperforms most known techniques.
Keywords: Modified run length coding, multi-level selective Huffman coding, built-in self-test, modified selective Huffman coding, automatic test equipment.
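A toy sketch of the two building blocks the abstract combines, extracting runs from a test vector and giving frequent run symbols short Huffman codewords, is shown below in Python. It is illustrative only and does not reproduce the authors' modified RLC or the multi-level selective coding.

```python
import heapq
from collections import Counter

def run_lengths(bits):
    """Split a binary test vector into runs, e.g. '0001100' -> [('0',3),('1',2),('0',2)]."""
    runs, i = [], 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((bits[i], j - i))
        i = j
    return runs

def huffman_codes(symbols):
    """Build a prefix-free Huffman code: frequent run symbols get short codewords."""
    heap = [[freq, [sym, ""]] for sym, freq in Counter(symbols).items()]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate case: one distinct symbol
        return {heap[0][1][0]: "0"}
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return {sym: code for sym, code in heap[0][1:]}

# Example: encode the runs of one scan-test slice.
vector = "000000110000111100000000"
runs = run_lengths(vector)
codes = huffman_codes(runs)
encoded = "".join(codes[r] for r in runs)
print(len(vector), "bits ->", len(encoded), "bits")
```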
9456 Enhanced Automated Teller Machine Using Short Message Service Authentication Verification
Authors: Rasheed Gbenga Jimoh, Akinbowale Nathaniel Babatunde
Abstract:
The Automated Teller Machine (ATM) has become an important tool among commercial banks, and bank customers have come to depend on and trust the ATM to conveniently meet their banking needs. Although the overwhelming advantages of the ATM cannot be over-emphasized, its alarming fraud rate has become a bottleneck to its full adoption in Nigeria. This study examined the menace of ATM fraud in society and the cost to banks of running ATM services in the country. The researchers developed a prototype of an enhanced Automated Teller Machine authentication using Short Message Service (SMS) verification. The developed prototype was tested by ten (10) respondents who are users of ATM cards in the country, and the collected data were analyzed using the Statistical Package for Social Science (SPSS). Based on the results of the analysis, it is envisaged that the developed prototype will go a long way in reducing the alarming rate of ATM fraud in Nigeria.
Keywords: ATM, ATM Fraud, E-banking, Prototyping.
9455 A Blockchain-Based Privacy-Preserving Physical Delivery System
Authors: Shahin Zanbaghi, Saeed Samet
Abstract:
The internet has transformed the way we shop. Previously, most of our purchases came in the form of shopping trips to a nearby store; now, it is as easy as clicking a mouse, but we have to be constantly vigilant about our personal information. In this work, our proposed approach is to encrypt the information printed on physical packages, which includes personal information otherwise shown in plain text, using a symmetric encryption algorithm; we then store that encrypted information in a blockchain network rather than in companies' centralized databases. We present, implement and assess a blockchain-based system using Ethereum smart contracts. We present detailed algorithms that explain the details of our smart contract. We present the security, cost and performance analysis of the proposed method. Our work indicates that the proposed solution is economically attainable and provides data integrity, security, transparency and data traceability.
Keywords: Blockchain, Ethereum, smart contract, commit-reveal scheme.
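The on-chain part of the system relies on Ethereum smart contracts, which are not reproduced here. The Python sketch below only illustrates the off-chain side of such an approach under stated assumptions: the delivery label is symmetrically encrypted, and a hash commitment over the ciphertext (as in a commit-reveal scheme) is what would be stored on the blockchain. The label fields and the choice of the cryptography library's Fernet cipher are assumptions for illustration, not the authors' exact design.

```python
import hashlib
import json
import secrets
from cryptography.fernet import Fernet  # pip install cryptography

# Hypothetical delivery label; field names are illustrative only.
label = {"name": "A. Buyer", "address": "12 Example St.", "phone": "+1-555-0100"}

# 1. Symmetric encryption: only this ciphertext would be printed on the package.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(json.dumps(label).encode())

# 2. Commit phase: publish H(ciphertext || nonce) on-chain instead of the data itself.
nonce = secrets.token_bytes(16)
commitment = hashlib.sha256(ciphertext + nonce).hexdigest()
print("commitment stored on-chain:", commitment)

# 3. Reveal phase: presenting (ciphertext, nonce) later lets anyone re-hash and
#    compare against the stored commitment, giving integrity and traceability.
assert hashlib.sha256(ciphertext + nonce).hexdigest() == commitment
```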
9454 Factors of Successful Wooden Furniture Design Process
Authors: S. Choodoung, U. Smutkupt
Abstract:
This study systemizes the processes and methods of wooden furniture design that combine uniqueness in function and aesthetics. The study was carried out through research and analysis of the factors a designer must consider that affect function and production. The results indicate that these factors are the design process (planning for design, product specifications, concept design, product architecture, industrial design, production), design evaluation, and the factors on which wooden furniture design depends, i.e. art (art style, furniture history, form), functionality (strength and durability, placement area, use), material (appropriateness to function, mechanical properties of wood), joints, cost, safety, and social responsibility. All the aforementioned factors affect good design. Drawing on direct experience gained through users' usage, the designer must design wooden furniture systematically and effectively. As a result, this study selected a dining armchair as a case study involving all the factors and the whole design process stated in this study.
Keywords: Furniture design, function design, aesthetic, wooden furniture.
9453 Time Effective Structural Frequency Response Testing with Oblique Impact
Authors: Khoo Shin Yee, Lian Yee Cheng, Ong Zhi Chao, Zubaidah Ismail, Siamak Noroozi
Abstract:
Structural frequency response testing is accurate in identifying the dynamic characteristics of a machinery structure. From a practical perspective, conventional structural frequency response testing such as experimental modal analysis with the impulse technique (also known as “impulse testing”) has a limitation, namely its long acquisition time. The high acquisition time is mainly due to the redundant procedure in which the engineer has to repeatedly perform the test in three directions, namely the axial, horizontal and vertical axes, in order to comprehensively define the dynamic behavior of a 3D structure. This is unfavorable for numerous industries where the downtime cost is high. This study proposes to reduce the testing time by using an oblique impact. Theoretically, a single oblique impact can induce significant vibration responses and vibration modes in all three directions. Hence, the acquisition time with the oblique impulse technique can be reduced by a factor of three (i.e. for a 3D dynamic system). This study initiates an experimental investigation of impulse testing with oblique excitation. A motor-driven test rig was used for the testing, and its dynamic characteristics were identified using impulse testing with the conventional normal impact and the proposed oblique impact, respectively. The results show that the proposed oblique impulse testing is able to obtain all the desired natural frequencies in all three directions, thus providing a feasible solution for a fast and time-effective way of conducting impulse testing.
Keywords: Frequency response function, impact testing, modal analysis, oblique angle, oblique impact.
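For readers unfamiliar with the measurement chain, the frequency response functions behind such impulse tests are commonly estimated with the H1 estimator. The scipy-based sketch below uses placeholder signals rather than the authors' test-rig data and simply shows how one force record can be paired with responses from several axes.

```python
import numpy as np
from scipy.signal import csd, welch

fs = 2048                                   # sampling rate [Hz]
t = np.arange(0, 4, 1 / fs)
force = np.random.randn(t.size)             # placeholder impact-force signal
response = np.random.randn(t.size)          # placeholder acceleration in one axis

# H1 FRF estimate: cross-spectrum of force and response over the force auto-spectrum.
f, S_fx = csd(force, response, fs=fs, nperseg=1024)
_, S_ff = welch(force, fs=fs, nperseg=1024)
H1 = S_fx / S_ff

# Natural frequencies appear as peaks of |H1|; with an oblique impact, the same
# force record is paired with the x-, y- and z-axis responses, so one hit yields
# three FRFs instead of three separate tests.
peak_freq = f[np.argmax(np.abs(H1))]
```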
9452 A Survey of Sentiment Analysis Based on Deep Learning
Authors: Pingping Lin, Xudong Luo, Yifan Fan
Abstract:
Sentiment analysis is a very active research topic. Every day, Facebook, Twitter, Weibo, and other social media, as well as significant e-commerce websites, generate a massive amount of comments, which can be used to analyse people's opinions or emotions. The existing methods for sentiment analysis are based mainly on sentiment dictionaries, machine learning, and deep learning. The first two kinds of methods rely heavily on sentiment dictionaries or large amounts of labelled data; the third overcomes these two problems, so in this paper we focus on it. Specifically, we survey various sentiment analysis methods based on convolutional neural networks, recurrent neural networks, long short-term memory, deep neural networks, deep belief networks, and memory networks. We compare their features, advantages, and disadvantages. Also, we point out the main problems of these methods, which may be worthy of careful study in the future. Finally, we examine the application of deep learning in multimodal sentiment analysis and aspect-level sentiment analysis.
Keywords: Natural language processing, sentiment analysis, document analysis, multimodal sentiment analysis, deep learning.
9451 Forecasting Optimal Production Program Using Profitability Optimization by Genetic Algorithm and Neural Network
Authors: Galal H. Senussi, Muamar Benisa, Sanja Vasin
Abstract:
In the business field today, one of the most important issues for any enterprise is cost minimization and profit maximization. A second issue is how to develop a strong and capable model that can deliver the desired forecasts of these two quantities. Many studies deal with these issues using different methods. In this study, we developed a model for multi-criteria production program optimization, integrated with an Artificial Neural Network.
Predicting the production cost and profit per unit of a product, dealing with two opposing objectives at the same time, can be extremely difficult, especially if there is a great amount of conflicting information about production parameters.
Feed-forward neural networks are suitable for generalization, which means that the network will generate a proper output in response to input it has never seen. Therefore, with a small set of examples the network will adjust its weight coefficients so that the input generates a proper output.
This essential characteristic is one of the most important abilities enabling this network to be used in a variety of problems, spreading from engineering to finance, etc.
As our results show, feed-forward neural networks have a strong ability and capability to map inputs into desired outputs.
Keywords: Project profitability, multi-objective optimization, genetic algorithm, Pareto set, Neural Networks.
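A minimal sketch of the kind of feed-forward regression step described above is given below, using synthetic production parameters and scikit-learn's MLPRegressor as stand-ins for the authors' data and trained network; the coefficients and noise levels are arbitrary illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical production parameters (e.g. material, labour, machine hours) ...
X = rng.uniform(0, 1, size=(500, 4))
# ... mapped to cost and profit per unit by an unknown (here synthetic) relation.
y = np.column_stack([
    2.0 + X @ [1.5, 0.8, 2.2, 0.3] + 0.05 * rng.standard_normal(500),   # unit cost
    5.0 - X @ [1.0, 0.4, 1.8, 0.2] + 0.05 * rng.standard_normal(500),   # unit profit
])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
net.fit(X_tr, y_tr)
print("generalisation R^2 on unseen parameter sets:", round(net.score(X_te, y_te), 3))
```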
9450 Performance Assessment of Wet-Compression Gas Turbine Cycle with Turbine Blade Cooling
Authors: Kyoung Hoon Kim
Abstract:
Turbine blade cooling is considered the most effective way of maintaining a high operating temperature with the available materials, and gas turbine systems with wet compression have potential for future power generation because of their high efficiency and high specific power at relatively low cost. In this paper, a performance analysis of the wet-compression gas turbine cycle with turbine blade cooling is carried out. The wet compression process is analytically modeled based on non-equilibrium droplet evaporation. Special attention is paid to the effects of pressure ratio and water injection ratio on the important system variables such as the ratio of coolant fluid flow, fuel consumption, thermal efficiency and specific power. Parametric studies show that wet compression leads to an insignificant improvement in thermal efficiency but a significant enhancement of specific power in gas turbine systems with turbine blade cooling.
Keywords: Water injection, wet compression, gas turbine, turbine blade cooling.
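For reference, the two reported performance measures are the usual cycle-level definitions below (generic gas-turbine relations; the paper's full model additionally accounts for droplet evaporation and coolant extraction):

```latex
% Specific power (net work per unit mass of compressed air) and thermal efficiency;
% standard definitions, not reproduced from the paper's detailed model.
w_{\mathrm{net}} = w_{\mathrm{turbine}} - w_{\mathrm{compressor}},
\qquad
\eta_{\mathrm{th}} = \frac{w_{\mathrm{net}}}{q_{\mathrm{in}}},
\qquad
q_{\mathrm{in}} = \frac{\dot{m}_{f}\,\mathrm{LHV}}{\dot{m}_{a}}
```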
9449 Aircraft Selection Process Using Reference Linear Combination in Multiple Criteria Decision Making Analysis
Authors: C. Ardil
Abstract:
This paper introduces a new method for multiple-criteria decision making (MCDM) that avoids rank reversal and ensures consistency in decision-making. The proposed method involves range targeting of the benefit and cost criteria vectors for range normalization of the initial decision matrix. The Reference Linear Combination (RLC) is used to avoid the rank reversal problem. The preference order generated from the target score matrix does not require relative comparisons between alternatives but relies on a chosen reference solution point, after the original decision matrix has been transformed into an MCDM problem by specifying the minimum and maximum bounds of each criterion. The efficiency and applicability of the proposed RLC method are demonstrated in the selection of commercial passenger aircraft.
Keywords: Aircraft selection, reference linear combination (RLC), multiple criteria decision-making, MCDM.
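The range normalization mentioned in the abstract is ordinarily the min-max transformation below, applied per criterion with benefit and cost criteria handled in opposite directions (a standard step; the RLC scoring itself is not reproduced here):

```latex
% Range (min-max) normalisation of the decision matrix, where x_{ij} is the
% score of alternative i on criterion j with per-criterion bounds x_j^{min}, x_j^{max}.
r_{ij} =
\begin{cases}
\dfrac{x_{ij} - x_j^{\min}}{x_j^{\max} - x_j^{\min}}, & j \in \text{benefit criteria},\\[2ex]
\dfrac{x_j^{\max} - x_{ij}}{x_j^{\max} - x_j^{\min}}, & j \in \text{cost criteria},
\end{cases}
\qquad r_{ij} \in [0,1].
```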
9448 Observations about the Principal Components Analysis and Data Clustering Techniques in the Study of Medical Data
Authors: Cristina G. Dascâlu, Corina Dima Cozma, Elena Carmen Cotrutz
Abstract:
Statistical analysis of medical data often requires the use of special techniques because of the particularities of these data. Principal components analysis and data clustering are two statistical data mining methods that are very useful in the medical field: the first as a method to decrease the number of studied parameters, and the second as a method to analyze the connections between diagnosis and the data about the patient's condition. In this paper we investigate the implications of a specific data analysis technique: data clustering preceded by a selection of the most relevant parameters, made using principal components analysis. Our assumption was that using principal components analysis before data clustering, in order to select and classify only the most relevant parameters, would improve the clustering accuracy, but the practical results showed the opposite: the clustering accuracy decreases by a percentage approximately equal to the percentage of information loss reported by the principal components analysis.
Keywords: Data clustering, medical data, principal components analysis.
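A short scikit-learn sketch of this comparison set-up is given below; it uses a public dataset and an arbitrary component count in place of the (confidential) medical data, and simply shows how clustering agreement with the diagnosis can be measured with and without a PCA step, as the study does.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

# Public clinical-style dataset as a stand-in for the study's medical data.
X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)

def cluster_agreement(features):
    """K-means clustering, scored by agreement with the known diagnosis labels."""
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
    return adjusted_rand_score(y, labels)

full = cluster_agreement(X)
reduced = cluster_agreement(PCA(n_components=5).fit_transform(X))
print(f"agreement with diagnosis: all parameters {full:.3f}, after PCA {reduced:.3f}")
```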
9447 Modeling and Simulation of Axial Fan Using CFD
Authors: Hemant Kumawat
Abstract:
Axial flow fans, while incapable of developing high pressures, are well suited to handling large volumes of air at relatively low pressures. In general, they are low in cost, possess good efficiency, and can have blades of airfoil shape. Axial flow fans show good efficiencies and can operate at high static pressures if such operation is necessary. Our objective is to model and analyze the flow through axial fans using CFD software and to draw inferences from the obtained results so as to achieve maximum efficiency. The performance of an axial fan was simulated using CFD, and the effect of the variation of different parameters such as blade number, noise level, velocity, temperature and pressure distribution on the blade surface was studied. This paper presents a final 3D CAD model of an axial flow fan. Adapting this model to the components available on the market, a first optimization was done. After this step, the CFX flow solver was used to carry out the necessary numerical analyses of the aerodynamic performance of this model. This analysis results in a final optimization of the proposed 3D model, which is presented in this article.
Keywords: ANSYS CFX, Axial Fan, Computational Fluid Dynamics (CFD), Optimization.
9446 Thermal Evaluation of Printed Circuit Board Design Options and Voids in Solder Interface by a Simulation Tool
Authors: B. Arzhanov, A. Correia, P. Delgado, J. Meireles
Abstract:
Quad Flat No-Lead (QFN) packages have become very popular for tuners, converters and audio amplifiers, among other applications, which need efficient power dissipation in small footprints. Since the semiconductor junction temperature (TJ) is a critical parameter for product quality, and to ensure that the die temperature does not exceed the maximum allowable TJ, a thermal analysis conducted in an early development phase is essential to avoid repeated re-design cycles with huge losses in cost and time. A simulation tool capable of estimating the die temperature of components with QFN packages was developed. It allows establishing a non-empirical way of defining an acceptance criterion for the amount of voids in the solder interface between the exposed pad and the Printed Circuit Board (PCB), to be applied during the industrialization process, and of evaluating the impact of PCB design parameters. Targeting PCB layout designers as end users of the application, a user-friendly graphical interface (GUI) was implemented, allowing the user to introduce design parameters in a convenient and secure way and hiding all the complexity of the finite element simulation process. This cost-effective tool makes the simulation process transparent and provides useful outputs within an acceptable time, which can be adopted by PCB designers, preventing potential risks during the design stage and making the product economically efficient by not oversizing it. This article gathers the relevant information related to the design and implementation of the developed tool and presents a parametric study conducted with it. The simulation tool was experimentally validated using a Thermal Test Chip (TTC) in a QFN open cavity, in order to measure the junction temperature (TJ) directly on the die under controlled and known conditions. The article gives a short overview of standard thermal solutions and their impact on exposed-pad packages (i.e. QFN), accurately describes the methods and techniques that the system designer should use to achieve optimum thermal performance, and demonstrates the effect of system-level constraints on the thermal performance of the design.
Keywords: Quad Flat No-Lead packages, exposed pads, junction temperature, thermal management and measurements.
9445 Probabilistic Crash Prediction and Prevention of Vehicle Crash
Authors: Lavanya Annadi, Fahimeh Jafari
Abstract:
Transportation brings immense benefits to society, but it also has its costs. These include the cost of infrastructure, personnel and equipment, but also the loss of life and property in traffic accidents on the road, delays in travel due to traffic congestion, and various indirect costs in terms of air transport. This research aims to predict the probability of vehicle crashes in the United States using machine learning, considering natural and structural factors while excluding spontaneous causes such as overspeeding. These factors range from meteorological elements such as weather conditions, precipitation, visibility, wind speed, wind direction, temperature, pressure, and humidity, to human-made structures, such as road components like bumps, roundabouts, no-exit roads, turning loops, give-way signs, etc. The probabilities are categorized into ten distinct classes. All the predictions are based on multiclass classification techniques, which are supervised learning methods. This study considers all crashes in all states collected by the US government. The probability of a crash was determined by employing the multinomial expected value, and a classification label was assigned accordingly. We applied three classification models: multiclass logistic regression, random forest and XGBoost. The numerical results show that XGBoost achieved a 75.2% accuracy rate, which indicates the part played by natural and structural factors in crashes. The paper also provides in-depth insights through exploratory data analysis.
Keywords: Road safety, crash prediction, exploratory analysis, machine learning.
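The multiclass set-up described above can be sketched as below; synthetic features and labels stand in for the US crash data, and scikit-learn's logistic regression and random forest are used (the paper additionally evaluates XGBoost, which follows the same fit/predict pattern).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Placeholder features standing in for weather and road-structure indicators.
X = rng.normal(size=(5000, 12))
# Placeholder labels: ten probability classes (0-9), as in the paper's setup.
y = rng.integers(0, 10, size=5000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
for name, model in [
    ("logistic regression", LogisticRegression(max_iter=1000)),
    ("random forest", RandomForestClassifier(n_estimators=200, random_state=0)),
]:
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", round(accuracy_score(y_te, model.predict(X_te)), 3))
```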
9444 High Accuracy ESPRIT-TLS Technique for Wind Turbine Fault Discrimination
Authors: Saad Chakkor, Mostafa Baghouri, Abderrahmane Hajraoui
Abstract:
The ESPRIT-TLS method appears to be a good choice for high-resolution fault detection in induction machines; it is highly effective in identifying frequencies and amplitudes. On the other hand, it presents a high computational complexity, which hinders its implementation in real-time fault diagnosis. To avoid this problem, a Fast-ESPRIT algorithm that combines an IIR band-pass filtering technique, a decimation technique and the original ESPRIT-TLS method was employed to accurately extract frequencies and their magnitudes from the wind-turbine stator current at a lower computational cost. The proposed algorithm has been applied to meet the wind turbine machine's need for online, fast and proactive condition monitoring. This type of remote and periodic maintenance provides an acceptable machine lifetime, minimizes its downtime and maximizes its productivity. The developed technique has been evaluated by computer simulations under many fault scenarios. The study results demonstrate the performance of Fast-ESPRIT, offering rapid and high-resolution harmonic recognition with minimum computation time and lower memory cost.
Keywords: Spectral Estimation, ESPRIT-TLS, Real Time, Diagnosis, Wind Turbine Faults, Band-Pass Filtering, Decimation.
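The pre-processing stage that gives Fast-ESPRIT its speed-up can be sketched with scipy as below; the signal, band edges and decimation factor are illustrative assumptions, and the ESPRIT-TLS estimation step itself is not reproduced here.

```python
import numpy as np
from scipy.signal import butter, filtfilt, decimate

fs = 10_000                                  # sampling rate of the stator current [Hz]
t = np.arange(0, 1.0, 1 / fs)
# Placeholder current: 50 Hz fundamental plus a small fault-related sideband at 62 Hz.
current = np.sin(2 * np.pi * 50 * t) + 0.02 * np.sin(2 * np.pi * 62 * t)
current += 0.01 * np.random.randn(t.size)

# 1. IIR band-pass around the band where the fault harmonics are expected.
b, a = butter(N=4, Wn=[30, 100], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, current)

# 2. Decimation: a lower effective sampling rate shrinks the correlation matrices
#    the subsequent ESPRIT-TLS step has to factor, which is the source of the
#    claimed reduction in computation time and memory.
factor = 10
reduced = decimate(filtered, factor)         # effective rate: fs / factor = 1 kHz
```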