Search results for: retention time.
5644 Time-Cost-Quality Trade-off Software by using Simplified Genetic Algorithm for Typical Repetitive Construction Projects
Authors: Refaat H. Abd El Razek, Ahmed M. Diab, Sherif M. Hafez, Remon F. Aziz
Abstract:
Time-Cost Optimization (TCO) is one of the greatest challenges in construction project planning and control, since optimizing either time or cost usually comes at the expense of the other. Because there is a hidden trade-off relationship between project time and cost, it can be difficult to predict whether the total cost will increase or decrease as a result of schedule compression. Recently, a third dimension, project quality, has been introduced into trade-off analysis. Few existing algorithms address construction projects with a three-dimensional trade-off analysis of time-cost-quality relationships. The objective of this paper is to present the development of a practical software system, named the Automatic Multi-objective Typical Construction Resource Optimization System (AMTCROS). This system incorporates the basic concepts of Line of Balance (LOB) and the Critical Path Method (CPM) in a multi-objective Genetic Algorithms (GAs) model. Its main objective is to provide practical support for planners of typical construction projects who need to optimize resource utilization in order to minimize project cost and duration while simultaneously maximizing quality. Applying these research developments to the planning of typical construction projects holds strong promise to: 1) increase the efficiency of resource use in typical construction projects; 2) reduce the construction duration; 3) minimize construction cost (direct cost plus indirect cost); and 4) improve the quality of newly constructed projects. A general description of the proposed software for the Time-Cost-Quality Trade-Off (TCQTO) is presented. The main inputs and outputs of the proposed software are outlined, its main subroutines and inference engine are detailed, and its complexity analysis is discussed. In addition, the verification and complexity of the proposed software are demonstrated and tested using a real case study.
Keywords: Project management, typical (repetitive) large scale projects, line of balance, multi-objective optimization, genetic algorithms, time-cost-quality trade-offs.
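To make the time-cost-quality trade-off concrete, here is a loose sketch of a weighted-sum genetic algorithm over per-activity execution modes, where each mode carries a (duration, cost, quality) triple. The activity data, weights, and GA settings are invented for illustration, and the sketch omits the LOB/CPM scheduling and multi-objective handling that AMTCROS itself implements.

```python
import random

# Hypothetical repetitive activities; each has alternative execution modes
# given as (duration in days, cost, quality score in [0, 1]).
ACTIVITIES = [
    [(10, 5000, 0.95), (8, 6500, 0.90), (6, 9000, 0.80)],
    [(12, 7000, 0.92), (9, 8500, 0.85)],
    [(7, 3000, 0.98), (5, 4500, 0.88), (4, 6000, 0.75)],
]
W_TIME, W_COST, W_QUALITY = 0.4, 0.4, 0.2            # illustrative weights
MAX_TIME = sum(max(m[0] for m in acts) for acts in ACTIVITIES)
MAX_COST = sum(max(m[1] for m in acts) for acts in ACTIVITIES)

def fitness(chromosome):
    """Lower is better: weighted, normalized time + cost - quality."""
    modes = [acts[g] for acts, g in zip(ACTIVITIES, chromosome)]
    time = sum(m[0] for m in modes) / MAX_TIME
    cost = sum(m[1] for m in modes) / MAX_COST
    quality = sum(m[2] for m in modes) / len(modes)
    return W_TIME * time + W_COST * cost - W_QUALITY * quality

def random_chromosome():
    return [random.randrange(len(acts)) for acts in ACTIVITIES]

def evolve(pop_size=30, generations=100, p_mut=0.1):
    random.seed(0)
    pop = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]               # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(ACTIVITIES))
            child = a[:cut] + b[cut:]                # one-point crossover
            if random.random() < p_mut:              # mutation: re-pick one mode
                i = random.randrange(len(ACTIVITIES))
                child[i] = random.randrange(len(ACTIVITIES[i]))
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print("best mode per activity:", best, "fitness:", round(fitness(best), 4))
```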
5643 Statistical Optimization of Enzymatic Hydrolysis of Potato (Solanum tuberosum) Starch by Immobilized α-amylase
Authors: N.Peatciyammal, B.Balachandar, M.Dinesh Kumar, K.Tamilarasan, C.Muthukumaran
Abstract:
Enzymatic hydrolysis of starch from natural sources finds potential application in the commercial production of alcoholic beverages and bioethanol. In this study, the effects of starch concentration, temperature, time, and enzyme concentration on the hydrolysis of potato starch powder (mesh 80/120) into glucose syrup by α-amylase immobilized in sodium alginate were studied and optimized using a central composite design. The experimental results on enzymatic hydrolysis of potato starch were subjected to multiple linear regression analysis using MINITAB 14 software. Positive linear effects of starch concentration, enzyme concentration, and time on the hydrolysis of potato starch by α-amylase were observed. The statistical significance of the model was validated by the F-test for analysis of variance (p ≤ 0.01). The optimum values of starch concentration, enzyme concentration, temperature, and time were found to be 6% (w/v), 2% (w/v), 40°C, and 80 min, respectively. The maximum glucose yield at the optimum condition was 2.34 mg/mL.
Keywords: Alcoholic beverage, Central Composite Design, Enzymatic hydrolysis, Glucose yield, Potato Starch.
5642 Reduction of Linear Time-Invariant Systems Using Routh-Approximation and PSO
Authors: S. Panda, S. K. Tomar, R. Prasad, C. Ardil
Abstract:
Order reduction of linear time-invariant systems using two methods, one exploiting the advantages of Routh approximation and the other an evolutionary technique, is presented in this paper. In the Routh approximation method, the denominator of the reduced-order model is obtained using Routh approximation, while the numerator is determined using the indirect approach of retaining the time moments and/or Markov parameters of the original system. With this method, the reduced-order model is guaranteed to be stable if the original high-order model is stable. In the second method, Particle Swarm Optimization (PSO) is employed to reduce the higher-order model. The PSO method is based on minimizing the Integral Squared Error (ISE) between the transient responses of the original higher-order model and the reduced-order model for a unit step input. Both methods are illustrated through numerical examples.
Keywords: Model Order Reduction, Markov Parameters, Routh Approximation, Particle Swarm Optimization, Integral Squared Error, Steady State Stability.
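A minimal sketch of the PSO-based reduction idea described above, assuming a fourth-order original transfer function and a second-order reduced model; the original system, swarm settings, and parameter bounds are illustrative placeholders, not the paper's test cases.

```python
import numpy as np
from scipy import signal

# Illustrative stable 4th-order original system (placeholder, not the paper's example)
orig = signal.TransferFunction([1, 10], [1, 8, 20, 16, 4])
t = np.linspace(0, 20, 2000)
_, y_orig = signal.step(orig, T=t)
dt = t[1] - t[0]

def ise(params):
    """Integral Squared Error between step responses of the original system and
    a 2nd-order reduced model with num=[b1, b0], den=[1, a1, a0]."""
    b1, b0, a1, a0 = params
    if a1 <= 0 or a0 <= 0:                       # keep the reduced model stable
        return np.inf
    _, y_red = signal.step(signal.TransferFunction([b1, b0], [1, a1, a0]), T=t)
    return float(np.sum((y_orig - y_red) ** 2) * dt)

# Plain global-best PSO loop
rng = np.random.default_rng(0)
n_particles, n_iter, dim = 30, 200, 4
x = rng.uniform(0.01, 5.0, (n_particles, dim))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([ise(p) for p in x])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
    f = np.array([ise(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("reduced num:", gbest[:2], "den:", [1, *gbest[2:]], "ISE:", ise(gbest))
```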
5641 A Continuous Real-Time Analytic for Predicting Instability in Acute Care Rapid Response Team Activations
Authors: Ashwin Belle, Bryce Benson, Mark Salamango, Fadi Islim, Rodney Daniels, Kevin Ward
Abstract:
A reliable, real-time, and non-invasive system that can identify patients at risk for hemodynamic instability is needed to aid clinicians in their efforts to anticipate patient deterioration and initiate early interventions. The purpose of this pilot study was to explore the clinical capabilities of a real-time analytic from a single lead of an electrocardiograph to correctly distinguish between rapid response team (RRT) activations due to hemodynamic (H-RRT) and non-hemodynamic (NH-RRT) causes, as well as predict H-RRT cases with actionable lead times. The study consisted of a single center, retrospective cohort of 21 patients with RRT activations from step-down and telemetry units. Through electronic health record review and blinded to the analytic’s output, each patient was categorized by clinicians into H-RRT and NH-RRT cases. The analytic output and the categorization were compared. The prediction lead time prior to the RRT call was calculated. The analytic correctly distinguished between H-RRT and NH-RRT cases with 100% accuracy, demonstrating 100% positive and negative predictive values, and 100% sensitivity and specificity. In H-RRT cases, the analytic detected hemodynamic deterioration with a median lead time of 9.5 hours prior to the RRT call (range 14 minutes to 52 hours). The study demonstrates that an electrocardiogram (ECG) based analytic has the potential for providing clinical decision and monitoring support for caregivers to identify at risk patients within a clinically relevant timeframe allowing for increased vigilance and early interventional support to reduce the chances of continued patient deterioration.
Keywords: Critical care, early warning systems, emergency medicine, heart rate variability, hemodynamic instability, rapid response team.
5640 Effectiveness of Cellular Phone with Active RFID Tag for Evacuation - The Case of Evacuation from the Underground Shopping Mall of Tenjin
Authors: Masatora Daito, Noriyuki Tanida
Abstract:
Underground shopping malls pose particular problems for fire evacuation, and people inside them often lose their sense of direction and of the current time. If an emergency such as a terrorist or gas explosion occurs, they must get out quickly; under such circumstances, remaining inside the mall carries a high risk to life. In this research, the authors propose a way for people to leave an underground shopping mall quickly: if narrow exits can be discovered using active RFID (Radio Frequency Identification) tags and cellular phones, evacuation can proceed as fast as possible. To verify this hypothesis, the authors design a model and carry out an agent-based simulation, taking as a case study the Tenjin mall in Fukuoka Prefecture, Japan. The simulation result is that pedestrians using active RFID tags and cellular phones spent less time on the evacuation. Even when the diffusion of RFID tags and cellular phones was not perfect, the simulations showed their effectiveness in reducing evacuation time.
Keywords: Evacuation, active RFID tag and cellular phone, underground shopping mall, agent-based simulation.
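A minimal agent-based sketch of the kind of comparison described above, assuming a simple grid layout: "guided" agents (stand-ins for pedestrians whose RFID-equipped phones point them to an exit) head straight for the nearest exit, while unguided agents wander randomly. The layout, agent counts, and movement rules are illustrative assumptions, not the authors' Tenjin model.

```python
import random

WIDTH, HEIGHT = 40, 20
EXITS = [(0, 10), (39, 5)]              # hypothetical exit locations

def nearest_exit(pos):
    return min(EXITS, key=lambda e: abs(e[0] - pos[0]) + abs(e[1] - pos[1]))

def step_guided(pos):
    ex, ey = nearest_exit(pos)
    x, y = pos
    x += (ex > x) - (ex < x)            # move one cell toward the chosen exit
    y += (ey > y) - (ey < y)
    return (x, y)

def step_random(pos):
    x, y = pos
    dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    return (min(max(x + dx, 0), WIDTH - 1), min(max(y + dy, 0), HEIGHT - 1))

def evacuate(step_fn, n_agents=200, seed=1):
    """Return the mean number of steps the evacuated agents needed."""
    random.seed(seed)
    agents = [(random.randrange(WIDTH), random.randrange(HEIGHT)) for _ in range(n_agents)]
    times, t = [], 0
    while agents and t < 10000:
        t += 1
        remaining = []
        for pos in agents:
            pos = step_fn(pos)
            if pos in EXITS:
                times.append(t)
            else:
                remaining.append(pos)
        agents = remaining
    return sum(times) / len(times)

print("mean evacuation time, guided :", evacuate(step_guided))
print("mean evacuation time, random :", evacuate(step_random))
```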
5639 Automatic Thresholding for Data Gap Detection for a Set of Sensors in Instrumented Buildings
Authors: Houda Najeh, Stéphane Ploix, Mahendra Pratap Singh, Karim Chabir, Mohamed Naceur Abdelkrim
Abstract:
Building systems are highly vulnerable to different kinds of faults and failures. In fact, various faults, failures, and human behaviors can affect building performance. This paper tackles the detection of unreliable sensors in buildings. Several literature surveys on diagnosis techniques for sensor grids in buildings have been published, but all of them treat only bias and outliers; occurrences of data gaps have not been given adequate attention in academia. The proposed methodology performs automatic thresholding for data gap detection for a set of heterogeneous sensors in instrumented buildings. Sensor measurements are assumed to be regular time series; in reality, however, sensor values are not uniformly sampled. The issue to solve is therefore: beyond which delay should each sensor be considered faulty? Time series are required to detect abnormalities in these delays. The efficiency of the method is evaluated on measurements obtained from a real platform: an office at the Grenoble Institute of Technology equipped with 30 sensors.
Keywords: Building system, time series, diagnosis, outliers, delay, data gap.
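A minimal sketch of per-sensor gap detection on irregularly sampled series, assuming timestamped readings in a pandas DataFrame; the threshold rule (a multiple of each sensor's median sampling interval) is an illustrative assumption, not the paper's automatic thresholding procedure.

```python
import pandas as pd

def detect_gaps(df, factor=3.0):
    """Flag data gaps per sensor: an inter-sample delay larger than
    `factor` times that sensor's median delay is reported as a gap.
    `df` has columns ['sensor', 'timestamp', 'value']."""
    gaps = []
    for sensor, grp in df.sort_values("timestamp").groupby("sensor"):
        delays = grp["timestamp"].diff().dropna()
        if delays.empty:
            continue
        threshold = factor * delays.median()
        for start, delay in zip(grp["timestamp"][:-1], delays):
            if delay > threshold:
                gaps.append({"sensor": sensor, "gap_start": start, "duration": delay})
    return pd.DataFrame(gaps)

# Tiny synthetic example: sensor 'T1' misses readings between 10:30 and 12:00
df = pd.DataFrame({
    "sensor": ["T1"] * 5,
    "timestamp": pd.to_datetime(
        ["2023-01-01 10:00", "2023-01-01 10:15", "2023-01-01 10:30",
         "2023-01-01 12:00", "2023-01-01 12:15"]),
    "value": [20.1, 20.3, 20.2, 21.0, 21.1],
})
print(detect_gaps(df))
```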
5638 General Purpose Graphic Processing Units Based Real Time Video Tracking System
Authors: Mallikarjuna Rao Gundavarapu, Ch. Mallikarjuna Rao, K. Anuradha Bai
Abstract:
Real-time video tracking is a challenging task for computing professionals. The performance of video tracking techniques is greatly affected by the background detection and elimination process. Local regions of the image frame contain vital information about background and foreground. However, pixel-level processing of local regions consumes a considerable amount of computational time and memory with traditional approaches. In our approach, we have exploited the concurrent computational ability of General Purpose Graphic Processing Units (GPGPU) to address this problem. A Gaussian Mixture Model (GMM) with adaptive weighted kernels is used for detecting the background. The weights of the kernels are influenced by local regions and are updated by inter-frame variations of the corresponding regions. The proposed system has been tested on GPU devices such as the GeForce GTX 280 and Quadro K2000. The results are encouraging, with a maximum speed-up of 10X compared to the sequential approach.
Keywords: Connected components, Embrace threads, Local weighted kernel, Structuring element.
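For readers who want to experiment with GMM-based background elimination, the sketch below uses OpenCV's built-in MOG2 subtractor on the CPU; it illustrates the background/foreground split the abstract relies on, but not the authors' GPU implementation or their locally weighted kernel update. The video filename is a placeholder.

```python
import cv2

cap = cv2.VideoCapture("input.mp4")            # placeholder video file
# MOG2 is OpenCV's Gaussian-mixture background/foreground segmenter
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                detectShadows=False)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg_mask = subtractor.apply(frame)          # per-pixel foreground mask
    fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN,
                               cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3)))
    cv2.imshow("foreground", fg_mask)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```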
5637 An Efficient Multi Join Algorithm Utilizing a Lattice of Double Indices
Authors: Hanan A. M. Abd Alla, Lilac A. E. Al-Safadi
Abstract:
In this paper, a novel multi-join algorithm to join multiple relations is introduced. The algorithm is based on a hash-based join of two relations that produces a double index. This is done by scanning the two relations once; instead of moving records into buckets, a double index is built, which eliminates the collisions that can occur with a complete hash algorithm. The double index is divided into join buckets of similar categories from the two relations, and buckets with similar keys are then joined to produce joined buckets. At the end, this yields a complete join index of the two relations without actually joining the relations themselves. The time complexity required to build the join index of two categories is O(m log m), where m is the size of each category, giving a total time complexity of O(n log m) over all buckets. The join index is used to materialize the joined relation if required; otherwise, it is used along with the join indices of other relations to build a lattice for multi-join operations with minimal I/O requirements. The lattice of join indices can be fitted into main memory to reduce the time complexity of the multi-join algorithm.
Keywords: Multi join, Relation, Lattice, Join indices.
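A loose sketch of the join-index idea, assuming two relations held as lists of tuples keyed on a join attribute: each relation is scanned once to build an index from key to row positions, and the two indices are combined into a join index of row-id pairs without materializing the joined relation. The data and helper names are illustrative, not the paper's exact structures.

```python
from collections import defaultdict

def build_index(relation, key_pos):
    """Scan a relation once and map each join-key value to its row ids."""
    index = defaultdict(list)
    for row_id, row in enumerate(relation):
        index[row[key_pos]].append(row_id)
    return index

def join_index(r, s, r_key, s_key):
    """Combine the two per-relation indices into a list of (r_row, s_row)
    pairs -- a join index -- without building the joined tuples."""
    r_idx, s_idx = build_index(r, r_key), build_index(s, s_key)
    return [(ri, si)
            for key in r_idx.keys() & s_idx.keys()
            for ri in r_idx[key] for si in s_idx[key]]

# Illustrative relations: R(emp_id, dept_id), S(dept_id, dept_name)
R = [(1, 10), (2, 20), (3, 10)]
S = [(10, "HR"), (20, "IT")]
jidx = join_index(R, S, r_key=1, s_key=0)
print(jidx)                                   # e.g. [(0, 0), (2, 0), (1, 1)] (order may vary)
# Materialize only if needed:
print([R[ri] + S[si] for ri, si in jidx])
```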
5636 Optimization of the Co-Precipitation of Industrial Waste Metals in a Continuous Reactor System
Authors: Thomas S. Abia II, Citlali Garcia-Saucedo
Abstract:
A continuous copper precipitation treatment (CCPT) system was conceived at Intel Chandler Site to serve as a first-of-kind (FOK) facility-scale waste copper (Cu), nickel (Ni), and manganese (Mn) co-precipitation facility. The process was designed to treat highly variable wastewater discharged from a substrate packaging research factory. The paper discusses metals co-precipitation induced by internal changes for manufacturing facilities that lack the capacity for hardware expansion due to real estate restrictions, aggressive schedules, or budgetary constraints. Herein, operating parameters such as pH and oxidation reduction potential (ORP) were examined to analyze the ability of the CCPT System to immobilize various waste metals. Additionally, influential factors such as influent concentrations and retention times were investigated to quantify the environmental variability against system performance. A total of 2,027 samples were analyzed and statistically evaluated to measure the performance of CCPT that was internally retrofitted for Mn abatement to meet environmental regulations. In order to enhance the consistency of the influent, a separate holding tank was cannibalized from another system to collect and slow-feed the segregated Mn wastewater from the factory into CCPT. As a result, the baseline influent Mn decreased from 17.2±18.7 mg·L⁻¹ at pre-pilot to 5.15±8.11 mg·L⁻¹ post-pilot (70.1% reduction). Likewise, the pre-trial and post-trial average influent Cu values to CCPT were 52.0±54.6 mg·L⁻¹ and 33.9±12.7 mg·L⁻¹, respectively (34.8% reduction). However, the raw Ni content of 0.97±0.39 mg·L⁻¹ at pre-pilot increased to 1.06±0.17 mg·L⁻¹ at post-pilot. The average Mn output declined from 10.9±11.7 mg·L⁻¹ at pre-pilot to 0.44±1.33 mg·L⁻¹ at post-pilot (96.0% reduction) as a result of the pH and ORP operating setpoint changes. In similar fashion, the output Cu quality improved from 1.60±5.38 mg·L⁻¹ to 0.55±1.02 mg·L⁻¹ (65.6% reduction), while the Ni output sustained a 50% enhancement during the pilot study (0.22±0.19 mg·L⁻¹ reduced to 0.11±0.06 mg·L⁻¹). pH and ORP were shown to be significantly instrumental to the precipitative versatility of the CCPT System.
Keywords: Copper, co-precipitation, industrial wastewater treatment, manganese, optimization, pilot study.
5635 SDVAR Algorithm for Detecting Fraud in Telecommunications
Authors: Fatimah Almah Saaid, Darfiana Nur, Robert King
Abstract:
This paper presents a procedure for estimating a VAR model using the Sequential Discounting VAR (SDVAR) algorithm for online model learning, in order to detect fraudulent acts using telecommunications call detail records (CDR). The volatility of the VAR is observed, allowing for non-linearity, outliers and change points, based on the work of [1]. This paper extends that procedure from univariate to multivariate time series. A simulation and a case study on detecting telecommunications fraud using CDR illustrate the use of the algorithm in the bivariate setting.
Keywords: Telecommunications Fraud, SDVAR Algorithm, Multivariate time series, Vector Autoregressive, Change points.
5634 Analyzing the Market Growth in API Economy Using Time-Evolving Model
Authors: Hiroki Yoshikai, Shin’ichi Arakawa, Tetsuya Takine, Masayuki Murata
Abstract:
The API (Application Programming Interface) economy is expected to create new value by converting corporate services such as information processing and data provision into APIs and using these APIs to connect services. Understanding the dynamics of an API-economy market under the strategies of its participants is crucial to fully maximizing its value. To capture the behavior of a market in which the number of participants changes over time, we present a time-evolving market model for a platform in which API providers, who provide APIs to service providers, participate in addition to service providers and consumers. We then use the market model to clarify the role API providers play in expanding market participation and forming ecosystems. The results show that the platform with API providers increased the number of market participants by 67% and decreased the cost of developing services by 25% compared to the platform without API providers. Furthermore, during the expansion phase of the market, the profits of participants are found to be roughly equal when 70% of the revenue from consumers is distributed to service providers and API providers. It is also found that, when the market is mature, the profits of the service providers and API providers decrease significantly due to competition among them, while the profit of the platform increases.
Keywords: API Economy, ecosystem, platform, API providers.
5633 Optimizing Organizational Performance: The Critical Role of Headcount Budgeting in Strategic Alignment and Financial Stability
Authors: Shobhit Mittal
Abstract:
Headcount budgeting stands as a pivotal element in organizational financial management, extending beyond traditional budgeting to encompass strategic resource allocation for workforce-related expenses. This process is integral to maintaining financial stability and fostering a productive workforce, requiring a comprehensive analysis of factors such as market trends, business growth projections, and evolving workforce skill requirements. It demands a collaborative approach, primarily involving Human Resources (HR) and finance departments, to align workforce planning with an organization's financial capabilities and strategic objectives. The dynamic nature of headcount budgeting necessitates continuous monitoring and adjustment in response to economic fluctuations, business strategy shifts, technological advancements, and market dynamics. Its significance in talent management is also highlighted, aligning financial planning with talent acquisition and retention strategies to ensure a competitive edge in the market. The consequences of incorrect headcount budgeting are explored, showing how it can lead to financial strain, operational inefficiencies, and hindered strategic objectives. Examining case studies like IBM's strategic workforce rebalancing and Microsoft's shift for long-term success, the importance of aligning headcount budgeting with organizational goals is underscored. These examples illustrate that effective headcount budgeting transcends its role as a financial tool, emerging as a strategic element crucial for an organization's success. This necessitates continuous refinement and adaptation to align with evolving business goals and market conditions, highlighting its role as a key driver in organizational success and sustainability.
Keywords: Strategic planning, fiscal budget, headcount planning, resource allocation, financial management, decision-making, operational efficiency, risk management, headcount budget.
5632 Save Lives: The Application of Geolocation-Awareness Service in Iranian Pre-Hospital EMS Information Management System
Authors: Somayeh Abedian, Pirhossein Kolivand, Hamid Reza Lornejad, Amin Karampour, Ebrahim Keshavarz Safari
Abstract:
For emergency and relief service providers such as pre-hospital emergency services, quick arrival at the scene of an accident or any EMS mission is one of the most important requirements of effective service delivery. EMS response time (the interval between the time of the call and the time of arrival on scene) is a critical factor in determining the quality of pre-hospital Emergency Medical Services (EMS). This is especially important for heart attack, stroke, or accident patients, for whom seconds are vital in saving their lives. Location-based e-services can be broadly defined as any service that provides information pertinent to the current location of an active mobile handset, or the precise address of a landline phone call, at a specific time window, regardless of the underlying delivery technology used to convey the information. According to research, one of the effective methods of meeting this goal is determining the location of the caller through the cooperation of the landline and mobile phone operators in the country. Follow-up with the Communications Regulatory Authority (CRA) resulted in the receipt of two separate secured electronic web services. Thus, to ensure human privacy, a secure technical architecture was required for launching the services in the pre-hospital EMS information management system. In addition, to quicken medics' arrival at the patient's bedside, rescue vehicles should make use of an intelligent transportation system to estimate road traffic using a GPS-based mobile navigation system independent of the Internet. This paper seeks to illustrate the architecture of the practical national model used by the Iranian EMS organization.
Keywords: Response time, geographic location inquiry service, location-based services, emergency medical services information system.
5631 Improved Asymptotic Stability Criteria for Uncertain Neutral Systems with Time-varying Discrete Delays
Authors: Changchun Shen, Shouming Zhong
Abstract:
This paper investigates the robust stability of uncertain neutral systems with time-varying delay. By using the Lyapunov method and linear matrix inequality techniques, new delay-dependent stability criteria are obtained and formulated in terms of linear matrix inequalities (LMIs), which make it easy to check the robust stability of the considered systems. Numerical examples are given to indicate significant improvements over some existing results.
Keywords: Neutral system, linear matrix inequalities, Lyapunov, stability.
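As a pointer to how LMI-based criteria are checked numerically, here is a minimal sketch that tests the basic delay-free Lyapunov inequality AᵀP + PA < 0 with cvxpy; it is not the paper's delay-dependent criterion for neutral systems, only the simplest instance of the LMI feasibility machinery such criteria rely on. The system matrix is an arbitrary stable example.

```python
import numpy as np
import cvxpy as cp

# Arbitrary stable example system (assumption, not from the paper)
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
n = A.shape[0]

# Feasibility problem: find P > 0 with A^T P + P A < 0
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()

if prob.status == cp.OPTIMAL:
    print("LMI feasible -> asymptotically stable; P =\n", P.value)
else:
    print("LMI infeasible for this simple criterion:", prob.status)
```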
5630 Visual Inspection of Work Piece with a Complex Shape by Means of Robot Manipulator
Authors: A. Y. Bani Hashim, N. S. A. Ramdan
Abstract:
Inconsistency in manual inspection is a real issue because humans tire after some time. Recent trends show that automatic inspection is more appealing for mass-production inspection. In such a case, a robot manipulator seems the best candidate to run a dynamic visual inspection. The purpose of this work is to estimate the optimum workspace in which a robot manipulator, with a camera attached to its end effector, would perform a visual inspection of a work piece. The pseudo-code for the planned path is derived from the number of tool transit points, the delay time at the transit points, the process cycle time, and the configuration space, namely the distance between the tool and the work piece. It is observed that a rapid start and a swift end are acceptable in a robot program, because little applicable work is usually performed during these moments. During the mid-range of the cycle, however, practical tasks are programmed to be executed; for that reason, the robot should be programmed so that rapid changes of actuator displacement are avoided. A dynamic visual inspection system using a robot manipulator seems practical for a work piece with a complex shape.
Keywords: Robot manipulator, Visual inspection, Work piece, Trajectory planning.
5629 Performances Comparison of Neural Architectures for On-Line Speed Estimation in Sensorless IM Drives
Authors: K.Sedhuraman, S.Himavathi, A.Muthuramalingam
Abstract:
The performance of a sensorless controlled induction motor drive depends on the accuracy of the estimated speed. Conventional estimation techniques, being mathematically complex, require more execution time, resulting in poor dynamic response. The nonlinear mapping capability and powerful learning algorithms of neural networks provide a promising alternative for on-line speed estimation. The on-line speed estimator requires the NN model to be accurate, simple in design, structurally compact, and computationally inexpensive to ensure fast execution and effective control in real-time implementation. This in turn depends to a large extent on the type of neural architecture. This paper investigates three types of neural architectures for on-line speed estimation, and their performance is compared in terms of accuracy, structural compactness, computational complexity, and execution time. The neural architecture suitable for on-line speed estimation is identified, and the promising results obtained are presented.
Keywords: Sensorless IM drives, rotor speed estimators, artificial neural network, feed-forward architecture, single neuron cascaded architecture.
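A minimal sketch of a feed-forward neural speed estimator of the kind compared above, assuming the rotor speed is regressed from sampled stator voltage and current features; the synthetic data, feature choice, and network size are illustrative assumptions, not the drive signals or architectures evaluated in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: features = [v_d, v_q, i_d, i_q], target = rotor speed
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(5000, 4))
speed = 100.0 * X[:, 0] - 40.0 * X[:, 2] + 15.0 * X[:, 1] * X[:, 3]  # made-up nonlinear map
y = speed + rng.normal(0.0, 1.0, size=speed.shape)                   # measurement noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Small feed-forward network: compact enough for fast on-line evaluation
est = MLPRegressor(hidden_layer_sizes=(12,), activation="tanh",
                   max_iter=2000, random_state=0)
est.fit(X_tr, y_tr)
print("R^2 on held-out samples:", est.score(X_te, y_te))
print("estimated speed for one sample:", est.predict(X_te[:1])[0])
```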
5628 Effect of Exercise on Sexual Behavior and Semen Quality of Sahiwal Bulls
Authors: Abdelrasoul, Khalid Ahmed Elrabie
Abstract:
The study was conducted on Sahiwal cattle bulls maintained at the Artificial Breeding Complex, NDRI, Karnal, Haryana, India, to determine the effect of exercise on sexual behavior and semen quality. Fourteen Sahiwal bulls were divided into two groups of seven each. Bulls in Group 1 were exercised by walking in a bull exerciser once a week, one hour before semen collection, whereas bulls in Group 2 were exercised daily. The sexual behavior and semen quality traits studied were: Reaction time (RT), Dismounting time (DMT), Total time taken in mounts (TTTM), Flehmen response (FR), Erection score (ES), Protrusion score (PS), Intensity of thrust (ITS), Temperament score (TS), Libido score (LS), semen volume, physical appearance, mass activity, initial progressive motility, Non-eosinophilic spermatozoa count (NESC), and post-thaw motility percent. Data were analyzed by the least squares technique. Group 2 showed significantly (p < 0.01) higher values for RT (sec), DMT (sec), TTTM (sec), ES, PS, ITS, LS, semen volume, semen color density, and mass activity.
Keywords: Exercise, Sahiwal bulls, semen quality, sexual behavior.
5627 Object-Oriented Simulation of Simulating Anticipatory Systems
Authors: Eugene Kindler
Abstract:
The present paper addresses problems in the simulation of anticipatory systems, namely those that use simulation models to aid anticipation. A certain analogy between the use of simulation and imagining is applied to make the explanation more comprehensible. The paper is completed by notes on open problems and on some existing applications. The problems stem from the fact that simulating the mentioned anticipatory systems is simulation of simulating systems, i.e., computer models handling two or more modeled time axes that should be mapped to the real time flow in a non-descending manner. Languages oriented to objects, processes, and blocks can be used to surmount these problems.
Keywords: Anticipatory systems, Nested computer models, Discrete event simulation, Simula.
5626 Fast Database Indexing for Large Protein Sequence Collections Using Parallel N-Gram Transformation Algorithm
Authors: Jehad A. H. Hammad, Nur'Aini binti Abdul Rashid
Abstract:
With the rapid development of the life sciences and the flood of genomic information, the need for faster and more scalable searching methods has become urgent. Indexing is one of the approaches that has been investigated. Indexing methods have been categorized into three categories: length-based index algorithms, transformation-based algorithms, and mixed-technique-based algorithms. In this research, we focus on the transformation-based methods. We embed the N-gram method into a transformation-based method to build an inverted index table, and then apply parallel methods to speed up the index-building time and to reduce the overall retrieval time when querying the genomic database. Our experiments show that the use of the N-gram transformation algorithm is an economical solution; it saves both time and space. The results show that the size of the index is smaller than the size of the dataset when the N-gram size is 5 or 6. The results of the parallel N-gram transformation algorithm indicate that the use of parallel programming with large datasets is promising and can be improved further.
Keywords: Biological sequence, Database index, N-gram indexing, Parallel computing, Sequence retrieval.
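A minimal sketch of building an n-gram inverted index over protein sequences, assuming the sequences are held in a dictionary keyed by accession; the parallelization across sequences uses Python's multiprocessing as a stand-in for the parallel scheme described in the paper, and all names and parameters are illustrative.

```python
from collections import defaultdict
from multiprocessing import Pool

N = 5  # n-gram size; the abstract reports good index/dataset ratios at 5 and 6

def index_one(item):
    """Map one (accession, sequence) pair to its n-gram postings."""
    acc, seq = item
    postings = defaultdict(list)
    for pos in range(len(seq) - N + 1):
        postings[seq[pos:pos + N]].append((acc, pos))
    return postings

def build_index(sequences, workers=4):
    """Build the inverted index in parallel and merge partial postings."""
    index = defaultdict(list)
    with Pool(workers) as pool:
        for partial in pool.map(index_one, sequences.items()):
            for gram, hits in partial.items():
                index[gram].extend(hits)
    return index

if __name__ == "__main__":
    seqs = {"P1": "MKTAYIAKQRQISFVKSHFSRQ",   # toy sequences, not a real dataset
            "P2": "MKTAYIAKQLLPWWHFSRQ"}
    idx = build_index(seqs, workers=2)
    print(idx["MKTAY"])                        # query a 5-gram -> [('P1', 0), ('P2', 0)]
```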
5625 Measuring Pressure Wave Velocity in a Hydraulic System
Authors: Lari Kela, Pekka Vähäoja
Abstract:
Pressure wave velocity in a hydraulic system was determined using piezo pressure sensors without removing fluid from the system. The measurements were carried out in a low pressure range (0.2 – 6 bar) and the results were compared with the results of other studies. This method is not as accurate as measurement with separate measurement equipment, but the fluid remains in the actual machine the whole time, and the effect of air is taken into consideration if air is present in the system. The amount of air is estimated by calculations and comparisons with other studies. This measurement equipment can also be installed in an existing machine and can be programmed to measure in real time. Thus, it could be used, e.g., to control dampers.
Keywords: Bulk modulus, pressure wave, sound velocity.
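A small numerical illustration of the relation the keywords point to, c = sqrt(B/ρ), with an effective bulk modulus lowered by entrained air; the fluid properties and the simple isothermal air-correction formula are textbook-style assumptions, not the authors' measurement model.

```python
import math

def wave_velocity(bulk_modulus, density):
    """Pressure wave (sound) velocity c = sqrt(B / rho)."""
    return math.sqrt(bulk_modulus / density)

def effective_bulk_modulus(b_oil, air_fraction, pressure):
    """Simple isothermal mixture model: entrained air (volume fraction x at
    absolute pressure p, with B_air ~ p) dominates the mixture compressibility."""
    return 1.0 / ((1.0 - air_fraction) / b_oil + air_fraction / pressure)

rho = 870.0          # kg/m^3, typical mineral oil (assumed)
b_oil = 1.5e9        # Pa, bulk modulus of air-free oil (assumed)
p = 3e5              # Pa absolute, roughly mid-range of the 0.2-6 bar measurements

print("air-free oil :", round(wave_velocity(b_oil, rho)), "m/s")
for x in (0.001, 0.01):
    b_eff = effective_bulk_modulus(b_oil, x, p)
    print(f"{x:.1%} air   :", round(wave_velocity(b_eff, rho)), "m/s")
```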
5624 Observer Design for Chaos Synchronization of Time-delayed Power Systems
Authors: Jui-Sheng Lin, Yi-Sung Yang, Meei-Ling Hung, Teh-Lu Liao, Jun-Juh Yan
Abstract:
The global chaos synchronization of a class of time-delayed power systems is investigated via an observer-based approach. By employing the concepts of quadratic stability theory and a generalized system model, a new sufficient criterion for constructing an observer is deduced. In contrast to previous works, this paper proposes a theoretical and systematic design procedure to realize chaos synchronization for master-slave power systems. Finally, an illustrative example is given to show the applicability of the obtained scheme.
Keywords: Chaos, Synchronization, Quadratic stability theory, Observer.
5623 Daily Site Risks Associated with Construction Projects and On-spot Corrective Measurements: Case Study of Revamping Projects in Kuwait Oil Company Fields Area
Authors: Yousef S. Al-Othman
Abstract:
The growth and expansion of industrial facilities are proportional to the market's increasing demand for products and services. Furthermore, raw material producers such as oil companies usually undergo massive revamping projects to maintain a synchronized supply. These revamping projects are usually delivered through challenging construction projects associated with daily site risks related to the construction process. Hence, a case study of these risks and the corresponding on-spot corrective measurements has been carried out on a number of construction project contractors at Kuwait Oil Company (KOC) to derive the benefits and overall effectiveness of on-spot corrective measurements during the construction phase of a project, and to show how they help in avoiding major incidents and ensuring a smooth, cost-effective, and on-time delivery of the project. The findings of this case study add value to the overall risk management process by minimizing the daily site risks that may affect the project lead time, resulting in an undisturbed on-site construction process.
Keywords: Oil and gas, risk management, construction projects, project lead time.
5622 Performance Comparison of Prim’s and Ant Colony Optimization Algorithm to Select Shortest Path in Case of Link Failure
Authors: Rimmy Yadav, Avtar Singh
Abstract:
Ant Colony Optimization (ACO) is a promising modern approach to combinatorial optimization. Here, ACO is applied to finding the shortest path during a communication link failure. In this paper, the performances of Prim’s algorithm and the ACO algorithm are compared. Using time complexity and program execution time as the set of parameters, we demonstrate the good performance of ACO in finding an excellent solution to the shortest-path problem during communication link failure.
Keywords: Ant colony optimization, link failure, prim’s algorithm.
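A small sketch of the rerouting scenario described above, assuming the network is an undirected weighted graph: Dijkstra's algorithm (used here as a simple deterministic baseline, not as either of the paper's two methods) recomputes the shortest path after a failed link is removed. The topology is an illustrative example.

```python
import heapq

def dijkstra(graph, src, dst):
    """Shortest path by cost in an undirected weighted graph
    given as {node: {neighbor: weight}}."""
    dist, prev, heap = {src: 0}, {}, [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    return [src] + path[::-1], dist[dst]

def remove_link(graph, a, b):
    graph[a].pop(b, None)
    graph[b].pop(a, None)

# Illustrative topology (not from the paper)
net = {"A": {"B": 1, "C": 4}, "B": {"A": 1, "C": 2, "D": 5},
       "C": {"A": 4, "B": 2, "D": 1}, "D": {"B": 5, "C": 1}}

print("before failure:", dijkstra(net, "A", "D"))   # (['A', 'B', 'C', 'D'], 4)
remove_link(net, "B", "C")                          # simulate a link failure
print("after  failure:", dijkstra(net, "A", "D"))   # reroute around the failed link
```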
5621 Effect of Tempering Temperature and Time on the Corrosion Behaviour of 304 and 316 Austenitic Stainless Steels in Oxalic Acid
Authors: Ayo S. Afolabi, Johannes H. Potgieter, Ambali S. Abdulkareem, Nonhlanhla Fungura
Abstract:
The effect of different tempering temperatures and heat treatment times on the corrosion resistance of austenitic stainless steels (ASS) in oxalic acid was studied in this work using conventional weight loss and electrochemical measurements. Typical 304 and 316 stainless steel samples were tempered at 150°C, 250°C and 350°C after being austenitized at 1050°C for 10 minutes. These samples were then immersed in 1.0 M oxalic acid and their weight losses were measured every five days for 30 days. The results show that corrosion of both types of ASS samples increased with increasing tempering temperature and time, owing to the precipitation of chromium carbides at the grain boundaries of these metals. Electrochemical results also confirm that 304 ASS is more susceptible to corrosion than 316 ASS in this medium, which is attributed to the molybdenum in the composition of the latter. The metallographic images of these samples showed a non-uniform distribution of precipitated chromium carbides at the grain boundaries, and unevenly distributed carbides and retained austenite phases, which cause galvanic effects in the medium.
Keywords: ASS, corrosion, oxalic acid, tempering, temperature, time.
5620 Managing Iterations in Product Design and Development
Authors: K. Aravindhan, Trishit Bandyopadhyay, Mahesh Mehendale, Supriya Kumar De
Abstract:
The inherently iterative nature of product design and development (PD) poses a significant challenge to reducing PD time. In order to shorten the time to market, organizations have adopted concurrent development, in which multiple specialized tasks and design activities are carried out in parallel. The iterative nature of the work, coupled with the overlap of activities, can result in unpredictable time to completion and significant rework. Many products have missed their time-to-market window due to unanticipated, or rather unplanned, iteration and rework. The iterative and often overlapped processes introduce greater ambiguity into design and development, where traditional project management methods and tools provide less value. In this context, identifying critical metrics to understand iteration probability is an open research area in which a significant contribution can be made, given that iteration has been the key driver of cost and schedule risk in PD projects. Two important questions that the proposed study attempts to address are: Can we predict and identify the number of iterations in a product development flow? Can we provide managerial insights for better control over iteration? The proposal introduces the concept of decision points and, using this concept, intends to develop metrics that can provide managerial insights into iteration predictability. By characterizing the product development flow as a network of decision points, the proposed research intends to delve further into iteration probability and to provide more clarity.
Keywords: Decision Points, Iteration, Product Design, Rework.
5619 Detecting the Edge of Multiple Images in Parallel
Authors: Prakash K. Aithal, U. Dinesh Acharya, Rajesh Gopakumar
Abstract:
An edge is a variation of brightness in an image. Edge detection is useful in many application areas, such as finding forests and rivers in a satellite image or detecting a broken bone in a medical image. This paper discusses finding the edges of multiple aerial images in parallel. The proposed work was tested on 38 images: 37 color images and one monochrome image. The time taken to process N images in parallel is equivalent to the time taken to process one image sequentially. Message Passing Interface (MPI) and Open Computing Language (OpenCL) are used to achieve task-level and pixel-level parallelism, respectively.
Keywords: Edge detection, multicore, GPU, openCL, MPI.
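A minimal sketch of task-level parallelism for edge detection, assuming a list of image files on disk; it uses Python's multiprocessing and OpenCV's Canny detector as stand-ins for the MPI/OpenCL implementation in the paper, so the parallel framework, file names, and thresholds are illustrative assumptions.

```python
import glob
from multiprocessing import Pool

import cv2

def detect_edges(path):
    """Read one image in grayscale, run Canny edge detection, save the result."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(img, threshold1=100, threshold2=200)
    out_path = path + ".edges.png"
    cv2.imwrite(out_path, edges)
    return out_path

if __name__ == "__main__":
    paths = glob.glob("images/*.png")          # placeholder image folder
    with Pool(processes=4) as pool:            # one task (image) per worker
        results = pool.map(detect_edges, paths)
    print(f"processed {len(results)} images in parallel")
```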
5618 Combined Simulated Annealing and Genetic Algorithm to Solve Optimization Problems
Authors: Younis R. Elhaddad
Abstract:
Combinatorial optimization problems arise in many scientific and practical applications. Therefore, many researchers try to find or improve methods that solve these problems with high-quality results and in less time. Genetic Algorithms (GA) and Simulated Annealing (SA) have both been used to solve optimization problems. Both GA and SA search a solution space through a sequence of iterative states. However, there are also significant differences between them: the GA mechanism works in parallel on a set of solutions and exchanges information using the crossover operation, whereas SA works on a single solution at a time. In this work, SA and GA are combined using a new technique in order to overcome the disadvantages of both algorithms.
Keywords: Genetic Algorithm, Optimization problems, Simulated Annealing, Traveling Salesman Problem
5617 Calculation of Reorder Point Level under Stochastic Parameters: A Case Study in Healthcare Area
Authors: Serap Akcan, Ali Kokangul
Abstract:
We consider a single-echelon, single-item inventory system in which both demand and lead time are stochastic. A continuous review policy is used to control the inventory system. The objective is to calculate the reorder point level under stochastic parameters. A case study from a Neonatal Intensive Care Unit is presented.
Keywords: Inventory control system, reorder point level, stochastic demand, stochastic lead time.
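For orientation, the sketch below computes a reorder point with the standard textbook formula for stochastic demand and stochastic lead time under a continuous review policy, ROP = μ_d·μ_L + z·sqrt(μ_L·σ_d² + μ_d²·σ_L²); the numbers are made-up illustrations and the formula is the common approximation, not necessarily the exact calculation used in the paper's case study.

```python
from math import sqrt
from scipy.stats import norm

def reorder_point(mu_d, sigma_d, mu_l, sigma_l, service_level):
    """Reorder point under stochastic demand per period (mu_d, sigma_d) and
    stochastic lead time in periods (mu_l, sigma_l), continuous review."""
    z = norm.ppf(service_level)                         # safety factor
    demand_during_lead = mu_d * mu_l
    sigma_dl = sqrt(mu_l * sigma_d**2 + mu_d**2 * sigma_l**2)
    return demand_during_lead + z * sigma_dl

# Hypothetical numbers: 12 units/day demand, 3-day average lead time
rop = reorder_point(mu_d=12, sigma_d=4, mu_l=3, sigma_l=1, service_level=0.95)
print(f"reorder point ≈ {rop:.1f} units")
```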
5616 Estimation of Buffer Size of Internet Gateway Server via G/M/1 Queuing Model
Authors: Dr. L.K. Singh, Dr. R. M. L, Riktesh Srivastava
Abstract:
How to efficiently assign system resources so that Gateway servers can route client demand is a tricky problem. In this paper, we present an enhanced proposal for the autonomous performance of Gateway servers under highly dynamic traffic loads. We devise a methodology to calculate queue length and waiting time using Gateway server information, in order to reduce response time variance in the presence of bursty traffic. The most widespread consideration is performance, because Gateway servers must offer cost-effective and highly available services in the long run, and thus must be scaled to meet the expected load. Performance measurements can be the basis for performance modeling and prediction; with the help of performance models, performance metrics (such as buffer estimation and waiting time) can be determined during the development process. This paper describes the queue models that can be applied to estimate the queue length and, from it, the final value of the memory size. Both simulation and experimental studies using synthesized workloads, together with analysis of real-world Gateway servers, demonstrate the effectiveness of the proposed system.
Keywords: Gateway Server, G/M/1 Queuing Model.
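A small sketch of the standard G/M/1 result that such buffer estimates rest on: the number of requests an arriving customer finds is geometric with parameter σ, the root of σ = Ã(μ(1−σ)), where Ã is the Laplace-Stieltjes transform of the interarrival-time distribution. The sketch assumes gamma-distributed interarrival times and sizes the buffer so that the arrival-epoch overflow probability stays below a target; the traffic numbers are illustrative, and this is the textbook model rather than the paper's specific procedure.

```python
from math import log, ceil
from scipy.optimize import brentq

def gm1_sigma(lst, mu):
    """Root sigma in (0,1) of sigma = A~(mu * (1 - sigma)) for a G/M/1 queue,
    where lst(s) is the LST of the interarrival-time distribution."""
    return brentq(lambda s: s - lst(mu * (1.0 - s)), 1e-9, 1.0 - 1e-9)

# Gamma(k, rate) interarrival times: LST A~(s) = (rate / (rate + s))**k
k, rate = 2.0, 1.6          # mean interarrival = k/rate = 1.25 s  (assumed)
mu = 1.0                    # service rate: 1 request/s            (assumed)
lst = lambda s: (rate / (rate + s)) ** k

sigma = gm1_sigma(lst, mu)
wq = sigma / (mu * (1.0 - sigma))                 # mean wait in queue
# Buffer B such that P(arriving request finds > B in system) = sigma**(B+1) <= eps
eps = 1e-3
buffer_size = ceil(log(eps) / log(sigma)) - 1

print(f"sigma = {sigma:.3f}, mean wait = {wq:.2f} s, buffer >= {buffer_size} requests")
```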
5615 Multiple Job Shop-Scheduling using Hybrid Heuristic Algorithm
Authors: R.A.Mahdavinejad
Abstract:
In this paper, multi-processor job shop scheduling problems are solved by a heuristic algorithm based on a hybrid of priority dispatching rules combined with an ant colony optimization algorithm. The objective function is to minimize the makespan, i.e., the total completion time, and the simultaneous presence of various kinds of pheromones is allowed. By using a suitable hybrid of priority dispatching rules, the process of finding the best solution is improved. The ant colony optimization algorithm not only improves the capability of the proposed algorithm, but also decreases the total working time by decreasing setup times and modifying the production line, so that similar work shares the same production lines. Another advantage of this algorithm is that similar (not identical) machines can be considered, so these machines are able to process a job with different processing and setup times. Based on this capability, and to evaluate the algorithm, a number of test problems are solved and the associated results are analyzed. The results show a significant decrease in throughput time. They also show that this algorithm is able to recognize the bottleneck machine and to schedule jobs efficiently.
Keywords: Job shops scheduling, Priority dispatching rules, Makespan, Hybrid heuristic algorithm.