Search results for: tool tuning
4143 Development of Map of Gridded Basin Flash Flood Potential Index: GBFFPI Map of QuangNam, QuangNgai, DaNang, Hue Provinces
Authors: Le Xuan Cau
Abstract:
Flash floods occur over short rainfall intervals, from 1 hour to 12 hours, in small and medium basins. They typically have two characteristics: large water flow and high flow velocity. A flash flood occurs at a hill-valley site (a strip of low-lying terrain) in a catchment with a large enough contributing area, a steep basin slope, and heavy rainfall. The risk of flash floods is determined through the Gridded Basin Flash Flood Potential Index (GBFFPI). The Flash Flood Potential Index (FFPI) is determined from a terrain slope flash flood index, a soil erosion flash flood index, a land cover flash flood index, a land use flash flood index, and a rainfall flash flood index. In determining GBFFPI, each cell in a map is considered the outlet of a water accumulation basin, and the GBFFPI of the cell is the basin-average value of FFPI over the corresponding water accumulation basin. Based on GIS, a tool to compute GBFFPI is developed using the ArcObjects SDK for .NET. The GBFFPI maps are built in two types: GBFFPI including the rainfall flash flood index (for real-time flash flood warning) or GBFFPI excluding it. The GBFFPI tool can be used to identify high flash flood potential sites in a large region as quickly as possible. GBFFPI improves on the conventional FFPI: its advantage is that it takes into account the basin response (the interaction of cells) and locates flash flood sites (strips of low-lying terrain) more accurately, whereas the conventional FFPI considers each cell in isolation and does not consider the interaction between cells. The GBFFPI map of QuangNam, QuangNgai, DaNang, and Hue is built and exported to Google Earth. The obtained map demonstrates the scientific basis of GBFFPI.
Keywords: ArcObjects SDK for .NET, basin average value of FFPI, gridded basin flash flood potential index, GBFFPI map
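A minimal sketch of the basin-averaging step described above, assuming a D8-style drainage graph is already available; the grids, the equal component weighting, and the downstream mapping are illustrative stand-ins, not the authors' ArcObjects implementation. Cells are processed upstream-first so that each cell ends up with the mean FFPI of its whole accumulation area:

```python
import numpy as np
from collections import deque

# Illustrative component indices on a tiny grid (slope, soil, land cover,
# land use, rainfall), each normalized to 1..10 as in typical FFPI schemes.
slope = np.array([[8, 6], [7, 5]], dtype=float)
soil = np.array([[5, 5], [6, 4]], dtype=float)
cover = np.array([[7, 3], [6, 2]], dtype=float)
use = np.array([[6, 4], [5, 3]], dtype=float)
rain = np.array([[9, 8], [9, 7]], dtype=float)

# Cell-wise FFPI: here an unweighted mean of the component indices.
ffpi = (slope + soil + cover + use + rain) / 5.0

# Hypothetical D8 drainage: downstream[cell] -> receiving cell (None = outlet).
downstream = {(0, 0): (1, 0), (0, 1): (1, 0), (1, 0): (1, 1), (1, 1): None}

# Process cells upstream-first (Kahn's algorithm on the drainage graph),
# accumulating FFPI sums and cell counts into each downstream cell.
indeg = {c: 0 for c in downstream}
for c, d in downstream.items():
    if d is not None:
        indeg[d] += 1
queue = deque(c for c, k in indeg.items() if k == 0)
acc_sum = {c: ffpi[c] for c in downstream}
acc_cnt = {c: 1 for c in downstream}
while queue:
    c = queue.popleft()
    d = downstream[c]
    if d is not None:
        acc_sum[d] += acc_sum[c]
        acc_cnt[d] += acc_cnt[c]
        indeg[d] -= 1
        if indeg[d] == 0:
            queue.append(d)

# GBFFPI of a cell = basin-average FFPI over its accumulation area.
gbffpi = {c: acc_sum[c] / acc_cnt[c] for c in downstream}
print(gbffpi)
```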
Procedia PDF Downloads 381
4142 3-D Strain Imaging of Nanostructures Synthesized via CVD
Authors: Sohini Manna, Jong Woo Kim, Oleg Shpyrko, Eric E. Fullerton
Abstract:
CVD techniques have emerged as a promising approach to the formation of a broad range of nanostructured materials. The realization of many practical applications will require efficient and economical synthesis techniques that preferably avoid the need for templates or costly single-crystal substrates and also afford process adaptability. Towards this end, we have developed a single-step route for the reduction-type synthesis of nanostructured Ni materials using a thermal CVD method. By tuning the CVD growth parameters, we can synthesize morphologically dissimilar nanostructures, including single-crystal cubes and Au nanostructures, which form atop untreated amorphous SiO2||Si substrates. An understanding of the new properties that emerge in these nanostructured materials and their relationship to function will lead to a broad range of magnetostrictive devices as well as other catalysis, fuel cell, sensor, and battery applications based on high-surface-area transition-metal nanostructures. We use the coherent X-ray diffractive imaging (CXDI) technique to obtain 3-D images and strain maps of individual nanocrystals. CXDI provides the overall shape of a nanostructure and its lattice distortion based on the combination of highly brilliant coherent X-ray sources and a phase retrieval algorithm. We observe a fine interplay between the reduction of surface energy and internal stress, which plays an important role in the morphology of nanocrystals. The strain distribution is influenced by the metal-substrate and metal-air interfaces, which behave differently owing to differences in thermal expansion. We find that the lattice strain at the surface of the octahedral gold nanocrystal agrees quantitatively with the predictions of the Young-Laplace equation but exhibits a discrepancy near the nanocrystal-substrate interface resulting from the interface. The strain on the bottom side of the Ni nanocube, which contacts the substrate surface, is compressive; this is caused by the dissimilar thermal expansion coefficients of the Ni nanocube and the Si substrate. Research at UCSD supported by NSF DMR Award # 1411335.
Keywords: CVD, nanostructures, strain, CXRD
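For readers unfamiliar with the Young-Laplace argument invoked above, a common form relates the capillary pressure of a particle of radius R and surface energy γ to a uniform lattice strain via the bulk modulus B; the abstract does not state the exact expression used, so the following is the standard textbook version:

\[
\Delta p = \frac{2\gamma}{R}, \qquad \varepsilon \approx -\frac{\Delta p}{3B} = -\frac{2\gamma}{3BR},
\]

i.e., smaller nanocrystals are expected to show larger compressive surface strain, which is the trend CXDI can test directly.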
Procedia PDF Downloads 393
4141 Passive Seismic in Hydrogeological Prospecting: The Case Study from Hard Rock and Alluvium Plain
Authors: Prarabdh Tiwari, M. Vidya Sagar, K. Bhima Raju, Joy Choudhury, Subash Chandra, E. Nagaiah, Shakeel Ahmed
Abstract:
Passive seismic, a wavefield interferometric imaging method that is a low-cost and rapid tool for subsurface investigation, is used for various geotechnical purposes such as hydrocarbon exploration, seismic microzonation, etc. With recent advancements, its application has also been extended to groundwater exploration by means of finding the bedrock depth. The Council of Scientific & Industrial Research (CSIR)-National Geophysical Research Institute (NGRI) has experimented with passive seismic studies along with electrical resistivity tomography for groundwater in hard rock (Choutuppal, Hyderabad). Passive seismic combined with Electrical Resistivity Tomography (ERT) can give a clearer 2-D subsurface image for groundwater exploration in hard rock areas. Passive seismic data were collected using a Tromino, a three-component broadband seismometer, to measure background ambient noise, and were processed using GRILLA software. The passive seismic results were found to corroborate the ERT results. For data acquisition, the Tromino was deployed at over 30 locations, recording for 20 minutes at each station. These locations show strong resonance frequency peaks, suggesting good impedance contrast between different subsurface layers (e.g., mica-rich laminated layer, weathered layer, granite, etc.). This paper presents the signature of passive seismic for hard rock terrain. It has been found that passive seismic has potential application to formation characterization and can be used as an alternative tool for delineating litho-stratification in urban conditions where electrical and electromagnetic tools cannot be applied due to high cultural noise. In addition, its application in combination with electrical and electromagnetic methods can improve the interpreted subsurface model.
Keywords: passive seismic, resonant frequency, Tromino, GRILLA
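The bedrock-depth estimate mentioned above typically rests on the quarter-wavelength resonance relation; the abstract does not spell out the conversion used, so the following standard form is an assumption:

\[
f_0 = \frac{V_s}{4h} \quad\Longrightarrow\quad h = \frac{V_s}{4 f_0},
\]

where f_0 is the measured resonance frequency, V_s the average shear-wave velocity of the soft cover, and h the depth to the impedance contrast (e.g., the weathered-layer/granite interface).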
Procedia PDF Downloads 189
4140 Relearning to Learn: Approaching Sustainability by Incorporating Inuit Vernacular and Biomimicry Architecture Principles
Authors: Hakim Herbane
Abstract:
Efforts to achieve sustainability in architecture have yet to prove their effectiveness, despite the various methods attempted. Biomimicry, which looks to successful natural models to promote sustainability and innovation, faces obstacles in implementing sustainability despite its restorative approach to the relationship between humans and nature. In Nunavik, Inuit communities are exploring a sustainable production system that aligns with their aspirations and meets their human, technological, technical, economic, and ecological demands. Biomimicry holds promise in line with Inuit philosophy, but its failure so far to deliver sustainability requires further investigation to remedy its deficiencies. Our literature review underscores the importance of involving the community in defining sustainability and determining the best methods for its implementation. Additionally, vernacular architecture offers valuable orientations for achieving sustainability. Moreover, reintegrating Inuit communities and their traditional architectural practices, which have successfully balanced the diverse needs and constraints of their built environment, could pave the way for a sustainable Inuit-built environment in Nunavik and simultaneously advance architectural biomimicry principles. This research aims to establish a sustainability monitoring tool for the Nordic architectural process by analyzing Inuit vernacular and biomimetic architecture, together with the input of stakeholders involved in Inuit architecture production in Nunavik, especially Inuit. The goal is to create a practical tool (an index) to aid in designing sustainable architecture, taking into account environmental, social, and economic perspectives. Furthermore, the study seeks to validate strong, sustainable design principles of vernacular and biomimetic architectures. The literature review uncovered challenges and identified new opportunities. The forthcoming discourse will focus on the careful and considerate incorporation of Inuit communities' perceptions and indigenous building practices into our methodology and on the latest findings of our research.
Keywords: sustainability, biomimicry, vernacular architecture, community involvement
Procedia PDF Downloads 54
4139 The Use of a Miniature Bioreactor as Research Tool for Biotechnology Process Development
Authors: Muhammad Zainuddin Arriafdi, Hamudah Hakimah Abdullah, Mohd Helmi Sani, Wan Azlina Ahmad, Muhd Nazrul Hisham Zainal Alam
Abstract:
Biotechnology process development demands numerous experimental works. In the laboratory environment, this is typically carried out using a shake flask platform. This paper presents the design and fabrication of a miniature bioreactor system as an alternative research tool for bioprocessing. The working volume of the reactor is 100 ml, and it is made of plastic. The main features of the reactor include stirring control, temperature control via an electrical heater, an aeration strategy through a miniature air compressor, and online optical cell density (OD) sensing. All sensors and actuators integrated into the reactor were controlled using an Arduino microcontroller platform. In order to demonstrate the functionality of this miniature bioreactor concept, a series of batch Saccharomyces cerevisiae fermentation experiments was performed under various glucose concentrations. Results from the fermentation experiments were used to solve for the Monod equation constants, namely the saturation constant, Ks, and the cells' maximum growth rate, μmax, further highlighting the usefulness of the device. The mixing capacity of the reactor was also evaluated. It was found that the results from the miniature bioreactor prototype were comparable to results achieved using a shake flask. A unique feature of the device compared to the shake flask platform is that the reactor's mixing condition is much closer to a lab-scale bioreactor setup. The prototype is also integrated with an online OD sensor, so no sampling is needed to monitor the progress of the reaction. Operating cost and medium consumption are also low, making it much more economical for biotechnology process development than lab-scale bioreactors.
Keywords: biotechnology, miniature bioreactor, research tools, Saccharomyces cerevisiae
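As a sketch of how the Monod constants can be recovered from such batch data, the snippet below fits μmax and Ks by non-linear least squares; the substrate concentrations and growth rates are hypothetical placeholders, not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def monod(S, mu_max, Ks):
    """Monod law: specific growth rate as a function of substrate S."""
    return mu_max * S / (Ks + S)

# Hypothetical specific growth rates (1/h) at the tested glucose levels (g/L).
S = np.array([2.0, 5.0, 10.0, 20.0, 40.0])
mu = np.array([0.18, 0.28, 0.35, 0.40, 0.43])

(mu_max, Ks), _ = curve_fit(monod, S, mu, p0=[0.5, 5.0])
print(f"mu_max = {mu_max:.3f} 1/h, Ks = {Ks:.2f} g/L")
```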
Procedia PDF Downloads 117
4138 The Significance of Translating Folklore in Teaching and Learning Open Distance e-Learning
Authors: M. A. Mabasa, O. Ramokolo, M. Z. Mnikathi, D. Mathabatha, T. Manyapelo
Abstract:
The study examines the importance of translating South African folklore from oral into written literature in a multilingual education. The study postulates that translation can be regarded as a valuable tool when oral and written literature is transmitted from one generation to another. It maintains that translation does not take place in a haphazard fashion; for that reason, skills such as translation principles are required to translate folklore significantly and effectively. The purpose of the study is to indicate the significance of using translation relating to folklore in teaching and learning. The study also observes that modernism in literature should be shared amongst a variety of cultures, because folklore is interactive in narrating stories, folktales, and myths that sharpen the reader's knowledge and intellect, being informative and educative in nature. As a technological tool, the study points out, translation is of paramount importance in the sense that the meanings of different data can be made available in all South African official languages using oral and written forms of folklore. The study considers how tradition and customary beliefs and practices can be accommodated in institutions of higher learning. It envisages the way in which the literature of folklore can be juxtaposed to ensure that translated folklore is of quality-assured standard. The study suggests that well-translated folklore can serve as oral and written literature, which may contribute to the child's learning and acquisition of knowledge and insights during cognitive development toward maturity. Methodologically, the study adopts a qualitative research approach and selects content analysis as an instrument for data gathering; the data will be analyzed qualitatively in consideration of the significance of translating folklore as written and spoken literature in a documented way. The study reveals that the translation of folktales promotes functional multilingualism in high-function formal contexts like a university. It emphasizes that translated and preserved literary folklore may serve as a language repository from one generation to another because of the archiving and storage of information in the form of a term bank.
Keywords: translation, editing, teaching, learning, folklore
Procedia PDF Downloads 35
4137 Indeterminacy: An Urban Design Tool to Measure Resilience to Climate Change, a Caribbean Case Study
Authors: Tapan Kumar Dhar
Abstract:
How well are our city forms designed to adapt to climate change and its resulting uncertainty? What urban design tools can be used to measure and improve resilience to climate change, and how would they do so? In addressing these questions, this paper considers indeterminacy, a concept that originated in the resilience literature, to measure the resilience of built environments. In the realm of urban design, 'indeterminacy' refers to the built-in design capability of an urban system to serve different purposes that are not necessarily predetermined. An urban system, particularly one with a higher degree of indeterminacy, can be reorganized and changed to accommodate new or unknown functions while coping with uncertainty over time. Underlying principles of this concept have long been discussed in the urban design and planning literature, including open architecture, landscape urbanism, and flexible housing. This paper argues that the concept of indeterminacy holds the potential to reduce the impacts of climate change incrementally and proactively. With regard to sustainable development, both the planning and climate change literatures highly recommend proactive adaptation, as it involves less cost, effort, and energy than last-minute emergency or reactive action. Nevertheless, the concept still remains isolated from resilience and climate change adaptation discourses, even though those discourses advocate the incremental transformation of a system to cope with climatic uncertainty. This paper considers indeterminacy, as an urban design tool, to measure and increase the resilience (and adaptive capacity) of Long Bay's coastal settlements in Negril, Jamaica. Negril is one of the most popular tourism destinations in the Caribbean and is highly vulnerable to sea-level rise and its associated impacts. This paper employs empirical information obtained from direct observation and informal interviews with local people. While testing the tool, the paper deploys an urban morphology study, which includes land use patterns and the physical characteristics of urban form, including street networks, block patterns, and building footprints. The results reveal that most resorts in Long Bay are designed for predetermined purposes and offer little potential to be used differently if needed. Additionally, Negril's street networks are found to be rigid, with limited accessibility to different points of interest. This rigidity can further expose the entire infrastructure to extreme climatic events and also impedes recovery actions after a disaster. However, Long Bay still has room for future resilient development in other, relatively less vulnerable areas. In adapting to climate change, indeterminacy can be reached through design that achieves a balance between the degree of vulnerability and the degree of indeterminacy: the more vulnerable a place is, the more indeterminacy is useful. This paper concludes with a set of urban design typologies to increase the resilience of coastal settlements.
Keywords: climate change adaptation, resilience, sea-level rise, urban form
Procedia PDF Downloads 367
4136 Quantum Information Scrambling and Quantum Chaos in Silicon-Based Fermi-Hubbard Quantum Dot Arrays
Authors: Nikolaos Petropoulos, Elena Blokhina, Andrii Sokolov, Andrii Semenov, Panagiotis Giounanlis, Xutong Wu, Dmytro Mishagli, Eugene Koskin, Robert Bogdan Staszewski, Dirk Leipold
Abstract:
We investigate entanglement and quantum information scrambling (QIS) by the example of a many-body Extended and spinless effective Fermi-Hubbard Model (EFHM and e-FHM, respectively) that describes a special type of quantum dot array provided by Equal1 Labs' silicon-based quantum computer. The concept of QIS is used in the framework of quantum information processing by quantum circuits and quantum channels. In general, QIS manifests as the delocalization of quantum information over the entire quantum system; more compactly, information about the input cannot be obtained by local measurements of the output of the quantum system. In our work, we first introduce the concept of quantum information scrambling and its connection with 4-point out-of-time-order (OTO) correlators. In order to have a quantitative measure of QIS, we use the tripartite mutual information, along similar lines to previous works, which measures the mutual information between four different spacetime partitions of the system, and we study the Transverse Field Ising (TFI) model; this is used to quantify the dynamical spreading of quantum entanglement and information in the system. Then, we investigate scrambling in the quantum many-body Extended Hubbard Model with external magnetic field Bz and spin-spin coupling J for both uniform and thermal quantum channel inputs and show that it scrambles for specific external tuning parameters (e.g., tunneling amplitudes, on-site potentials, magnetic field). In addition, we compare different Hilbert space sizes (different numbers of qubits) and show the qualitative and quantitative differences in quantum scrambling as we increase the number of quantum degrees of freedom in the system. Moreover, we find a "scrambling phase transition" at a threshold temperature in the thermal case, that is, the temperature at which the channel starts to scramble quantum information. Finally, we make comparisons to the TFI model and highlight the key physical differences between the two systems, and we mention some future directions of research.
Keywords: condensed matter physics, quantum computing, quantum information theory, quantum physics
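For reference, the scrambling diagnostics named above are commonly defined as follows (the abstract follows previous works, so these standard forms should apply): the out-of-time-order correlator and the tripartite mutual information,

\[
F(t) = \langle W^\dagger(t)\, V^\dagger\, W(t)\, V \rangle_\beta, \qquad
I_3(A:C:D) = I(A:C) + I(A:D) - I(A:CD),
\]

where \( I(A:B) = S_A + S_B - S_{AB} \) is the mutual information between partitions; scrambling is signalled by the decay of F(t) and by I_3 becoming negative.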
Procedia PDF Downloads 101
4135 Use of Numerical Tools Dedicated to Fire Safety Engineering for the Rolling Stock
Authors: Guillaume Craveur
Abstract:
This study shows the opportunity to use numerical tools dedicated to fire safety engineering for rolling stock. Indeed, some regulatory requirements can now be demonstrated using numerical tools. The first part of this study presents the use of an evacuation modelling tool to satisfy the evacuation time criteria for rolling stock. The buildingEXODUS software is used to model and simulate the evacuation of rolling stock. First, in order to demonstrate the reliability of this tool in calculating the complete evacuation time, a comparative study was carried out between a real test and simulations done with buildingEXODUS. Multiple simulations are performed to capture the stochastic variations in egress times. Then, a new study is done to calculate the complete evacuation time of a train with the same geometry but a different interior architecture. The second part of this study shows some applications of Computational Fluid Dynamics. This work presents a multi-scale validation approach for numerical simulations of standardized tests with the Fire Dynamics Simulator software developed by the National Institute of Standards and Technology (NIST). It first addresses the cone calorimeter test, described in the standard ISO 5660, in order to characterize the fire reaction of materials. The aim of this step is to readjust measurement results from the cone calorimeter test in order to create a data set usable at the seat scale. In the second step, the modelling concerns the fire seat test described in the standard EN 45545-2; the data set obtained through the validation of the cone calorimeter test was used in the fire seat test. To conclude, in the third step, after checking the data obtained for the seat from the cone calorimeter test, a larger-scale simulation with a real part of a train is carried out.
Keywords: fire safety engineering, numerical tools, rolling stock, multi-scales validation
Procedia PDF Downloads 303
4134 Inverse Saturable Absorption in Non-linear Amplifying Loop Mirror Mode-Locked Fiber Laser
Authors: Haobin Zheng, Xiang Zhang, Yong Shen, Hongxin Zou
Abstract:
The research focuses on mode-locked fiber lasers with a non-linear amplifying loop mirror (NALM). Although these lasers have shown potential, they still have limitations in terms of low repetition rate. The self-starting of mode-locking in the NALM is influenced by the cross-phase modulation (XPM) effect, which has not been thoroughly studied. The aim of this study is two-fold: first, to overcome the difficulties associated with increasing the repetition rate in mode-locked fiber lasers with a NALM; second, to analyze the influence of XPM on the self-starting of mode-locking. The power distributions of the two counter-propagating beams in the NALM and the differential non-linear phase shift (NPS) accumulations are calculated. The analysis is conducted from the perspective of NPS accumulation. The differential NPSs for continuous wave (CW) light and pulses in the fiber loop are compared to understand the inverse saturable absorption (ISA) mechanism during pulse formation in the NALM. The study reveals a difference in differential NPSs between CW light and pulses in the fiber loop of the NALM. This difference leads to an ISA mechanism, which has not been extensively studied in artificial saturable absorbers. The ISA in the NALM provides an explanation for experimentally observed phenomena, such as active mode-locking initiation through tapping the fiber or fine-tuning the light polarization. These findings have important implications for optimizing the design of the NALM and reducing the self-starting threshold of high-repetition-rate mode-locked fiber lasers. This study contributes to the theoretical understanding of NALM mode-locked fiber lasers by exploring the ISA mechanism and its impact on the self-starting of mode-locking. The research fills a gap in the existing knowledge regarding the XPM effect in the NALM and its role in pulse formation. This study provides insights into the ISA mechanism in NALM mode-locked fiber lasers and its role in the self-starting of mode-locking. The findings contribute to the optimization of NALM design and the reduction of the self-starting threshold, which are essential for achieving high-repetition-rate operation in fiber lasers. Further research in this area can lead to advancements in the field of mode-locked fiber lasers with NALMs.
Keywords: inverse saturable absorption, NALM, mode-locking, non-linear phase shift
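As context for the differential NPS argument, a textbook idealization of a 50:50 NALM with the amplifier (gain G) at one end of a loop of length L and Kerr coefficient γ gives, for peak power P,

\[
\varphi_{cw} \approx \gamma G P L, \qquad \varphi_{ccw} \approx \gamma P L, \qquad
\Delta\varphi_{NL} = \gamma P L (G - 1), \qquad T = \sin^2\!\left(\frac{\Delta\varphi_{NL}}{2}\right),
\]

so pulses (higher peak power) acquire a larger differential phase than CW light and see higher transmission. This is the saturable-absorber action whose CW/pulse asymmetry the paper re-examines; the idealization above neglects the XPM contribution that the study highlights.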
Procedia PDF Downloads 101
4133 Optimized Passive Heating for Multifamily Dwellings
Authors: Joseph Bostick
Abstract:
A method is presented for decreasing the heating load of HVAC systems in a single-dwelling model of a multifamily building by controlling movable insulation through the optimization of flux, time, surface incident solar radiation, and temperature thresholds. Simulations are completed using a co-simulation between EnergyPlus and MATLAB as an optimization tool to find the optimal control thresholds. Optimization of the control thresholds leads to a significant decrease in total heating energy expenditure.
Keywords: EnergyPlus, MATLAB, simulation, energy efficiency
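A minimal sketch of the kind of threshold rule being optimized; the threshold values are placeholders, and the rule structure is our assumption rather than the paper's exact controller:

```python
def insulation_deployed(surf_flux, solar_irr, t_out,
                        flux_max=-5.0, solar_min=150.0, t_out_max=10.0):
    """Illustrative threshold rule for movable insulation.

    Deploy the insulation when the surface is losing heat (negative flux),
    incident solar radiation is too weak to offset the loss, and it is cold
    outside. The threshold values stand in for the quantities the study
    optimizes via the EnergyPlus/MATLAB co-simulation.
    """
    losing_heat = surf_flux < flux_max   # W/m2, heat flowing outward
    little_sun = solar_irr < solar_min   # W/m2, surface incident solar
    cold_outside = t_out < t_out_max     # deg C, outdoor air temperature
    return losing_heat and little_sun and cold_outside

# Example: a winter night hour
print(insulation_deployed(surf_flux=-12.0, solar_irr=0.0, t_out=-3.0))  # True
```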
Procedia PDF Downloads 176
4132 Using the Smith-Waterman Algorithm to Extract Features in the Classification of Obesity Status
Authors: Rosa Figueroa, Christopher Flores
Abstract:
Text categorization is the problem of assigning a new document to a set of predetermined categories on the basis of a training set of free-text data that contains documents whose category membership is known. To train a classification model, it is necessary to extract characteristics in the form of tokens that facilitate the learning and classification process. In text categorization, the feature extraction process involves the use of word sequences, also known as N-grams. In general, it is expected that documents belonging to the same category share similar features. The Smith-Waterman (SW) algorithm is a dynamic programming algorithm that performs a local sequence alignment in order to determine similar regions between two strings or protein sequences. This work explores the use of the SW algorithm as an alternative to feature extraction in text categorization. The dataset used for this purpose contains 2,610 annotated documents with the classes Obese/Non-Obese. This dataset was represented in matrix form using the Bag of Words approach. The score selected to represent the occurrence of tokens in each document was the term frequency-inverse document frequency (TF-IDF). In order to extract features for classification, four experiments were conducted: the first experiment used SW to extract features, the second used unigrams (single words), the third used bigrams (two-word sequences), and the last used a combination of unigrams and bigrams to extract features for classification. To test the effectiveness of the extracted feature sets for the four experiments, a Support Vector Machine (SVM) classifier was tuned using 20% of the dataset. The remaining 80% of the dataset, together with 5-fold cross validation, was used to evaluate and compare the performance of the four feature extraction experiments. Results from the tuning process suggest that SW performs better than the N-gram based feature extraction. These results were confirmed using the remaining 80% of the dataset, where SW performed best (accuracy = 97.10%, weighted average F-measure = 97.07%). The second best was obtained by the combination of unigrams-bigrams (accuracy = 96.04%, weighted average F-measure = 95.97%), closely followed by the bigrams (accuracy = 94.56%, weighted average F-measure = 94.46%) and finally unigrams (accuracy = 92.96%, weighted average F-measure = 92.90%).
Keywords: comorbidities, machine learning, obesity, Smith-Waterman algorithm
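A compact illustration of how Smith-Waterman local alignment can score similarity between token sequences from two documents; the scoring values and example sentences are illustrative only:

```python
import numpy as np

def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Score of the best local alignment between token sequences a and b."""
    H = np.zeros((len(a) + 1, len(b) + 1))
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i, j] = max(0,                     # local alignment can restart
                          H[i - 1, j - 1] + s,   # match/mismatch
                          H[i - 1, j] + gap,     # gap in b
                          H[i, j - 1] + gap)     # gap in a
    return H.max()

# Word-token sequences from two documents; shared local regions score high.
doc1 = "patient presents with morbid obesity and hypertension".split()
doc2 = "history of morbid obesity and type two diabetes".split()
print(smith_waterman(doc1, doc2))  # 6.0: "morbid obesity and" aligns locally
```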
Procedia PDF Downloads 298
4131 Response of Lepidium Sativum to Ionic Toxicity
Authors: M. F. El-Barghathi, R. El-Tajouri
Abstract:
The effect of different concentrations of cadmium sulfate (CdSO4: 0.0, 10, 50, 100, 500 ppm) was tested on seed germination, seedling elongation, and growth of Lepidium sativum (garden cress) plants. Results indicated that seed germination and seedling elongation were not inhibited by the different concentrations of CdSO4. This suggests that Lepidium sativum may be used as a phytoremediation tool for soils contaminated with cadmium.
Keywords: Lepidium sativum, heavy metals, ionic toxicity, phytoremediation
Procedia PDF Downloads 556
4130 Improve Divers Tracking and Classification in Sonar Images Using Robust Diver Wake Detection Algorithm
Authors: Mohammad Tarek Al Muallim, Ozhan Duzenli, Ceyhun Ilguy
Abstract:
Harbor protection systems are of great importance, and the need for automatic protection systems has increased over the last years. Diver detection active sonar has great significance: it is used to detect underwater threats such as divers and autonomous underwater vehicles. To automatically detect such threats, the sonar image is processed by algorithms that detect, track, and classify underwater objects. In this work, a diver tracking and classification algorithm is improved by proposing a robust wake detection method. To detect objects, the sonar image is normalized and then segmented based on a fixed threshold. Next, the centroids of the segments are found and clustered based on a distance metric. Then, to track the objects, a linear Kalman filter is applied. To reduce the effect of noise and the creation of false tracks, the Kalman tracker is fine-tuned; the tuning is done based on our active sonar specifications. After the tracks are initiated and updated, they are subjected to a filtering stage to eliminate noisy and unstable tracks, as well as objects with a speed outside the diver speed range, such as buoys and fast boats. Afterwards, the resulting tracks are subjected to a classification stage to decide the type of object being tracked; here, the classification stage decides whether the tracked object is an open-circuit or a closed-circuit diver. At the classification stage, a small area around the object is extracted, a novel wake detection method is applied, and the morphological features of the object with its wake are extracted. We used a support vector machine to find the best classifier. The sonar training and test images were collected by ARMELSAN Defense Technologies Company using the portable diver detection sonar ARAS-2023. After applying the algorithm to the test sonar data, we obtain fine and stable tracks of the divers. The total classification accuracy achieved for the diver type is 97%.
Keywords: harbor protection, diver detection, active sonar, wake detection, diver classification
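A minimal sketch of the linear Kalman tracking step described above, for a constant-velocity model with position-only (centroid) measurements; the noise covariances stand in for the sonar-specific tuning mentioned in the abstract:

```python
import numpy as np

dt = 1.0  # sonar ping interval in seconds (illustrative)
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)  # constant-velocity motion model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)  # only the centroid position is measured
Q = 0.05 * np.eye(4)  # process noise: a tuning knob, set from sonar specs
R = 2.0 * np.eye(2)   # measurement noise of the segmented centroids

def kalman_step(x, P, z):
    """One predict/update cycle for a single track."""
    x = F @ x                       # predict state
    P = F @ P @ F.T + Q             # predict covariance
    y = z - H @ x                   # innovation from the new centroid z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

x, P = np.array([0.0, 0.0, 0.5, 0.0]), np.eye(4)  # state: [x, y, vx, vy]
for z in [np.array([0.6, 0.1]), np.array([1.1, 0.0]), np.array([1.6, 0.2])]:
    x, P = kalman_step(x, P, z)
print(x)  # estimated position and velocity; a speed gate can then reject boats
```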
Procedia PDF Downloads 238
4129 Taguchi-Based Surface Roughness Optimization for Slotted and Tapered Cylindrical Products in Milling and Turning Operations
Authors: Vineeth G. Kuriakose, Joseph C. Chen, Ye Li
Abstract:
The research follows a systematic approach to optimize the parameters for parts machined by turning and milling processes. The quality characteristic chosen is surface roughness, since the surface finish plays an important role for parts that require surface contact. A tapered cylindrical surface is designed as a test specimen for the research. The material chosen for machining is aluminum alloy 6061 due to its wide variety of industrial and engineering applications. A HAAS VF-2 TR computer numerical control (CNC) vertical machining center is used for milling, and a HAAS ST-20 CNC machine is used for turning in this research. Taguchi analysis is used to optimize the surface roughness of the machined parts. An L9 orthogonal array is designed for four controllable factors with three levels each, resulting in nine runs per process, i.e., 18 experimental runs in total. The signal-to-noise (S/N) ratio is calculated for achieving the specific target value of 75 ± 15 µin. The controllable parameters chosen for the turning process are feed rate, depth of cut, coolant flow, and finish cut; for the milling process, they are feed rate, spindle speed, step-over, and coolant flow. The uncontrollable factors are tool geometry for the turning process and tool material for the milling process. Hypothesis testing is conducted to study the significance of the different uncontrollable factors on surface roughness. The optimal parameter settings were identified from the Taguchi analysis, and the process capability Cp and process capability index Cpk were improved from 1.76 and 0.02 to 3.70 and 2.10, respectively, for the turning process and from 0.87 and 0.19 to 3.85 and 2.70, respectively, for the milling process. The surface roughness was improved from 60.17 µin to 68.50 µin, reducing the defect rate from 52.39% to 0% for the turning process, and from 93.18 µin to 79.49 µin, reducing the defect rate from 71.23% to 0% for the milling process. The purpose of this study is to efficiently utilize Taguchi design analysis to improve surface roughness.
Keywords: surface roughness, Taguchi parameter design, CNC turning, CNC milling
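For concreteness, the nominal-the-best S/N ratio and the capability indices used above can be computed as below; the roughness readings are hypothetical, while the 60-90 µin limits follow from the stated 75 ± 15 µin specification:

```python
import numpy as np

# Hypothetical roughness readings (µin) after optimization; the specification
# from the study is a target of 75 ± 15 µin, i.e. limits of 60 and 90 µin.
y = np.array([74.2, 76.8, 73.5, 77.1, 75.4, 74.9])
LSL, USL = 60.0, 90.0

mean, std = y.mean(), y.std(ddof=1)

# Taguchi nominal-the-best signal-to-noise ratio
sn = 10 * np.log10(mean**2 / std**2)

# Process capability and capability index
Cp = (USL - LSL) / (6 * std)
Cpk = min(USL - mean, mean - LSL) / (3 * std)
print(f"S/N = {sn:.2f} dB, Cp = {Cp:.2f}, Cpk = {Cpk:.2f}")
```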
Procedia PDF Downloads 158
4128 Cost Based Analysis of Risk Stratification Tool for Prediction and Management of High Risk Choledocholithiasis Patients
Authors: Shreya Saxena
Abstract:
Background: Choledocholithiasis is a common complication of gallstone disease. Risk scoring systems exist to guide the need for further imaging or endoscopy in managing choledocholithiasis. We completed an audit to review the American Society for Gastrointestinal Endoscopy (ASGE) scoring system for the prediction and management of choledocholithiasis against current practice at a tertiary hospital to assess its utility in resource optimisation. We have now conducted a cost-focused sub-analysis of patients categorized as high-risk for choledocholithiasis according to the guidelines, to determine any associated cost benefits. Method: Data collected in our prior audit were used to retrospectively identify thirteen patients considered high-risk for choledocholithiasis. Their ongoing management was mapped against the guidelines. Individual costs for the key investigations were obtained from our hospital financial data. Total costs for the different management pathways identified in clinical practice were calculated and compared against the predicted costs associated with the recommendations in the guidelines. We excluded the cost of laparoscopic cholecystectomy and assumed a set figure for per-day hospital admission expenses. Results: Based on our previous audit data, we identified a 77% positive predictive value for the ASGE risk stratification tool in determining patients at high risk of choledocholithiasis. 47% (6/13) had a magnetic resonance cholangiopancreatography (MRCP) prior to endoscopic retrograde cholangiopancreatography (ERCP), whilst 53% (7/13) went straight to ERCP. The average length of stay in the hospital was 7 days, with an additional day and cost of £328.00 (£117 for ERCP) for patients awaiting an MRCP prior to ERCP. A per-day hospital admission was valued at £838.69. When calculating the total cost, we assumed all patients had admission bloods and ultrasound done as the gold standard. In doing an MRCP prior to ERCP, there was a 130% increase in cost incurred (£580.04 vs £252.04) per patient. When also considering hospital admission and the average length of stay, this amounted to an additional £1166.69 per patient. We then calculated the exact costs incurred by the department, over a three-month period, for all patients, for the key investigations or procedures done in the management of choledocholithiasis. This was compared to an estimated cost derived from the recommended pathways in the ASGE guidelines. Overall, an 81% (£2048.45) saving was associated with following the guidelines compared to clinical practice. Conclusion: MRCP is the most expensive test associated with the diagnosis and management of choledocholithiasis. The ASGE guidelines recommend endoscopy without an MRCP in patients stratified as high-risk for choledocholithiasis. Our audit, which focused on assessing the utility of the ASGE risk scoring system, showed it to be relatively reliable for identifying high-risk patients. Our cost analysis has shown significant cost savings per patient associated with direct endoscopy rather than an additional MRCP, partly because of the increased average length of stay associated with waiting for an MRCP. The above data support the ASGE guidelines for the management of patients at high risk for choledocholithiasis from a cost perspective.
The only caveat is our small data set, which may impact the validity of our average length of hospital stay figures and hence the total cost calculations.
Keywords: cost-analysis, choledocholithiasis, risk stratification tool, general surgery
Procedia PDF Downloads 98
4127 Development of a Computer Aided Diagnosis Tool for Brain Tumor Extraction and Classification
Authors: Fathi Kallel, Abdulelah Alabd Uljabbar, Abdulrahman Aldukhail, Abdulaziz Alomran
Abstract:
The brain is an important organ in our body, since it is responsible for the majority of actions such as vision, memory, etc. However, different diseases such as Alzheimer's and tumors can affect the brain and lead to a partial or full disorder. Regular diagnosis is necessary as a preventive measure and can help doctors detect possible trouble early and take the appropriate treatment, especially in the case of brain tumors. Different imaging modalities are proposed for the diagnosis of brain tumors; the most powerful and most used modality is Magnetic Resonance Imaging (MRI). MRI images are analyzed by a doctor in order to locate an eventual tumor in the brain and prescribe the appropriate and needed treatment. Diverse image processing methods are also proposed to help doctors identify and analyze the tumor. In fact, a large number of Computer Aided Diagnosis (CAD) tools, including developed image processing algorithms, have been proposed and are exploited by doctors as a second opinion to analyze and identify brain tumors. In this paper, we propose a new advanced CAD for brain tumor identification, classification, and feature extraction. Our proposed CAD includes three main parts. First, we load the brain MRI. Second, a robust technique for brain tumor extraction is proposed, based on both the Discrete Wavelet Transform (DWT) and Principal Component Analysis (PCA). DWT is characterized by its multiresolution analytic property, which is why it was applied to MRI images with different decomposition levels for feature extraction. Nevertheless, this technique suffers from a main drawback, since it necessitates huge storage and is computationally expensive. To decrease the dimensions of the feature vector and the computing time, the PCA technique is applied. In the last stage, according to the different extracted features, the brain tumor is classified as either a benign or a malignant tumor using the Support Vector Machine (SVM) algorithm. A CAD tool for brain tumor detection and classification, including all the above-mentioned stages, is designed and developed using the MATLAB GUIDE user interface.
Keywords: MRI, brain tumor, CAD, feature extraction, DWT, PCA, classification, SVM
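A condensed sketch of the DWT, PCA, and SVM pipeline described above, using pywt and scikit-learn on stand-in data; the wavelet, decomposition level, and component count are illustrative choices, not the paper's settings:

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)
images = rng.normal(size=(40, 64, 64))   # stand-ins for MRI slices
labels = rng.integers(0, 2, size=40)     # 0 = benign, 1 = malignant

def dwt_features(img, wavelet="haar", level=2):
    """Flatten the level-2 approximation coefficients into a feature vector."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    return coeffs[0].ravel()             # keep only the approximation band

X = np.array([dwt_features(im) for im in images])
X = PCA(n_components=10).fit_transform(X)  # cut dimensionality and compute time
clf = SVC(kernel="rbf").fit(X, labels)     # benign/malignant decision boundary
print(clf.predict(X[:5]))
```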
Procedia PDF Downloads 252
4126 Using Derivative Free Method to Improve the Error Estimation of Numerical Quadrature
Authors: Chin-Yun Chen
Abstract:
Numerical integration is an essential tool for deriving different physical quantities in engineering and science. The effectiveness of a numerical integrator depends on different factors, of which the crucial one is the error estimation. This work presents an error estimator that incorporates a derivative-free method to improve the performance of verified numerical quadrature.
Keywords: numerical quadrature, error estimation, derivative free method, interval computation
Procedia PDF Downloads 464
4125 Predicting Subsurface Abnormalities Growth Using Physics-Informed Neural Networks
Authors: Mehrdad Shafiei Dizaji, Hoda Azari
Abstract:
The research explores the pioneering integration of Physics-Informed Neural Networks (PINNs) into the domain of Ground-Penetrating Radar (GPR) data prediction, akin to advancements in medical imaging for tracking tumor progression in the human body. This research presents a detailed development framework for a specialized PINN model proficient at interpreting and forecasting GPR data, much like how medical imaging models predict tumor behavior. By harnessing the synergy between deep learning algorithms and the physical laws governing subsurface structures—or, in medical terms, human tissues—the model effectively embeds the physics of electromagnetic wave propagation into its architecture. This ensures that predictions not only align with fundamental physical principles but also mirror the precision needed in medical diagnostics for detecting and monitoring tumors. The suggested deep learning structure comprises three components: a CNN, a spatial feature channel attention (SFCA) mechanism, and ConvLSTM, along with temporal feature frame attention (TFFA) modules. The attention mechanism computes channel attention and temporal attention weights using self-adaptation, thereby fine-tuning the visual and temporal feature responses to extract the most pertinent and significant visual and temporal features. By integrating physics directly into the neural network, our model has shown enhanced accuracy in forecasting GPR data. This improvement is vital for conducting effective assessments of bridge deck conditions and other evaluations related to civil infrastructure. The use of Physics-Informed Neural Networks (PINNs) has demonstrated the potential to transform the field of Non-Destructive Evaluation (NDE) by enhancing the precision of infrastructure deterioration predictions. Moreover, it offers a deeper insight into the fundamental mechanisms of deterioration, viewed through the prism of physics-based models.
Keywords: physics-informed neural networks, deep learning, ground-penetrating radar (GPR), NDE, ConvLSTM, physics, data driven
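The physics-informed part of such a model is typically expressed as a composite loss; the abstract does not give the residual used, so the following, based on the scalar electromagnetic wave equation that governs GPR propagation, is an assumed standard form:

\[
\mathcal{L} = \mathcal{L}_{data} + \lambda\, \mathcal{L}_{phys}, \qquad
\mathcal{L}_{phys} = \frac{1}{N}\sum_{i=1}^{N} \left| \frac{\partial^2 u}{\partial t^2} - c^2 \nabla^2 u \right|^2_{(x_i,\, t_i)},
\]

evaluated at collocation points so that predictions are penalized for violating wave propagation physics as well as for misfitting the GPR data.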
Procedia PDF Downloads 43
4124 NanoFrazor Lithography for advanced 2D and 3D Nanodevices
Authors: Zhengming Wu
Abstract:
NanoFrazor lithography systems were developed as a first true alternative or extension to standard maskless nanolithography methods like electron beam lithography (EBL). In contrast to EBL, they are based on thermal scanning probe lithography (t-SPL). Here, a heatable ultra-sharp probe tip with an apex of a few nm is used for patterning and simultaneously inspecting complex nanostructures. The heat impact from the probe on a thermally responsive resist generates the high-resolution nanostructures. The patterning depth of each individual pixel can be controlled with better than 1 nm precision using an integrated in-situ metrology method. Furthermore, the inherent imaging capability of the NanoFrazor technology allows for markerless overlay, which has been achieved with sub-5 nm accuracy; it also supports stitching layout sections together with <10 nm error. Pattern transfer from such resist features at below 10 nm resolution was demonstrated. The technology has proven its value as an enabler of new kinds of ultra-high resolution nanodevices as well as for improving the performance of existing device concepts. The application range for this new nanolithography technique is very broad, spanning from ultra-high resolution 2D and 3D patterning to chemical and physical modification of matter at the nanoscale. Nanometer-precise markerless overlay and non-invasiveness to sensitive materials are among the key strengths of the technology. However, while patterning at below 10 nm resolution is achieved, significantly increasing the patterning speed at the expense of resolution is not feasible using the heated tip alone. Towards this end, an integrated laser write head for direct laser sublimation (DLS) of the thermal resist has been introduced for significantly faster patterning of micrometer- to millimeter-scale features. Remarkably, the areas patterned by the tip and the laser are seamlessly stitched together, and both processes work on the very same resist material, enabling a true mix-and-match process with no developing or other processing steps in between. The presentation will include examples of (i) high-quality metal contacting of 2D materials, (ii) tuning photonic molecules, (iii) generating nanofluidic devices, and (iv) generating spintronic circuits. Some of these applications have been enabled only by the various unique capabilities of NanoFrazor lithography, such as the absence of damage from a charged particle beam.
Keywords: nanofabrication, grayscale lithography, 2D materials device, nano-optics, photonics, spintronic circuits
Procedia PDF Downloads 72
4123 Reliability and Maintainability Optimization for Aircraft's Repairable Components Based on Cost Modeling Approach
Authors: Adel A. Ghobbar
Abstract:
The airline industry continuously faces the challenge of safely increasing the service life of aircraft with limited maintenance budgets. Operators are looking for the most qualified maintenance providers of aircraft components, offering the finest customer service. The component owner and maintenance provider offer an Abacus agreement (Aircraft Component Leasing) to increase the efficiency and productivity of the customer service. To improve customer service, the current focus on No Fault Found (NFF) units must shift to a focus on Early Failure (EF) units. Since EF units have a significant impact on customer satisfaction, their reliability needs to be increased at minimal cost, which leads to the goal of this paper. By identifying the reliability of Early Failure (EF) units with regard to No Fault Found (NFF) units, and in particular by combining root cause analysis with an integrated cost analysis of EF units using a failure mode analysis tool and a cost model, a set of EF maintenance improvements is derived. The data used for the investigation of the EF units were obtained from the Pentagon system, an Enterprise Resource Planning (ERP) system used by Fokker Services. The Pentagon system monitors components that need to be repaired from Fokker aircraft owners, the Abacus exchange pool, and commercial customers. The data were selected on several criteria: time span, failure rate, and cost driver. Once the selected data had been acquired, the failure mode and root cause analysis of EF units was initiated. The failure analysis approach tool was implemented, resulting in the proposed failure solutions for EF. These lead to specific EF maintenance improvements, which can be set up to decrease the number of EF units and, as a result, increase reliability. The investigated EFs, over a ten-year period, were shown to have a significant reliability impact of 32% on the total of 23,339 unscheduled failures, as the EFs comprise almost one-third of the entire population.
Keywords: supportability, no fault found, FMEA, early failure, availability, operational reliability, predictive model
Procedia PDF Downloads 129
4122 Empowering Indigenous Epistemologies in Geothermal Development
Authors: Te Kīpa Kēpa B. Morgan, Oliver W. Mcmillan, Dylan N. Taute, Tumanako N. Fa'aui
Abstract:
Epistemologies are ways of knowing. Indigenous Peoples are aware that they do not perceive and experience the world in the same way as others. So it is important, when empowering Indigenous epistemologies such as that of the New Zealand Māori, to also be able to represent a scientific understanding within the same analysis. A geothermal development assessment tool has been developed by adapting the Mauri Model Decision Making Framework. Mauri is a metric capable of representing the change in the life-supporting capacity of things and collections of things. The Mauri Model is a method of grouping mauri indicators as dimension averages in order to allow holistic assessment and also to conduct sensitivity analyses for the effect of worldview bias. R-Shiny is the coding platform used for this Vision Mātauranga research, which has created an expert decision support tool (DST) that combines a stakeholder assessment of worldview bias with an impact assessment of mauri-based indicators to determine the sustainability of proposed geothermal development. The initial intention was to develop guidelines for quantifying mātauranga Māori impacts related to geothermal resources. To do this, three typical scenarios were considered: a resource owner wishing to assess the potential for new geothermal development; another party wishing to assess the environmental and cultural impacts of the proposed development; and an assessment that focuses on the holistic sustainability of the resource, including its surface features. Indicator sets and measurement thresholds were developed that are considered necessary for each assessment context, and these have been grouped to represent four mauri dimensions that mirror the four well-being criteria used for resource management in Aotearoa, New Zealand. Two case studies have been conducted to test the DST's suitability for quantifying mātauranga Māori and other biophysical factors related to a geothermal system. This involved estimating mauri0meter values for physical features such as temperature, flow rate, frequency, and colour, and developing indicators to also quantify qualitative observations about the geothermal system made by Māori. A retrospective analysis was then conducted to verify different understandings of the geothermal system. The case studies found that the expert DST is useful for geothermal development assessment, especially where hapū (indigenous sub-tribal groupings) are conflicted regarding the benefits and disadvantages of their own and others' geothermal developments. These results have been supplemented with evaluations of the cumulative impacts of geothermal developments experienced by different parties, using integration techniques applied to the time history curve of the expert DST worldview-bias weighting plotted against the mauri0meter score. Cumulative impacts represent the change in resilience or potential of geothermal systems, which directly assists with the holistic interpretation of change from an Indigenous Peoples' perspective.
Keywords: decision support tool, holistic geothermal assessment, indigenous knowledge, mauri model decision-making framework
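A minimal sketch of the cumulative-impact integration described above, using the trapezoidal rule on a weighted mauri time history; all numbers are illustrative, not case-study values:

```python
import numpy as np

# Illustrative time history (years) of a mauri0meter score in [-1, 1] and a
# single worldview-bias weight from the stakeholder assessment.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
mauri = np.array([0.2, 0.1, -0.1, -0.2, 0.0, 0.1])
bias_weight = 1.25  # hypothetical weighting for one stakeholder worldview

# Cumulative impact = area under the weighted curve (trapezoidal rule).
# A negative value indicates a net loss of resilience/potential over time.
y = bias_weight * mauri
cumulative = np.sum((y[1:] + y[:-1]) / 2.0 * np.diff(t))
print(f"cumulative impact = {cumulative:+.3f}")
```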
Procedia PDF Downloads 187
4121 Positive Impact of Cartoon Movies on Adults
Authors: Yacoub Aljaffery
Abstract:
As much as we think negatively about social media such as TV and smartphones, there are many positive benefits our society can get from them. Cartoons, for example, are made specifically for children. However, in this paper, we will show how cartoon videos can have a positive impact on adults, especially college students. Since cartoons are meant to be a good learning tool for children as well as adults, we will show our audience how cartoons can be used in teaching critical thinking and other language skills.
Keywords: social media, TV, teaching, learning, cartoon movies
Procedia PDF Downloads 324
4120 The Reasons behind Individuals to Join Terrorist Organizations: Recruitment from Outside
Authors: Murat Sözen
Abstract:
Today, terrorism is gaining momentum again. In parallel, it hurts more than before, because it claims victims not only in its own locations but also in remote places. Just as the victims come from outside, the militants likewise come from both local and outside locations. What makes these individuals join terrorist organizations, and how these organizations recruit militants, are questions that remain unanswered. The purpose of this work is to identify the reasons for joining and the power of recruitment. In addition, the role of the most popular recruitment tool, social media, will be examined.
Keywords: recruitment, social media, militants
Procedia PDF Downloads 351
4119 International E-Learning for Assuring Ergonomic Working Conditions of Orthopaedic Surgeons: First Research Outcomes from Train4OrthoMIS
Authors: J. Bartnicka, J. A. Piedrabuena, R. Portilla, L. Moyano - Cuevas, J. B. Pagador, P. Augat, J. Tokarczyk, F. M. Sánchez Margallo
Abstract:
Orthopaedic surgeries are characterized by a high degree of complexity. This is reflected in four main groups of resources: 1) the surgical team, which consists of people with different competencies, educational backgrounds, and positions; 2) information and knowledge about the medical and technical aspects of surgery; 3) medical equipment, including surgical tools and materials; 4) the space infrastructure, which is important from an operating room layout point of view. All these components must be integrated into a homogeneous whole to achieve an efficient and ergonomically correct surgical workflow. Against this background, an international project was formulated, called "Online Vocational Training course on ergonomics for orthopaedic Minimally Invasive Surgery" (Train4OrthoMIS), whose aim is to develop an e-learning tool available in four languages (English, Spanish, Polish, and German). This article presents the first project research outcomes, focused on three aspects: 1) the ergonomic needs of surgeons who work in hospitals in different European countries, 2) the concept and structure of the e-learning course, and 3) the definition of tools and methods for knowledge assessment adjusted to users' expectations. The methodology was based on expert panels and two types of surveys: 1) on training needs, and 2) on evaluation and self-assessment preferences. The major findings of the study allowed the subjects of four training modules and learning sessions to be described. According to people's opinions, the most expected test methods were defined: single-choice tests, followed by "True or False" and "Link elements" quizzes. The first project outcomes confirmed the necessity of creating a universal training tool for orthopaedic surgeons regardless of the country in which they work. Because of the limited time surgeons have, the e-learning course should be strictly adjusted to their expectations in order to be useful.
Keywords: international e-learning, ergonomics, orthopaedic surgery, Train4OrthoMIS
Procedia PDF Downloads 182
4118 Measuring Fundamental Growth Needs in a Youth Boatbuilding Context
Authors: Shane Theunissen, Rob Grandy
Abstract:
Historically, and fairly conventionally within our formal schooling systems, we have used convergent testing, where all the students are expected to converge on the same answer, an answer determined by an external authority that reproduces the knowledge of the hegemon. Many youths may not embody the cultural capital that is rewarded in formal schooling contexts, as they are not able to converge on the required answer determined by the classroom teacher or the administrators. In this paper, we explore divergent processes that promote creative problem-solving. We embody this divergent process in our measurement of fundamental growth needs. To this end, we utilize the Mosaic Approach as a method for implementing the Outcomes That Matter framework. Outcomes That Matter is the name of the measurement tool built around the Circle of Courage framework, which is a way of identifying fundamental growth needs for young people. The Circle of Courage was developed by Martin Brokenleg and colleagues as a way to connect indigenous child-rearing philosophies with contemporary resilience and positive psychology research. The Outcomes That Matter framework puts forward four categories of growth needs for young people: Belonging, which on a macro scale is acceptance into the greater community of practice; Mastery, which includes a constellation of concepts including confidence, motivation, self-actualization, and self-determination; Independence, which refers to a sense of personal power and autonomy within a context where creativity, problem-solving, and a personal voice can begin to emerge; and finally Generosity, which includes interpersonal capacities like conflict resolution and teamwork. Outcomes That Matter puts these four domains into a measurement tool that facilitates collaborative assessment between the youth, teachers, and recreation therapists and allows for youth-led narratives pertaining to their fundamental growth outcomes. This application of the Outcomes That Matter framework is unique, as it may be the first application of this framework in an educational boatbuilding context.
Keywords: collaboration, empowerment, outcomes that matter, mosaic approach, boat building
Procedia PDF Downloads 98
4117 Understanding the First Mental Breakdown from the Families' Perspective Through Metaphors
Authors: Eli Buchbinder
Abstract:
Introduction: Language is the basis of our experience as human beings. We use language to describe our experiences and to construct meaning and narratives from them. Metaphors are a valuable and commonly used linguistic tool. Metaphors link two domains that are ordinarily not related, and in doing so achieve multi-level integration simultaneously: abstract and concrete, rational and imaginative, familiar and unfamiliar, conscious and preconscious/unconscious. As such, metaphors are an epistemological and ontological tool important in social work in every field and domain. Goals and Methods: The presentation aims to validate the value of metaphors through the case of the first psychiatric breakdown, which is traumatic for families. The presentation is based on two pooled qualitative studies. The first study focused on 12 spouses, 7 women and 5 men, between the ages of 22 and 57, regarding their experiences and the meanings of the first psychiatric hospitalization of their partners diagnosed with affective disorders. The second study focused on 10 parents, between the ages of 47 and 62, regarding their experiences and meanings following their child's first psychotic breakdown during young adulthood. Results: Two major types of metaphors evolved from the interviews in framing the trauma of the first mental breakdown. The first mode, orientation (spatial) metaphors, reflects the symbolic expression of the loss of a secure base, represented in the physical environment, e.g., describing hospitalization as "falling into an abyss." The second mode, ontological metaphors, reflects how parents and spouses present their traumatic experiences of hospitalization in terms of discrete, powerful, and coherent entities, e.g., describing the first hospitalization as "swimming against the tide." The two metaphor modes reflect the embodiment of unpredictability, of being mired in distress, shock, and intense pain, and the experience of the collapse of continuity in the life course and the loss of a sense of control. Conclusions: Metaphors are an important and powerful guide in assessing individuals' and families' phenomenological reality. As such, metaphors are useful for understanding and orienting therapeutic intervention, in the studies above with the experience of first psychiatric hospitalization, as well as in other social work interventions.
Keywords: first mental breakdown, metaphors, family perspective, qualitative research
Procedia PDF Downloads 74
4116 Pedagogical Tools In The 21st Century
Authors: M. Aherrahrou
Abstract:
Moroccan education is currently facing many difficulties and problems attributable to traditional methods of teaching. Neuro-Linguistic Programming (NLP) appears to hold much potential for education at all levels. In this paper, the major aim is to explore the effect of certain Neuro-Linguistic Programming techniques in one educational institution in Morocco. Quantitative and qualitative methods are used. The findings support the effectiveness of this new approach in the Moroccan educational context and indicate that it is a promising tool for improving the quality of learning.
Keywords: learning and teaching environment, Neuro-Linguistic Programming, education, quality of learning
Procedia PDF Downloads 355
4115 Implementing Quality Function Deployment Tool for a Customer Driven New Product Development in a Kuwait SME
Authors: Asma AlQahtani, Jumana AlHadad, Maryam AlQallaf, Shoug AlHasan
Abstract:
New product development (NPD) is the complete process of bringing a new product to the customer by integrating two broad divisions: one involving idea generation, product design, and detail engineering; the other involving market research and marketing analysis. It is common practice for companies to undertake some of these tasks simultaneously (concurrent engineering) and to treat them as an ongoing process (continuous development). The current study explores a framework and methodology for a new product development process utilizing the Quality Function Deployment (QFD) tool to bring the customer's opinion into the product development process. An elaborate customer survey with focus groups in the region was carried out to ensure that customer requirements were integrated into new products as early as the design stage, including identifying the recognized need for the new product. A QFD Matrix (House of Quality) was prepared that links customer requirements to product engineering requirements, and a feasibility study and risk assessment exercise was carried out for a Small and Medium Enterprise (SME) in Kuwait for the development of the new product. SMEs in Kuwait, particularly in the manufacturing sector, are mainly focused on serving local demand, and lack of product quality often adversely affects these companies' ability to compete on a regional or global basis. Further, a lack of focus on identifying customer requirements often deters SMEs from envisaging new product development at all. The current study therefore utilizes the QFD Matrix from conceptual design through detail design and, to some extent, extends this link to the design of the manufacturing system. The outcome of the project was the development of a prototype for a new molded product that ensures consistency between the customer's requirements and the measurable characteristics of the product. Engineering economics and cost studies were also undertaken to analyse the viability of the new product, and their results were likewise linked to the successful implementation of the initial QFD Matrix.
Keywords: Quality Function Deployment, QFD Matrix, new product development, NPD, Kuwait SMEs, prototype development
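To make the House of Quality computation concrete, the sketch below ranks engineering characteristics by the weighted sum of their relationship scores against customer-requirement weights, which is the core arithmetic of a QFD Matrix. The requirements, weights, and relationship scores are hypothetical illustrations, not data from the Kuwait SME study.

```python
# Minimal House of Quality (QFD Matrix) sketch: rank engineering
# characteristics by how strongly they serve weighted customer needs.
# All needs, weights, and relationship scores are hypothetical.

customer_needs = {            # need -> importance weight (1-5)
    "easy to clean": 5,
    "durable": 4,
    "low price": 3,
}

engineering_chars = ["surface finish", "wall thickness", "material cost"]

# Relationship matrix: rows follow customer_needs order, columns follow
# engineering_chars order; 9 = strong, 3 = moderate, 1 = weak, 0 = none.
relationships = [
    [9, 1, 0],   # easy to clean
    [3, 9, 1],   # durable
    [0, 3, 9],   # low price
]

# Priority of each engineering characteristic = sum over needs of
# (need weight x relationship strength).
weights = list(customer_needs.values())
priorities = [
    sum(w * row[j] for w, row in zip(weights, relationships))
    for j in range(len(engineering_chars))
]

for char, score in sorted(zip(engineering_chars, priorities),
                          key=lambda pair: -pair[1]):
    print(f"{char}: {score}")
# Prints "surface finish" (57) first, then "wall thickness" (50),
# then "material cost" (31): the design team's attention order.
```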
Procedia PDF Downloads 416
4114 A Simulated Evaluation of Model Predictive Control
Authors: Ahmed AlNouss, Salim Ahmed
Abstract:
Process control refers to the techniques used to control the variables in a process in order to maintain them at their desired values. Advanced process control (APC) is a broad term within the control domain that refers to different kinds of process control and control-related tools, for example, model predictive control (MPC), statistical process control (SPC), fault detection and classification (FDC), and performance assessment. APC is often used for solving multivariable control problems, and model predictive control (MPC) is one of only a few advanced control methods used successfully in industrial control applications. Advanced control is expected to bring many benefits to plant operation; however, the extent of the benefits is plant specific and the application needs a large investment, which calls for an analysis of the expected benefits before the control is implemented. In a real plant, simulation studies are carried out along with some experimentation to determine the improvement in plant performance due to advanced control. In this research, such an exercise is undertaken to assess the needs of an APC application. The main objectives of the paper are as follows: (1) to apply MPC to a number of simulation set-ups and establish the need for MPC by comparing its performance with that of proportional-integral-derivative (PID) controllers; (2) to study the effect of controller parameters on control performance; (3) to develop appropriate performance indices (PI) to compare the performance of different controllers and a novel way to present the tuning map of a controller. These objectives were achieved by applying a PID controller and a particular type of MPC, dynamic matrix control (DMC), to the multi-tank process simulated in Loop-Pro. The controller performance was then evaluated as the controller parameters were varied, using indices computed from the difference between the set point and the process variable so that the two controllers could be compared. The same approach was applied to the continuous stirred tank heater (CSTH) and continuous stirred tank reactor (CSTR) processes simulated in Matlab, for which programs were written to evaluate the performance of the PID and MPC controllers. Finally, these performance indices, along with their controller parameters, were plotted using SigmaPlot. As a result, the improvement in the performance of the control loops was quantified using the relevant indices to justify the need for and importance of advanced process control. It was also shown that, by using appropriate indices, a predictive controller can improve the performance of the control loop significantly.
Keywords: advanced process control (APC), control loop, model predictive control (MPC), proportional-integral-derivative (PID), performance indices (PI)
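The abstract describes performance indices computed from the difference between the set point and the process variable; a standard choice for such indices (assumed here, since the abstract does not name them) is the IAE/ISE family. The Python sketch below computes both for two synthetic step responses; the loop dynamics and data are invented for illustration and are not taken from the Loop-Pro or Matlab simulations.

```python
# Sketch of error-based performance indices for a control loop, assuming
# indices of the standard IAE/ISE form (integral of |error| and error^2).
import numpy as np

def loop_indices(t, setpoint, pv):
    """Return (IAE, ISE) for a set-point/process-variable pair."""
    e = setpoint - pv
    iae = np.trapz(np.abs(e), t)   # integral absolute error
    ise = np.trapz(e ** 2, t)      # integral squared error
    return iae, ise

# Synthetic first-order responses to a unit step set point, for illustration.
t = np.linspace(0.0, 20.0, 401)
setpoint = np.ones_like(t)
pv_sluggish = 1.0 - np.exp(-t / 4.0)   # slow, poorly tuned loop
pv_tight = 1.0 - np.exp(-t / 1.0)      # faster, well-tuned loop

for name, pv in [("sluggish", pv_sluggish), ("tight", pv_tight)]:
    iae, ise = loop_indices(t, setpoint, pv)
    print(f"{name}: IAE = {iae:.2f}, ISE = {ise:.2f}")
# The faster loop yields smaller IAE/ISE. Sweeping controller parameters
# and plotting these indices against them produces the kind of tuning map
# described in the abstract.
```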
Procedia PDF Downloads 407