Search results for: Bayesian approach
13246 Unified Assessment of Power System Reserve-based Reliability Levels
Authors: B. M. Alshammari, M. A. El-Kady
Abstract:
This paper presents a unified framework for the assessment of reserve-based reliability levels in electric power systems. The unified approach is based on reserve-based analysis and assessment of the relationship between available generation capacities and required demand levels. The developed approach takes into account load variations as well as contingencies which occur randomly, causing some generation and/or transmission capacities to be lost (become unavailable). The calculated reserve-based indices, which are important for assessing the reserve capabilities of the power system for various operating scenarios, are therefore probabilistic in nature. They reflect the fact that neither the load levels nor the generation or transmission capacities are known with absolute certainty; rather, they are subject to random variations. Consequently, the calculated reserve-based reliability indices are all subject to random variations, and only their expected values can be evaluated. This paper presents a unified approach to reserve-based reliability assessment of power systems using various reserve assessment criteria. Practical applications to the Saudi electricity power grid are also presented for demonstration purposes.
Keywords: assessment, power system, reserve, reliability
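As a minimal sketch (not the authors' unified framework), the probabilistic nature of reserve-based indices can be illustrated by Monte Carlo sampling of random unit outages and load variations, from which only expected values of the indices are estimated; the unit capacities, outage rates and load parameters below are assumed for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative system: unit capacities (MW) and forced outage rates (assumed values)
unit_capacity = np.array([400, 400, 300, 300, 200, 200, 150, 150])
forced_outage_rate = np.array([0.05, 0.05, 0.04, 0.04, 0.03, 0.03, 0.02, 0.02])

# Load modelled as a random variation around a forecast peak (assumed values)
load_mean, load_std = 1600.0, 120.0

n_samples = 200_000
# Sample unit availability (True = available) and the corresponding load level
available = rng.random((n_samples, unit_capacity.size)) > forced_outage_rate
capacity = (available * unit_capacity).sum(axis=1)
load = rng.normal(load_mean, load_std, n_samples)

reserve = capacity - load
# Expected values of reserve-based indices (probabilistic, as described in the abstract)
expected_reserve_margin = reserve.mean() / load_mean
loss_of_load_probability = (reserve < 0).mean()
expected_power_not_supplied = np.clip(-reserve, 0, None).mean()

print(f"Expected reserve margin: {expected_reserve_margin:.2%}")
print(f"Loss-of-load probability: {loss_of_load_probability:.4f}")
print(f"Expected power not supplied: {expected_power_not_supplied:.1f} MW")
```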
Procedia PDF Downloads 617
13245 Commoning as an Approach to Community Planning: An Inquiry into the Role of Urban Local Bodies and Commoners
Authors: Pruthvi Nath Palleti, Sarmada Madhulika Kone
Abstract:
Communities are formed on the basis of the commonalities that exist among a set of individuals, when the group comes together to identify those commonalities and to achieve its common goals. Thus, community planning, with its vision to strengthen the community, mostly involves the making or remaking of commons, which results in the making or remaking of communities. This paper looks into different practices of planning around the world and tries to establish a link between commoning (the act of exercising rights over commons by commoners) and the participatory approach to community planning.
Keywords: commoners, commoning, community, participatory planning, urban local bodies
Procedia PDF Downloads 374
13244 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions
Authors: Valerii Dashuk
Abstract:
The usage of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied. One such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with the help of this technique allows checking the shift and the probability of that shift (i.e., portfolio risks) simultaneously. Another application is based on the normal distribution, which is fully defined by mean and variance and can therefore be tested using the introduced approach. This method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters that should be compared (whether they may be considered identical at a given significance level). Then the absolute difference in probabilities at each 'point' of the domain of these distributions is calculated. This measure is transformed into a function of cumulative distribution functions and compared to the critical values. The table of critical values was designed from simulations. The approach was compared with other techniques for the univariate case. It differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are also strong points of the method. The main advantage of this approach is the possibility to extend it to the infinite-dimensional case, which was not possible in most of the previous works. At the moment, the expansion to the two-dimensional case has been completed, which allows testing up to five parameters jointly. Therefore, the derived technique is equivalent to classic tests in standard situations but gives more efficient alternatives in nonstandard problems and on large amounts of data.
Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function
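As an illustrative sketch only (the paper's exact measure and simulated critical-value table are not reproduced), the core building block — the integrated absolute difference between two normal probability density functions, which can equivalently be written through their cumulative distribution functions at the crossing points — could be computed as follows; the critical value used here is a placeholder.

```python
import numpy as np
from scipy import stats

def pdf_difference_measure(mu1, sd1, mu2, sd2, n_grid=20001):
    """Numerically integrate the absolute difference of two normal PDFs.

    The integral equals the total variation distance and can equivalently be
    expressed via the CDFs evaluated at the crossing points of the PDFs;
    here it is simply approximated on a fine grid.
    """
    lo = min(mu1 - 6 * sd1, mu2 - 6 * sd2)
    hi = max(mu1 + 6 * sd1, mu2 + 6 * sd2)
    x = np.linspace(lo, hi, n_grid)
    diff = np.abs(stats.norm.pdf(x, mu1, sd1) - stats.norm.pdf(x, mu2, sd2))
    return np.trapz(diff, x) / 2.0  # value in [0, 1]

# Compare a sample's fitted parameters against hypothesised ones
rng = np.random.default_rng(1)
sample = rng.normal(loc=0.3, scale=1.2, size=500)
measure = pdf_difference_measure(sample.mean(), sample.std(ddof=1), 0.0, 1.0)

critical_value = 0.15  # placeholder: the paper obtains critical values by simulation
print(f"measure = {measure:.3f}, reject H0: {measure > critical_value}")
```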
Procedia PDF Downloads 176
13243 Social Semantic Web-Based Analytics Approach to Support Lifelong Learning
Authors: Khaled Halimi, Hassina Seridi-Bouchelaghem
Abstract:
The purpose of this paper is to describe how learning analytics approaches based on social semantic web techniques can be applied to enhance lifelong learning experiences from a connectivist perspective. For this reason, a prototype of a system called SoLearn (Social Learning Environment) that supports this approach was developed. We observed and studied literature related to lifelong learning systems, the social semantic web and ontologies, connectivism theory, and learning analytics approaches, and reviewed implemented systems based on these fields to extract and draw conclusions about the features necessary for enhancing the lifelong learning process. The semantic analytics of learning can be used for viewing, studying and analysing the massive data generated by learners, which helps them to understand, through recommendations, charts and figures, their learning and behaviour, and to detect where they have weaknesses or limitations. This paper emphasises that implementing a learning analytics approach based on social semantic web representations can enhance the learning process. On the one hand, the analysis process leverages the meaning expressed by the semantics presented in the ontology (relationships between concepts). On the other hand, the analysis process exploits the discovery of new knowledge by means of the inference mechanism of the semantic web.
Keywords: connectivism, learning analytics, lifelong learning, social semantic web
Procedia PDF Downloads 217
13242 User-Perceived Quality Factors for Certification Model of Web-Based System
Authors: Jamaiah H. Yahaya, Aziz Deraman, Abdul Razak Hamdan, Yusmadi Yah Jusoh
Abstract:
One of the most essential issues for software products is to maintain their relevancy to the dynamics of the users' requirements and expectations. Many studies have been carried out on the quality aspects of software products to overcome these problems. Previous software quality assessment models and metrics have been introduced, with strengths and limitations. In order to enhance the assurance and buoyancy of software products, certification models have been introduced and developed. From our previous experiences in certification exercises and case studies, collaborating with several agencies in Malaysia, the requirements for a user-based software certification approach were identified and demanded. The emergence of social network applications, new development approaches such as the agile method, and other varieties of software in the market have led to the domination of users over the software. As software becomes more accessible to the public through internet applications, users are becoming more critical of the quality of the services provided by the software. There are several categories of users in web-based systems, with different interests and perspectives. The classifications and metrics are identified through a brainstorming approach that includes researchers, users and experts in this area. The new paradigm in software quality assessment is the main focus of our research. This paper discusses the classifications of users in web-based software system assessment and their associated factors and metrics for quality measurement. The quality model is derived based on the IEEE structure and the FCM model. The developments are beneficial and valuable for overcoming the constraints and improving the application of the software certification model in the future.
Keywords: software certification model, user centric approach, software quality factors, metrics and measurements, web-based system
Procedia PDF Downloads 406
13241 Image Processing Approach for Detection of Three-Dimensional Tree-Rings from X-Ray Computed Tomography
Authors: Jorge Martinez-Garcia, Ingrid Stelzner, Joerg Stelzner, Damian Gwerder, Philipp Schuetz
Abstract:
Tree-ring analysis is an important part of the quality assessment and the dating of (archaeological) wood samples. It provides quantitative data about the whole anatomical ring structure, which can be used, for example, to measure the impact of the fluctuating environment on tree growth, for the dendrochronological analysis of archaeological wooden artefacts, and to estimate the mechanical properties of wood. Despite advances in computer vision and edge recognition algorithms, detection and counting of annual rings are still limited to 2D datasets and are in most cases performed manually, which is a time-consuming, tedious task that depends strongly on the operator's experience. This work presents an image processing approach to detect the whole 3D tree-ring structure directly from X-ray computed tomography imaging data. The approach relies on a modified Canny edge detection algorithm, which captures fully connected tree-ring edges throughout the measured image stack and is validated on X-ray computed tomography data taken from six wood species.
Keywords: ring recognition, edge detection, X-ray computed tomography, dendrochronology
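A minimal sketch of the baseline step this approach builds on — a standard Canny detector applied slice by slice to a CT image stack — is shown below; the paper's modification that keeps ring edges fully connected across the stack is not reproduced, and the input volume is a synthetic stand-in.

```python
import numpy as np
from skimage import feature

def detect_ring_edges(ct_volume, sigma=2.0):
    """Apply a standard Canny detector slice by slice to a CT stack.

    ct_volume: 3D array (slices, rows, cols) of X-ray CT grey values.
    Returns a boolean volume marking candidate tree-ring edges. The published
    approach additionally modifies Canny so that ring edges stay fully
    connected across the measured stack; that step is not reproduced here.
    """
    edges = np.zeros(ct_volume.shape, dtype=bool)
    for i, ct_slice in enumerate(ct_volume):
        edges[i] = feature.canny(ct_slice.astype(float), sigma=sigma)
    return edges

# Stand-in volume in place of real CT data
volume = np.random.default_rng(0).normal(size=(4, 128, 128))
ring_edges = detect_ring_edges(volume)
print(ring_edges.shape, int(ring_edges.sum()))
```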
Procedia PDF Downloads 221
13240 Inclusive Cities Decision Matrix Based on a Multidimensional Approach for Sustainable Smart Cities
Authors: Madhurima S. Waghmare, Shaleen Singhal
Abstract:
The concepts of smartness, inclusion, and sustainability are multidisciplinary and fuzzy, rooted in economic and social development theories and policies that are reflected in the spatial development of cities. It is a challenge to convert these concepts from aspirations into transforming actions. There is a dearth of assessment and planning tools to support city planners and administrators in developing smart, inclusive, and sustainable cities. To address this gap, this study develops an inclusive cities decision matrix based on an exploratory approach and using mixed methods. The matrix is grounded in a review of multidisciplinary urban sector literature and is refined and finalized based on inputs from experts and insights from case studies. The application of the decision matrix to the case study cities in India suggests that contemporary planning tools for cities need to be multidisciplinary and flexible in order to respond to the unique needs of diverse contexts. The paper suggests that a multidimensional and inclusive approach to city planning can play an important role in building sustainable smart cities.
Keywords: inclusive-cities decision matrix, smart cities in India, city planning tools, sustainable cities
Procedia PDF Downloads 156
13239 Reverse Logistics Information Management Using Ontological Approach
Authors: F. Lhafiane, A. Elbyed, M. Bouchoum
Abstract:
The Reverse Logistics (RL) process is considered a complex and dynamic network that involves many stakeholders, such as suppliers, manufacturers, warehouses, retailers, and customers. This complexity is inherent in such a process due to the lack of perfect knowledge or to conflicting information. Ontologies, on the other hand, can be considered as an approach to overcome the problem of sharing knowledge and communication among the various reverse logistics partners. In this paper, we propose a semantic representation based on a hybrid architecture for building the ontologies in a bottom-up way. This method facilitates the semantic reconciliation between the heterogeneous information systems (ICT) that support reverse logistics processes and product data.
Keywords: Reverse Logistics, information management, heterogeneity, ontologies, semantic web
Procedia PDF Downloads 492
13238 A Data Mining Approach for Analysing and Predicting the Bank's Asset Liability Management Based on Basel III Norms
Authors: Nidhin Dani Abraham, T. K. Sri Shilpa
Abstract:
Asset liability management is an important aspect of the banking business. Moreover, today's banking is based on Basel III, which strictly regulates counterparty default. This paper focuses on the prediction and analysis of counterparty default risk, which is a type of risk that occurs when customers fail to repay the amount to the lender (a bank or any financial institution). This paper proposes an approach to reduce the counterparty risk occurring in financial institutions using an appropriate data mining technique and thus predicts the occurrence of NPAs. It also helps in asset building and restructuring quality. Liability management is very important for carrying out banking business. To know and analyze the depth of a bank's liability, a suitable technique is required. For that, a data mining technique is used to predict the dormant behaviour of various deposit bank customers. Various models are implemented, and the results for savings bank deposit customers are analyzed. All these data from the bank data warehouse are cleaned using a data cleansing approach.
Keywords: data mining, asset liability management, BASEL III, banking
Procedia PDF Downloads 555
13237 Analyzing Water Waves in Underground Pumped Storage Reservoirs: A Combined 3D Numerical and Experimental Approach
Authors: Elena Pummer, Holger Schuettrumpf
Abstract:
To date, underground pumped storage plants do not exist, although they are an outstanding alternative to classical pumped storage plants. They are needed to ensure the required balance between the production and demand of energy. As short- to medium-term storage, pumped storage plants have been used economically over a long period of time, but their expansion is locally limited. The reasons are, in particular, the required topography and the extensive human land use. Through the use of underground reservoirs instead of surface lakes, expansion options could be increased. While fulfilling the same functions, several hydrodynamic processes result in the specific design of the underground reservoirs and must be implemented in the planning process of such systems. A combined 3D numerical and experimental approach leads to currently unknown results about the occurring wave types and their behaviour as a function of different design and operating criteria. For the 3D numerical simulations, OpenFOAM was used and combined with an experimental approach in the laboratory of the Institute of Hydraulic Engineering and Water Resources Management at RWTH Aachen University, Germany. Using the finite-volume method and an explicit time discretization, a RANS simulation (k-ε) was run. Convergence analyses for different time discretizations, different meshes, etc., and clear comparisons between both approaches lead to the result that the numerical and experimental models can be combined and used as a hybrid model. Undular bores, partly with secondary waves, and breaking bores occurred in the underground reservoir. Different water levels and discharges change the global effects, defined as the time-dependent average of the water level, as well as the local processes, defined as the single, local hydrodynamic processes (water waves). Design criteria, like branches, directional changes, changes in cross-section or bottom slope, as well as changes in roughness, have a great effect on the local processes, while the global effects remain unaffected. Design calculations for underground pumped storage plants were developed on the basis of existing formulae and the results of the hybrid approach. Using the design calculations, reservoir heights as well as oscillation periods can be determined, which leads to knowledge of the construction and operation possibilities of the plants. Consequently, future plants can be hydraulically optimized by applying the design calculations to the local boundary conditions.
Keywords: energy storage, experimental approach, hybrid approach, undular and breaking bores, 3D numerical approach
Procedia PDF Downloads 213
13236 Use of SUDOKU Design to Assess the Implications of the Block Size and Testing Order on Efficiency and Precision of Dulce De Leche Preference Estimation
Authors: Jéssica Ferreira Rodrigues, Júlio Silvio De Sousa Bueno Filho, Vanessa Rios De Souza, Ana Carla Marques Pinheiro
Abstract:
This study aimed to evaluate the implications of the block size and testing order on the efficiency and precision of preference estimation for Dulce de leche samples. Efficiency was defined as the inverse of the average variance of pairwise comparisons among treatments. Precision was defined as the inverse of the variance of the estimates of treatment means (or effects). The experiment was originally designed to test 16 treatments as a series of 8 Sudoku 16x16 designs, 4 randomized independently and 4 others in the reverse order, to yield balance in testing order. Linear mixed models were fitted to the whole experiment, with 112 testers and all their grades, as well as to their partially balanced subgroups, namely: a) the experiment with the four initial EU; b) the experiment with EU 5 to 8; c) the experiment with EU 9 to 12; and d) the experiment with EU 13 to 16. To record responses, we used a nine-point hedonic scale; a mixed linear model analysis was assumed, with random tester and treatment effects and a fixed testing-order effect. The analysis of a cumulative random-effects probit link model was very similar, with essentially no different conclusions, and for simplicity we present the results using the Gaussian assumption. The R-CRAN library lme4 and its function lmer (fit linear mixed-effects models) were used for the mixed models, and the libraries Bayesthresh (default Gaussian threshold function) and ordinal, with the function clmm (cumulative link mixed model), were used to check the Bayesian analysis of threshold models and the cumulative link probit models. It was noted that the number of samples tested in the same session can influence the acceptance level, underestimating the acceptance. However, providing a large number of samples can help to improve sample discrimination.
Keywords: acceptance, block size, mixed linear model, testing order
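The authors worked in R (lme4::lmer, Bayesthresh, ordinal::clmm); a rough Python analogue of the Gaussian mixed model, with a fixed testing-order effect, a random tester effect and a random treatment effect, might look like the sketch below, where the simulated data frame and its column names are assumptions rather than the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format sensory data: one row per tester x sample evaluation.
# Column names (tester, treatment, order, score) are assumptions, not the paper's.
rng = np.random.default_rng(2)
df = pd.DataFrame({
    "tester": np.repeat(np.arange(112), 16),
    "treatment": np.tile(np.arange(16), 112),
    "order": np.tile(np.arange(16), 112),          # position of the sample in the session
    "score": rng.integers(1, 10, size=112 * 16),   # nine-point hedonic scale
})

# Gaussian mixed model: fixed testing-order effect,
# random tester effect (groups) and random treatment effect (variance component).
model = smf.mixedlm(
    "score ~ C(order)",
    data=df,
    groups=df["tester"],
    vc_formula={"treatment": "0 + C(treatment)"},
)
result = model.fit()
print(result.summary())
```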
Procedia PDF Downloads 322
13235 A Dynamic Ensemble Learning Approach for Online Anomaly Detection in Alibaba Datacenters
Authors: Wanyi Zhu, Xia Ming, Huafeng Wang, Junda Chen, Lu Liu, Jiangwei Jiang, Guohua Liu
Abstract:
Anomaly detection is a first and imperative step needed to respond to unexpected problems and to assure high performance and security in large data center management. This paper presents an online anomaly detection system built on an innovative approach of ensemble machine learning and adaptive differentiation algorithms, and applies it to performance data collected from a continuous monitoring system for multi-tier web applications running in Alibaba data centers. We evaluate the effectiveness and efficiency of this algorithm with production traffic data and compare it with traditional anomaly detection approaches such as a static threshold and other deviation-based detection techniques. The experimental results show that our algorithm correctly identifies the unexpected performance variances of any running application, with an acceptable false positive rate. This proposed approach has already been deployed in real-time production environments to enhance the efficiency and stability of daily data center operations.
Keywords: Alibaba data centers, anomaly detection, big data computation, dynamic ensemble learning
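The production algorithm itself is not described in the abstract; purely to illustrate the general idea of a dynamic ensemble, the toy sketch below combines two simple base detectors on a sliding window and adapts their weights online. The detectors, the weighting rule and the data are assumptions, not Alibaba's method.

```python
import numpy as np

def zscore_detector(window, x, k=3.0):
    mu, sd = window.mean(), window.std() + 1e-9
    return abs(x - mu) > k * sd

def iqr_detector(window, x, k=1.5):
    q1, q3 = np.percentile(window, [25, 75])
    iqr = q3 - q1 + 1e-9
    return x < q1 - k * iqr or x > q3 + k * iqr

def online_ensemble(stream, window_size=60):
    """Toy dynamic ensemble: detector weights drift toward the detectors that
    agree with the ensemble's own recent decisions (illustrative rule only)."""
    detectors = [zscore_detector, iqr_detector]
    weights = np.ones(len(detectors)) / len(detectors)
    flags = []
    for t in range(window_size, len(stream)):
        window, x = stream[t - window_size:t], stream[t]
        votes = np.array([float(d(window, x)) for d in detectors])
        decision = weights @ votes > 0.5
        flags.append(decision)
        # Adaptive step: reward detectors that matched the ensemble decision
        weights = 0.9 * weights + 0.1 * (votes == float(decision))
        weights /= weights.sum()
    return np.array(flags)

rng = np.random.default_rng(3)
metric = rng.normal(100, 5, 500)   # stand-in for a monitored performance metric
metric[400] = 180                  # injected performance anomaly
print(online_ensemble(metric)[400 - 60])
```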
Procedia PDF Downloads 203
13234 Knowledge Sharing and Organizational Performance: A System Dynamics Approach
Authors: Shachi Pathak
Abstract:
We are living in a knowledge-based economy where firms can gain a competitive advantage by managing knowledge within the organization. The purpose of the study is to develop a conceptual model to explain the relationship between the factors affecting knowledge sharing in an organization (called knowledge enablers), knowledge sharing activities, and organizational performance, using a system dynamics approach. This research is important since it will provide a better understanding of the key knowledge enablers that support knowledge sharing activities, and of how knowledge sharing activities affect an organization's capability to enhance its performance.
Keywords: knowledge management, knowledge sharing, organizational performance, system dynamics
Procedia PDF Downloads 376
13233 Exploring the Applications of Neural Networks in the Adaptive Learning Environment
Authors: Baladitya Swaika, Rahul Khatry
Abstract:
Computer Adaptive Tests (CATs) are one of the most efficient ways of testing the cognitive abilities of students. CATs are based on Item Response Theory (IRT), in which item selection and ability estimation use the statistical methods of maximum-information selection (or selection from the posterior) and maximum-likelihood (ML) or maximum a posteriori (MAP) estimators, respectively. This study aims at combining both classical and Bayesian approaches to IRT to create a dataset which is then fed to a neural network that automates the process of ability estimation, and at comparing it to traditional CAT models designed using IRT. This study uses Python as the base coding language, pymc for statistical modelling of the IRT, and scikit-learn for the neural network implementations. On creation of the model and on comparison, it is found that the neural network based model performs 7-10% worse than the IRT model for score estimation. Although it performs poorly compared to the IRT model, the neural network model can be beneficially used in back-ends to reduce time complexity, as the IRT model would have to re-calculate the ability every time it receives a request, whereas the prediction from a neural network can be done in a single step with an existing trained regressor. This study also proposes a new kind of framework whereby the neural network model can be used to incorporate feature sets other than the normal IRT feature set and use a neural network's capacity for learning unknown functions to give rise to better CAT models. Categorical features like test type, etc., could be learnt and incorporated into IRT functions with the help of techniques like logistic regression, and can be used to learn functions expressed as models which may not be trivial to express via equations. This kind of framework, when implemented, would be highly advantageous in psychometrics and cognitive assessments. This study gives a brief overview of how neural networks can be used in adaptive testing, not only by reducing time complexity but also by being able to incorporate newer and better datasets, which would eventually lead to higher-quality testing.
Keywords: computer adaptive tests, item response theory, machine learning, neural networks
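A compact sketch of the two pieces being compared — a classical maximum-likelihood ability estimate for a 2PL IRT model, and a neural-network regressor that predicts ability from the same response pattern in a single forward pass — is given below; the item parameters and simulated responses are assumptions, and this is not the study's pymc/scikit-learn code.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
n_items = 30
a = rng.uniform(0.8, 2.0, n_items)     # discrimination (assumed 2PL item parameters)
b = rng.normal(0.0, 1.0, n_items)      # difficulty

def p_correct(theta):
    # 2PL probability of a correct response given ability theta
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def ml_ability(responses):
    """Classical ML ability estimate (what the IRT model recomputes per request)."""
    def neg_log_lik(theta):
        p = np.clip(p_correct(theta), 1e-6, 1 - 1e-6)
        return -np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
    return minimize_scalar(neg_log_lik, bounds=(-4, 4), method="bounded").x

# Build a training set of (response pattern, ML ability estimate) pairs
thetas = rng.normal(0, 1, 2000)
probs = p_correct(thetas[:, None])                    # shape (2000, n_items)
X = (rng.random(probs.shape) < probs).astype(float)
y = np.array([ml_ability(row) for row in X])

# Neural-network surrogate: one forward pass instead of re-running ML estimation
net = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
net.fit(X[:1500], y[:1500])
pred = net.predict(X[1500:])
print("mean abs. gap to ML estimate:", np.abs(pred - y[1500:]).mean())
```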
Procedia PDF Downloads 176
13232 Anthraquinone Labelled DNA for Direct Detection and Discrimination of Closely Related DNA Targets
Authors: Sarah A. Goodchild, Rachel Gao, Philip N. Bartlett
Abstract:
A novel detection approach using immobilized DNA probes labeled with Anthraquinone (AQ) as an electrochemically active reporter moiety has been successfully developed as a new, simple, reliable method for the detection of DNA. This method represents a step forward in DNA detection, as it can discriminate between multiple nucleotide polymorphisms within target DNA strands without the need for any additional reagents, reporters or processes such as melting of the DNA strands. The detection approach utilizes single-stranded DNA probes immobilized on gold surfaces and labeled at the distal terminus with AQ. The effective immobilization has been monitored using techniques such as AC impedance and Raman spectroscopy. Simple voltammetry techniques (Differential Pulse Voltammetry, Cyclic Voltammetry) are then used to monitor the reduction potential of the AQ before and after the addition of the complementary strand of target DNA. A reliable relationship between the shift in reduction potential and the number of base-pair mismatches has been established and can be used to discriminate between DNA from highly related pathogenic organisms of clinical importance. This indicates that the approach may have great potential to be exploited within biosensor kits for the detection and diagnosis of pathogenic organisms in point-of-care devices.
Keywords: Anthraquinone, discrimination, DNA detection, electrochemical biosensor
Procedia PDF Downloads 394
13231 Adaptive Multiple Transforms Hardware Architecture for Versatile Video Coding
Authors: T. Damak, S. Houidi, M. A. Ben Ayed, N. Masmoudi
Abstract:
The Versatile Video Coding (VVC) standard is currently under development by the Joint Video Exploration Team (JVET). An Adaptive Multiple Transforms (AMT) approach has been announced. It is based on different transform modules that provide efficient coding. However, the AMT solution raises several issues, especially regarding the complexity of the selected set of transforms. This can be an important issue, particularly for future industrial adoption. This paper proposes an efficient hardware implementation of the most used transform in the AMT approach: the DCT-II. The developed circuit is adapted to different block sizes and can reach a minimum frequency of 192 MHz, allowing an optimized execution time.
Keywords: adaptive multiple transforms, AMT, DCT II, hardware, transform, versatile video coding, VVC
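The paper targets a hardware circuit; as a software reference model only (not the VVC/AMT implementation), the DCT-II that such a circuit computes can be checked against a library routine for several block sizes as follows.

```python
import numpy as np
from scipy.fft import dct

def dct_ii_matrix(n):
    """Orthonormal DCT-II transform matrix of size n x n (reference model)."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    mat = np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    mat[0] *= 1 / np.sqrt(2)
    return mat * np.sqrt(2.0 / n)

# Check the matrix form against scipy's DCT-II for several block sizes
for n in (4, 8, 16, 32):
    x = np.random.default_rng(0).integers(-255, 256, n).astype(float)
    assert np.allclose(dct_ii_matrix(n) @ x, dct(x, type=2, norm="ortho"))
    print(f"DCT-II reference verified for block size {n}")
```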
Procedia PDF Downloads 147
13230 Leveraging Unannotated Data to Improve Question Answering for French Contract Analysis
Authors: Touila Ahmed, Elie Louis, Hamza Gharbi
Abstract:
State-of-the-art question answering models have recently shown impressive performance, especially in a zero-shot setting. This approach is particularly useful when confronted with a highly diverse domain such as the legal field, in which it is increasingly difficult to have a dataset covering every notion and concept. In this work, we propose a flexible generative question answering approach to contract analysis, as well as a weakly supervised procedure to leverage unannotated data and boost our models' performance in general, and their zero-shot performance in particular.
Keywords: question answering, contract analysis, zero-shot, natural language processing, generative models, self-supervision
Procedia PDF Downloads 196
13229 Adaptive Control Approach for an Unmanned Aerial Manipulator
Authors: Samah Riache, Madjid Kidouche
Abstract:
In this paper, we propose a nonlinear controller for an Aerial Manipulator (AM) consisting of a quadrotor equipped with a two-degrees-of-freedom robotic arm. The kinematic and dynamic models were developed by considering the aerial manipulator as a coupled system. The proposed controller was designed using Nonsingular Terminal Sliding Mode Control. The objective of our approach is to improve performance and attenuate the chattering drawback using an adaptive algorithm in the discontinuous control part. Simulation results prove the effectiveness of the proposed control strategy compared with a Sliding Mode Controller.
Keywords: adaptive algorithm, quadrotor, robotic arm, sliding mode control
Procedia PDF Downloads 186
13228 An Approach for Ensuring Data Flow in Freight Delivery and Management Systems
Authors: Aurelija Burinskienė, Dalė Dzemydienė, Arūnas Miliauskas
Abstract:
This research aims at developing an approach for more effective freight delivery and transportation process management. Road congestion and the identification of its causes are important, as well as the recognition and management of context information. Measuring many parameters during the transportation period and properly controlling driver work have become a problem. The number of vehicles per time unit passing at a given time and point can, in some situations, be evaluated for drivers. The collection of data is mainly used to establish new trips. The flow of the data is more complex in urban areas. Herein, the movement of freight is reported in detail, including information at the street level. When traffic density is extremely high in congestion cases and the traffic speed is incredibly low, data transmission reaches its peak. Different data sets are generated, depending on the type of freight delivery network. There are three types of networks: long-distance delivery networks, last-mile delivery networks and mode-based delivery networks; the last one includes different modes, in particular railways and other networks. When freight delivery is switched from one of the above-stated network types to another, more data can be included for reporting purposes, and vice versa. In this case, a significant amount of these data is used for control operations, and the problem requires an integrated methodological approach. The paper presents an approach for providing e-services for drivers by including the assessment of the multi-component infrastructure needed for the delivery of freight according to the network type. The construction of such a methodology is required to evaluate data flow conditions and overloads, and to minimize the time gaps in data reporting. The results obtained show the possibilities of the proposed methodological approach to support management and decision-making processes with the functionality of incorporating networking specifics, by helping to minimize the overloads in data reporting.
Keywords: transportation networks, freight delivery, data flow, monitoring, e-services
Procedia PDF Downloads 129
13227 Human Action Retrieval System Using Features Weight Updating Based Relevance Feedback Approach
Authors: Munaf Rashid
Abstract:
For content-based human action retrieval systems, search accuracy is often inferior for the following two reasons: 1) global information pertaining to videos is totally ignored, and only low-level motion descriptors are considered a significant feature for matching the similarity between query and database videos; and 2) there is a semantic gap between the high-level user concept and low-level visual features. Hence, in this paper, we propose a method that addresses these two issues and, in doing so, this paper contributes in two ways. Firstly, we introduce a method that uses both global and local information in one framework for an action retrieval task. Secondly, to minimize the semantic gap, the user concept is involved by incorporating a feature weight updating (FWU) relevance feedback (RF) approach. We use statistical characteristics to dynamically update the weights of the feature descriptors so that after every RF iteration the feature space is modified accordingly. For testing and validation purposes, two human action recognition datasets have been utilized, namely Weizmann and UCF. Results show that even with a number of visual challenges, the proposed approach performs well.
Keywords: relevance feedback (RF), action retrieval, semantic gap, feature descriptor, codebook
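The paper's exact statistical weight-update rule is not given in the abstract; a common illustration of the idea — weighting each feature inversely to its standard deviation over the clips marked relevant, then re-ranking with a weighted distance — is sketched below, with placeholder descriptors in place of real action features.

```python
import numpy as np

def rf_rerank(query_vec, db_vecs, relevant_idx, weights=None):
    """One relevance-feedback iteration with feature weight updating.

    weights ~ 1 / std of each feature over the clips the user marked relevant,
    so features that are consistent among relevant results count more in the
    next weighted-distance ranking (illustrative rule, not the paper's).
    """
    if weights is None:
        weights = np.ones(db_vecs.shape[1])
    if len(relevant_idx) > 1:
        std = db_vecs[relevant_idx].std(axis=0)
        weights = 1.0 / (std + 1e-6)
        weights /= weights.sum()
    dists = np.sqrt(((db_vecs - query_vec) ** 2 * weights).sum(axis=1))
    return np.argsort(dists), weights

rng = np.random.default_rng(5)
database = rng.normal(size=(200, 64))        # stand-in action descriptors (global + local)
query = database[0] + rng.normal(0, 0.1, 64)

ranking, w = rf_rerank(query, database, relevant_idx=[])                       # initial retrieval
ranking, w = rf_rerank(query, database, relevant_idx=ranking[:5], weights=w)   # after feedback
print(ranking[:10])
```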
Procedia PDF Downloads 475
13226 An Approach to Improve Pre University Students' Responsible Environmental Behaviour through Science Writing Heuristic in Malaysia
Authors: Sheila Shamuganathan, Mageswary Karpudewan
Abstract:
This study investigated the effectiveness of green chemistry integrated with the Science Writing Heuristic (SWH) in enhancing matriculation students' responsible environmental behaviour. For this purpose, 207 matriculation students were randomly assigned into experimental (N=118) and control (N=89) groups. For the experimental group, the chemistry concepts were taught using the instructional approach of green chemistry integrated with the Science Writing Heuristic (SWH), while for the control group the same content was taught using green chemistry. The data were analysed using ANCOVA, and the findings obtained from the quantitative analysis reveal significant changes in responsible environmental behaviour, F(1, 204) = 32.13 (ηp² = 0.14), in favour of the experimental group. The responses from the qualitative data obtained from an interview with the experimental group also further strengthen this result and indicate a significant improvement in responsible environmental behaviour. The outcome of the study suggests that using green chemistry integrated with the Science Writing Heuristic (SWH) could be an alternative approach to improve students' responsible behaviour towards the environment.
Keywords: science writing heuristic, green chemistry, pro environmental behaviour, laboratory
Procedia PDF Downloads 319
13225 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)
Authors: Gule Teri
Abstract:
The effective validation of analytical methods forms a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can occasionally fail to fully account for inherent variations within datasets, which may result in inconsistent outcomes. This deficiency in validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients, or impurities, introducing a risk to the reliability of the results and, subsequently, the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach distinctly measures variability with reference to tolerance or design margins, enhancing the precision and trustworthiness of the results. This method provides a systematic, statistically grounded validation technique that improves the truthfulness of results. It offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this innovative method, pharmaceutical manufacturers can substantially advance their validation processes, subsequently improving the overall quality and safety of their products. This paper delves deeper into the development, application, and advantages of this tolerance-based DoE approach and demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification. This paper also discusses the potential implications and future applications of this method in enhancing pharmaceutical manufacturing practices and outcomes.
Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing
Procedia PDF Downloads 81
13224 A Fuzzy Decision Making Approach for Supplier Selection in Healthcare Industry
Authors: Zeynep Sener, Mehtap Dursun
Abstract:
Supplier evaluation and selection is one of the most important components of an effective supply chain management system. Due to the expanding competition in healthcare, selecting the right medical device suppliers offers great potential for increasing quality while decreasing costs. This paper proposes a fuzzy decision making approach for medical supplier selection. A real-world medical device supplier selection problem is presented to illustrate the application of the proposed decision methodology.
Keywords: fuzzy decision making, fuzzy multiple objective programming, medical supply chain, supplier selection
Procedia PDF Downloads 454
13223 Sustainable Manufacturing Industries and Energy-Water Nexus Approach
Authors: Shahbaz Abbas, Lin Han Chiang Hsieh
Abstract:
Significant population growth and climate change issues have contributed to the depletion of natural resources and threaten their sustainability in the future. Manufacturing industries have a substantial impact on every country's economy, but the sustainability of industrial resources is challenging, and policymakers have been developing possible solutions to manage the sustainability of industrial resources such as raw materials, energy, water, and the industrial supply chain. In order to address these challenges, the nexus approach is one of the optimization and modelling techniques used in recent sustainable environmental research. The interactions between the nexus components acknowledge that all components are dependent upon each other and are interrelated; therefore, their sustainability is also associated with each other. In addition, the nexus concept does not only address resource sustainability; environmental sustainability can also be achieved through the nexus approach by utilizing industrial waste as a resource for industrial processes. Based on the energy-water nexus, this study has developed a resource-energy-water nexus for the sugar industry to understand the interactions between sugarcane, energy, and water towards a sustainable sugar industry. In particular, the focus of the research is the Taiwanese sugar industry; however, the same approach can be adapted worldwide to optimize the sustainability of sugar industries. It has been concluded that there are significant interactions between sugarcane, energy consumption, and water consumption in the sugar industry that can be used to manage the scarcity of resources in the future. The interactions between sugarcane and energy also deliver a mechanism to reuse the sugar industry's waste as a source of energy, consequently supporting industrial and environmental sustainability. The desired outcomes from the nexus can be achieved with modifications in the policy and regulations of the Taiwanese industrial sector.
Keywords: energy-water nexus, environmental sustainability, industrial sustainability, natural resource management
Procedia PDF Downloads 125
13222 Integrated Modeling Approach for Energy Planning and Climate Change Mitigation Assessment in the State of Florida
Authors: K. Thakkar, C. Ghenai
Abstract:
An integrated modeling approach was used in this study to (1) track energy consumption, production, and resource extraction, (2) track greenhouse gas emissions, and (3) analyze emissions for local and regional air pollution. The model was used in this study for short- and long-term energy and GHG emissions reduction analysis for the state of Florida. The integrated modeling methodology will help to evaluate the alternative energy scenarios and examine emissions-reduction strategies. The mitigation scenarios have been designed to describe the future energy strategies. They consist of various demand- and supply-side scenarios. One of the GHG mitigation scenarios is crafted by taking into account the available renewable resource potential for power generation in the state of Florida, in order to compare and analyze the GHG reduction measure against the 'Business As Usual' and 'Florida State Policy' scenarios. Two more 'integrated' scenarios ('Electrification' and 'Efficiency and Lifestyle') are crafted through the combination of various mitigation scenarios to assess the cumulative impact of reduction measures such as technological changes and energy efficiency and conservation.
Keywords: energy planning, climate change mitigation assessment, integrated modeling approach, energy alternatives, GHG emission reductions
Procedia PDF Downloads 443
13221 Thresholding Approach for Automatic Detection of Pseudomonas aeruginosa Biofilms from Fluorescence in situ Hybridization Images
Authors: Zonglin Yang, Tatsuya Akiyama, Kerry S. Williamson, Michael J. Franklin, Thiruvarangan Ramaraj
Abstract:
Pseudomonas aeruginosa is an opportunistic pathogen that forms surface-associated microbial communities (biofilms) on artificial implant devices and on human tissue. Biofilm infections are difficult to treat with antibiotics, in part because the bacteria in biofilms are physiologically heterogeneous. One measure of biological heterogeneity in a population of cells is to quantify the cellular concentrations of ribosomes, which can be probed with fluorescently labeled nucleic acids. The fluorescent signal intensity following fluorescence in situ hybridization (FISH) analysis correlates with the cellular level of ribosomes. The goals here are to provide computationally and statistically robust approaches to automatically quantify cellular heterogeneity in biofilms from a large library of epifluorescence microscopy FISH images. In this work, the initial steps toward these goals were taken by developing an automated biofilm detection approach for use with FISH images. The approach allows rapid identification of biofilm regions from FISH images that are counterstained with fluorescent dyes. This methodology provides advances over other computational methods by allowing subtraction of spurious signals and non-biological fluorescent substrata. This method will be a robust and user-friendly approach which will enable users to semi-automatically detect biofilm boundaries and extract intensity values from fluorescent images for quantitative analysis of biofilm heterogeneity.
Keywords: image informatics, Pseudomonas aeruginosa, biofilm, FISH, computer vision, data visualization
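A minimal sketch of a thresholding-based biofilm-region detector for a counterstained image, in the spirit of the described method but not its actual pipeline (the spurious-signal and substratum subtraction steps are omitted), could look like this; the images and parameters are synthetic stand-ins.

```python
import numpy as np
from skimage import filters, morphology, measure

def detect_biofilm_regions(counterstain_img, min_area=200):
    """Segment candidate biofilm regions from a counterstained FISH image.

    counterstain_img: 2D grey-value image of the counterstain channel.
    Returns a labelled mask; intensities of the FISH channel can then be
    extracted per region for heterogeneity analysis.
    """
    thresh = filters.threshold_otsu(counterstain_img)
    mask = counterstain_img > thresh
    mask = morphology.binary_closing(mask, morphology.disk(3))
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    return measure.label(mask)

def region_intensities(labels, fish_img):
    """Mean FISH signal (ribosome proxy) per detected biofilm region."""
    return {r.label: fish_img[labels == r.label].mean() for r in measure.regionprops(labels)}

# Synthetic stand-in images in place of real microscopy data
rng = np.random.default_rng(6)
counterstain = rng.random((256, 256))
counterstain[100:160, 80:200] += 1.0     # bright patch mimicking a biofilm region
fish = rng.random((256, 256))
labels = detect_biofilm_regions(counterstain)
print(region_intensities(labels, fish))
```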
Procedia PDF Downloads 135
13220 Sensitivity Analysis during the Optimization Process Using Genetic Algorithms
Authors: M. A. Rubio, A. Urquia
Abstract:
Genetic algorithms (GA) are applied to the solution of high-dimensional optimization problems. Additionally, sensitivity analysis (SA) is usually carried out to determine the effect of changes in the parameter values of the objective function on the optimal solutions. These two analyses (i.e., optimization and sensitivity analysis) are computationally intensive when applied to high-dimensional functions. The approach presented in this paper consists in performing the SA during the GA execution, by statistically analyzing the data obtained from running the GA. The advantage is that, in this case, the SA does not involve making additional evaluations of the objective function and, consequently, this proposed approach requires less computational effort than conducting optimization and SA in two consecutive steps.
Keywords: optimization, sensitivity, genetic algorithms, model calibration
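A minimal sketch of the idea — reusing the (parameter, fitness) pairs already evaluated by the GA to rank parameter sensitivities, with no extra objective evaluations — is shown below; the objective function, the GA settings and the use of a rank correlation as the sensitivity statistic are illustrative assumptions, not the paper's procedure.

```python
import numpy as np
from scipy.stats import spearmanr

def objective(x):
    # Illustrative objective: parameter 0 matters a lot, parameter 2 not at all
    return 10 * (x[0] - 0.5) ** 2 + (x[1] - 0.2) ** 2 + 0.0 * x[2]

def simple_ga(n_gen=50, pop_size=40, n_par=3, seed=7):
    """Tiny GA that also returns every evaluated individual and its fitness."""
    rng = np.random.default_rng(seed)
    pop = rng.random((pop_size, n_par))
    history_x, history_f = [], []
    for _ in range(n_gen):
        fit = np.array([objective(ind) for ind in pop])
        history_x.append(pop.copy())
        history_f.append(fit)
        order = np.argsort(fit)                        # minimization
        parents = pop[order[: pop_size // 2]]
        children = parents[rng.integers(0, len(parents), pop_size - len(parents))]
        children = np.clip(children + rng.normal(0, 0.05, children.shape), 0, 1)  # mutation
        pop = np.vstack([parents, children])
    return np.vstack(history_x), np.concatenate(history_f)

X, F = simple_ga()
# Sensitivity analysis "for free": rank correlation between each parameter and fitness
for j in range(X.shape[1]):
    rho, p = spearmanr(X[:, j], F)
    print(f"parameter {j}: Spearman rho = {rho:+.2f} (p = {p:.1e})")
```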
Procedia PDF Downloads 437
13219 A Metaheuristic for the Layout and Scheduling Problem in a Job Shop Environment
Authors: Hernández Eva Selene, Reyna Mary Carmen, Rivera Héctor, Barragán Irving
Abstract:
We propose an approach that jointly addresses the layout of a facility and the scheduling of a sequence of jobs. In real production, these two problems are interrelated; however, they are treated separately in the literature. Our approach is an extension of the job shop problem with transportation delay, where the location of the machines is selected among possible sites. The model minimizes the makespan using the shortest processing time rule with two algorithms: the first one considers all the permutations for the location of the machines, and the second uses only a heuristic to select some specific permutations, which reduces the computational time. Some instances are solved and compared with the literature.
Keywords: layout problem, job shop scheduling problem, concurrent scheduling and layout problem, metaheuristic
Procedia PDF Downloads 610
13218 Impact of Curvatures in the Dike Line on Wave Run-up and Wave Overtopping, ConDike-Project
Authors: Malte Schilling, Mahmoud M. Rabah, Sven Liebisch
Abstract:
Wave run-up and overtopping are the relevant parameters for dimensioning the crest height of dikes. Various experimental as well as numerical studies have investigated these parameters under different boundary conditions (e.g., wave conditions, structure type). Particularly for dike design in Europe, a common approach is formulated in which wave and structure properties are parameterized. However, this approach assumes equal run-up heights and overtopping discharges along the longitudinal axis, whereas convex dikes have a heterogeneous crest by definition. Hence, local differences in a convex dike line are expected to cause wave-structure interactions different from those at a straight dike. This study aims to assess both run-up and overtopping at convexly curved dikes. To cast light on the relevance of curved dikes for the design approach mentioned above, physical model tests were conducted in a 3D wave basin of the Ludwig-Franzius-Institute Hannover. A dike with a slope of 1:6 (height over length) was tested under both regular waves and TMA wave spectra. Significant wave heights ranged from 7 to 10 cm and peak periods from 1.06 to 1.79 s. Both run-up and overtopping were assessed behind the curved and straight sections of the dike, and both measurements were compared to a dike with a straight line. It was observed that convex curvatures in the longitudinal dike line cause a redirection of incident waves, leading to a concentration around the center point. Measurements prove that both run-up heights and overtopping rates are higher than on the straight dike. It can be concluded that deviations from a straight longitudinal dike line have an impact on design parameters and imply uncertainties within the design approach in force. Therefore, it is recommended to consider these influencing factors in such cases.
Keywords: convex dike, longitudinal curvature, overtopping, run-up
Procedia PDF Downloads 293
13217 Bioinformatics Approach to Identify Physicochemical and Structural Properties Associated with Successful Cell-free Protein Synthesis
Authors: Alexander A. Tokmakov
Abstract:
Cell-free protein synthesis is widely used to synthesize recombinant proteins. It allows genome-scale expression of various polypeptides under strictly controlled, uniform conditions. However, only a minor fraction of all proteins can be successfully expressed in the systems of protein synthesis that are currently used. The factors determining expression success are poorly understood. At present, a vast volume of data has accumulated in cell-free expression databases. This makes possible a comprehensive bioinformatics analysis and the identification of multiple features associated with successful cell-free expression. Here, we describe an approach aimed at the identification of multiple physicochemical and structural properties of amino acid sequences associated with protein solubility and aggregation, and highlight major correlations obtained using this approach. The developed method includes: categorical assessment of the protein expression data, calculation and prediction of multiple properties of the expressed amino acid sequences, correlation of the individual properties with the expression scores, and evaluation of the statistical significance of the observed correlations. Using this approach, we revealed a number of statistically significant correlations between calculated and predicted features of protein sequences and their amenability to cell-free expression. It was found that some of the features, such as protein pI, hydrophobicity, presence of signal sequences, etc., are mostly related to protein solubility, whereas others, such as protein length, number of disulfide bonds, content of secondary structure, etc., affect mainly the expression propensity. We also demonstrated that the amenability of polypeptide sequences to cell-free expression correlates with the presence of multiple sites of post-translational modification. The correlations revealed in this study provide a plethora of important insights into protein folding and the rationalization of protein production. The developed bioinformatics approach can be of practical use for predicting expression success and optimizing cell-free protein synthesis.
Keywords: bioinformatics analysis, cell-free protein synthesis, expression success, optimization, recombinant proteins
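A minimal sketch of the approach's core loop — computing a few physicochemical sequence features and correlating them with a binary expression-success score — is given below; the sequences, labels and the particular features are made up for illustration and do not reproduce the study's feature set or database.

```python
from Bio.SeqUtils.ProtParam import ProteinAnalysis
from scipy.stats import pointbiserialr

# Made-up examples: amino acid sequences with a binary cell-free expression outcome
records = [
    ("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", 1),
    ("MWWCCLLFWWPPCCWWLLFF", 0),
    ("MSTNPKPQRKTKRNTNRRPQDVKFPGG", 1),
    ("MCCCGGWWFFPPLLCCWWRR", 0),
    ("MAHHHHHHVGTGSNDDDDKSPLA", 1),
]

features, labels = [], []
for seq, expressed in records:
    pa = ProteinAnalysis(seq)
    features.append({
        "length": len(seq),
        "pI": pa.isoelectric_point(),               # related to solubility in the study
        "gravy": pa.gravy(),                        # hydrophobicity index
        "cys_fraction": seq.count("C") / len(seq),  # crude proxy for disulfide bonds
    })
    labels.append(expressed)

# Point-biserial correlation of each feature with the categorical expression score
for name in features[0]:
    values = [f[name] for f in features]
    r, p = pointbiserialr(labels, values)
    print(f"{name:>12}: r = {r:+.2f}, p = {p:.2f}")
```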
Procedia PDF Downloads 419