Search results for: Chameleon Hash Function (CHF)
4301 Local Spectrum Feature Extraction for Face Recognition
Authors: Muhammad Imran Ahmad, Ruzelita Ngadiran, Mohd Nazrin Md Isa, Nor Ashidi Mat Isa, Mohd Zaizu Ilyas, Raja Abdullah Raja Ahmad, Said Amirul Anwar Ab Hamid, Muzammil Jusoh
Abstract:
This paper presents two techniques, local feature extraction using the image spectrum and low-frequency spectrum modelling using a GMM, to capture the underlying statistical information and improve the performance of a face recognition system. Local spectrum features are extracted using overlapping sub-block windows that are mapped onto the face image. For each block, the spatial domain is transformed to the frequency domain using the DFT. Low-frequency coefficients are preserved by discarding high-frequency coefficients through a rectangular mask applied to the spectrum of the facial image. Low-frequency information is non-Gaussian in the feature space, and by using a combination of several Gaussian functions with different statistical properties, the best feature representation can be modelled using a probability density function. The recognition process is performed using the maximum likelihood value computed from pre-calculated GMM components. The method is tested using the FERET data sets and achieves a 92% recognition rate.
Keywords: local features modelling, face recognition system, Gaussian mixture models, FERET
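As a rough illustration of the pipeline described above (block-wise DFT features followed by Gaussian mixture modelling and maximum-likelihood identification), the following Python sketch uses NumPy and scikit-learn; the block size, step, mask size and number of mixture components are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def low_freq_block_features(image, block=16, step=8, keep=4):
    """Slide an overlapping window over the image, take the DFT of each
    block and keep only a small rectangle of low-frequency magnitudes."""
    feats = []
    h, w = image.shape
    for y in range(0, h - block + 1, step):
        for x in range(0, w - block + 1, step):
            spectrum = np.fft.fft2(image[y:y + block, x:x + block])
            low = np.abs(spectrum[:keep, :keep])  # rectangular low-pass mask
            feats.append(low.ravel())
    return np.vstack(feats)

def train_person_model(images, n_components=8):
    """Fit one GMM per enrolled person on the pooled block features."""
    X = np.vstack([low_freq_block_features(img) for img in images])
    return GaussianMixture(n_components=n_components, covariance_type='diag').fit(X)

def identify(probe, models):
    """Assign the probe image to the person whose GMM gives the highest likelihood."""
    X = low_freq_block_features(probe)
    scores = {pid: gmm.score(X) for pid, gmm in models.items()}
    return max(scores, key=scores.get)
```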
Procedia PDF Downloads 669
4300 Deepnic, A Method to Transform Each Variable into Image for Deep Learning
Authors: Nguyen J. M., Lucas G., Brunner M., Ruan S., Antonioli D.
Abstract:
Deep learning based on convolutional neural networks (CNNs) is a very powerful technique for classifying information from an image. We propose a new method, DeepNic, to transform each variable of a tabular dataset into an image where each pixel represents a set of conditions that allow the variable to make an error-free prediction. The contrast of each pixel is proportional to its prediction performance, and the color of each pixel corresponds to a sub-family of NICs. NICs are probabilities that depend on the number of inputs to each neuron and the range of coefficients of the inputs. Each variable can therefore be expressed as a function of a matrix of 2 vectors corresponding to an image whose pixels express predictive capabilities. Our objective is to transform each variable of tabular data into an image that can be analysed by CNNs, unlike other methods which use all the variables to construct an image. We analyse the NIC information of each variable and express it as a function of the number of neurons and the range of coefficients used. The predictive value and the category of the NIC are expressed by the contrast and the color of the pixel. We have developed a pipeline to implement this technology and have successfully applied it to genomic expressions on an Affymetrix chip.
Keywords: tabular data, deep learning, perfect trees, NICs
Procedia PDF Downloads 91
4299 Generalized Extreme Value Regression with Binary Dependent Variable: An Application for Predicting Meteorological Drought Probabilities
Authors: Retius Chifurira
Abstract:
The logistic regression model is the most widely used regression model to predict meteorological drought probabilities. When the dependent variable is extreme, the logistic model fails to adequately capture drought probabilities. In order to adequately predict drought probabilities, we use the generalized linear model (GLM) with the quantile function of the generalized extreme value distribution (GEVD) as the link function. Maximum likelihood estimation is used to estimate the parameters of the generalized extreme value (GEV) regression model. We compare the performance of the logistic and the GEV regression models in predicting drought probabilities for Zimbabwe. The performance of the regression models is assessed using goodness-of-fit measures, namely the relative root mean square error (RRMSE) and the relative mean absolute error (RMAE). Results show that the GEV regression model performs better than the logistic model, thereby providing a good alternative candidate for predicting drought probabilities. This paper provides the first application of a GLM derived from extreme value theory to predict drought probabilities for a drought-prone country such as Zimbabwe.
Keywords: generalized extreme value distribution, general linear model, mean annual rainfall, meteorological drought probabilities
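For context, a minimal sketch of a binary GLM with a GEV-type inverse link, fitted by maximum likelihood with SciPy, is given below; the specific parameterisation (zero location, unit scale, shape parameter estimated jointly with the regression coefficients) and the simulated data are assumptions made for illustration and are not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def gev_cdf(eta, xi):
    """GEV CDF with location 0 and scale 1, used here as the inverse link."""
    z = 1.0 + xi * eta
    z = np.clip(z, 1e-12, None)          # support restriction 1 + xi*eta > 0
    return np.exp(-z ** (-1.0 / xi))

def neg_log_lik(params, X, y):
    beta, xi = params[:-1], params[-1]
    p = np.clip(gev_cdf(X @ beta, xi), 1e-12, 1 - 1e-12)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def fit_gev_regression(X, y):
    """Fit P(y=1|x) = F_GEV(x'beta) by maximum likelihood."""
    x0 = np.zeros(X.shape[1] + 1)
    x0[-1] = 0.1                          # starting value for the shape xi
    res = minimize(neg_log_lik, x0, args=(X, y), method='Nelder-Mead')
    return res.x

# Example with a simulated covariate (hypothetical data, not the Zimbabwe rainfall series)
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = (rng.random(200) < gev_cdf(X @ np.array([-0.5, 1.0]), 0.2)).astype(int)
print(fit_gev_regression(X, y))
```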
Procedia PDF Downloads 201
4298 Cloning and Expression of Azurin: A Protein Having Antitumor and Cell Penetrating Ability
Authors: Mohsina Akhter
Abstract:
Cancer has become a widespread disease around the globe and takes many lives every year. Different treatments are being practiced, but all have potential side effects and somewhat limited specificity towards target sites. Pseudomonas aeruginosa is known to secrete a protein, azurin, with a special anti-cancer function. It has a unique cell-penetrating peptide comprising 18 amino acids that has the ability to enter cancer cells specifically. The reported function of azurin is to stabilize p53 inside tumor cells and induce apoptosis through Bax-mediated cytochrome c release from mitochondria. At laboratory scale, we have made recombinant azurin by cloning the rpTZ57R/T-azu vector into E. coli strain DH-5α and subcloning the rpET28-azu vector into E. coli BL21-CodonPlus (DE3). High expression was ensured with IPTG induction at different concentrations, and the expression level was then optimized at a 1 mM concentration of IPTG for 5 hours. Purification was done using Ni²⁺ affinity chromatography. We conclude that azurin could be a remarkable improvement in cancer therapeutics if it is produced on a large scale. Azurin does not enter normal cells, so it should prove a safe and secure treatment for patients and spare them hazardous anomalies.
Keywords: azurin, pseudomonas aeruginosa, cancer, therapeutics
Procedia PDF Downloads 314
4297 Hybrid Gravity Gradient Inversion-Ant Colony Optimization Algorithm for Motion Planning of Mobile Robots
Authors: Meng Wu
Abstract:
Motion planning is a common task required to be fulfilled by robots. A strategy combining Ant Colony Optimization (ACO) and a gravity gradient inversion algorithm is proposed for motion planning of mobile robots. In this paper, in order to realize an optimal motion planning strategy, the cost function in ACO is designed based on the gravity gradient inversion algorithm. The obstacles around the mobile robot cause gravity gradient anomalies; a gradiometer is installed on the mobile robot to detect these anomalies. After obtaining the anomalies, the gravity gradient inversion algorithm is employed to calculate the relative distance and orientation between the mobile robot and the obstacles. The relative distance and orientation deduced from the gravity gradient inversion algorithm are employed in the cost function of the ACO algorithm to realize motion planning. The proposed strategy is validated by simulation and experiment results.
Keywords: motion planning, gravity gradient inversion algorithm, ant colony optimization
Procedia PDF Downloads 138
4296 Optimizing the Public Policy Information System under the Environment of E-Government
Authors: Qian Zaijian
Abstract:
E-government is one of the hot issues in current academic research on public policy and management. As the organic integration of information and communication technology (ICT) and public administration, e-government is one of the most important areas in the contemporary information society. The policy information system is a basic subsystem of the public policy system; its operation affects the overall effect of the policy process and can even exert a direct impact on the operation of a public policy and its success or failure. The basic principle of its operation is information collection, processing, analysis and release for a specific purpose. The function of e-government for the public policy information system lies in promoting public access to policy information resources, enabling information transmission through e-participation and e-consultation in the process of policy analysis and information processing, and providing electronic services for stored policy information, so as to promote the optimization of policy information systems. However, due to many factors, the capacity of e-government to promote policy information system optimization has practical limits. In building e-government in China, we should take such paths as adhering to the principle of freedom of information, eliminating the information divide (gap), expanding e-consultation, and breaking down information silos, so as to promote the optimization of public policy information systems.
Keywords: China, e-consultation, e-democracy, e-government, e-participation, ICTs, public policy information systems
Procedia PDF Downloads 867
4295 Multiscale Syntheses of Knee Collateral Ligament Stresses: Aggregate Mechanics as a Function of Molecular Properties
Authors: Raouf Mbarki, Fadi Al Khatib, Malek Adouni
Abstract:
Knee collateral ligaments play a significant role in restraining excessive frontal motion (varus/valgus rotations). In this investigation, a multiscale framework was developed based on the structural hierarchies of the collateral ligaments, starting from the bottom (the tropocollagen molecule) up to the fibre-reinforced structure. Experimental data from failure tensile tests were considered as the principal driver of the developed model. This model was calibrated statistically using Bayesian calibration due to the high number of unknown parameters. The model was then scaled up to fit the real structure of the collateral ligaments and simulated under realistic boundary conditions. Predictions have been successful in describing the observed transient response of the collateral ligaments during tensile tests under pre- and post-damage loading conditions. Collateral ligament maximum stresses and strengths were observed near the femoral insertions, a result that is in good agreement with experimental investigations. Also, for the first time, damage initiation and propagation were documented with this model as a function of the cross-link density between tropocollagen molecules.
Keywords: multiscale model, tropocollagen, fibrils, ligaments
Procedia PDF Downloads 160
4294 Transfer Function Model-Based Predictive Control for Nuclear Core Power Control in PUSPATI TRIGA Reactor
Authors: Mohd Sabri Minhat, Nurul Adilla Mohd Subha
Abstract:
The 1 MWth PUSPATI TRIGA Reactor (RTP) at the Malaysia Nuclear Agency has been operating for more than 35 years. The existing core power control uses a conventional controller known as the Feedback Control Algorithm (FCA). It is technically challenging to keep the core power output stable and operating within acceptable error bands to meet the safety demands of the RTP. Currently, the system's power tracking performance could be considered unsatisfactory, and there is still significant room for improvement. Hence, a new core power control design is very important to improve the current tracking and regulating performance by controlling the movement of control rods in a way that suits the demands of highly sensitive nuclear reactor power control. In this paper, the proposed Model Predictive Control (MPC) law was applied to control the core power. The model for core power control was based on mathematical models of the reactor core, MPC, and a control rod selection algorithm. The mathematical models of the reactor core were based on the point kinetics model, thermal hydraulic models, and reactivity models. The proposed MPC was presented in a transfer function model of the reactor core according to perturbation theory. The transfer function model-based predictive control (TFMPC) was developed to design the core power control with predictions based on a T-filter towards the real-time implementation of MPC on hardware. This paper introduces the sensitivity functions for the TFMPC feedback loop to reduce the impact on the input actuation signal and demonstrates the behaviour of TFMPC in terms of disturbance and noise rejection. Comparisons of both tracking and regulating performance between the conventional controller and TFMPC were made using MATLAB and analysed. In conclusion, the proposed TFMPC has satisfactory performance in tracking and regulating core power for controlling a nuclear reactor with high reliability and safety.
Keywords: core power control, model predictive control, PUSPATI TRIGA reactor, TFMPC
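Since the core model rests on point kinetics, a minimal sketch of the one-delayed-group point kinetics equations, integrated with SciPy, is shown below; the kinetic parameters (β, Λ, λ) and the reactivity step are generic illustrative values, not the RTP's actual data.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic one-delayed-group point kinetics parameters (illustrative values only)
BETA, LAMBDA_GEN, LAM = 0.0065, 1e-4, 0.08   # delayed fraction, generation time (s), decay const (1/s)

def point_kinetics(t, y, rho):
    n, c = y                                  # neutron density, precursor concentration
    dn = (rho(t) - BETA) / LAMBDA_GEN * n + LAM * c
    dc = BETA / LAMBDA_GEN * n - LAM * c
    return [dn, dc]

def reactivity(t):
    """Small positive reactivity step inserted at t = 1 s (e.g. a rod withdrawal)."""
    return 0.001 if t >= 1.0 else 0.0

y0 = [1.0, BETA / (LAMBDA_GEN * LAM)]         # critical steady-state initial condition
sol = solve_ivp(point_kinetics, (0.0, 10.0), y0, args=(reactivity,), max_step=0.01)
print("relative power at t = 10 s:", sol.y[0, -1])
```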
Procedia PDF Downloads 245
4293 Similarity Solutions of Nonlinear Stretched Biomagnetic Flow and Heat Transfer with Signum Function and Temperature Power Law Geometries
Authors: M. G. Murtaza, E. E. Tzirtzilakis, M. Ferdows
Abstract:
Biomagnetic fluid dynamics is an interdisciplinary field comprising engineering, medicine, and biology. Biofluid dynamics is directed towards finding and developing solutions to some human-body-related diseases and disorders. This article describes the flow and heat transfer of a two-dimensional, steady, laminar, viscous and incompressible biomagnetic fluid over a non-linear stretching sheet in the presence of a magnetic dipole. Our model is consistent with blood flow, namely biomagnetic fluid dynamics (BFD), and is based on the principles of ferrohydrodynamics (FHD). The temperature at the stretching surface is assumed to follow a power law variation, and the stretching velocity is assumed to have a nonlinear form with a signum (sign) function. The governing boundary layer equations with boundary conditions are reduced to coupled higher-order equations using the usual transformations. Numerical solutions for the governing momentum and energy equations are obtained by efficient numerical techniques based on the common finite difference method with central differencing, a tridiagonal matrix manipulation, and an iterative procedure. Computations are performed for a wide range of the governing parameters, such as the magnetic field parameter, the power law exponent temperature parameter, and other involved parameters, and the effect of these parameters on the velocity and temperature fields is presented. It is observed that for increasing values of the magnetic parameter, the velocity distribution decreases while the temperature distribution increases. Besides, the finite difference results for the skin-friction coefficient and the rate of heat transfer are discussed. This study has an important bearing on applications requiring high targeting efficiency, for which a high magnetic field is required in the targeted body compartment.
Keywords: biomagnetic fluid, FHD, MHD, nonlinear stretching sheet
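The abstract's solution strategy rests on central differencing with a tridiagonal solve; as a generic illustration (not the paper's actual coupled momentum-energy system), the sketch below solves a simple two-point boundary value problem u'' = g(x) with the Thomas algorithm.

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b, super-diagonal c, RHS d."""
    n = len(d)
    cp, dp = np.zeros(n), np.zeros(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Central-difference discretisation of u'' = g(x), u(0) = 0, u(1) = 0
N = 101
x = np.linspace(0.0, 1.0, N)
h = x[1] - x[0]
g = np.sin(np.pi * x)                       # illustrative right-hand side
a = np.ones(N - 2); b = -2.0 * np.ones(N - 2); c = np.ones(N - 2)
d = h ** 2 * g[1:-1]
u = np.concatenate(([0.0], thomas(a, b, c, d), [0.0]))
# exact solution is -sin(pi x)/pi^2; compare at the midpoint
print(u[N // 2], -1.0 / np.pi ** 2)
```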
Procedia PDF Downloads 162
4292 Maintenance Performance Measurement Derived Optimization: A Case Study
Authors: James M. Wakiru, Liliane Pintelon, Peter Muchiri, Stanley Mburu
Abstract:
Maintenance performance measurement (MPM) represents an integrated aspect that considers both operational and maintenance-related aspects while evaluating the effectiveness and efficiency of maintenance to ensure assets are working as they should. Three salient issues need to be addressed for an asset-intensive organization to employ an MPM-based framework to optimize maintenance. Firstly, the organization should establish the important performance metric(s), in this case the maintenance objective(s), on which it will focus. The second issue entails aligning the maintenance objective(s) with maintenance optimization. This is achieved by deriving maintenance performance indicators that subsequently form an objective function for the optimization program. Lastly, the objective function is employed in an optimization program to derive maintenance decision support. In this study, we develop a framework that initially identifies the crucial maintenance performance measures and employs them to derive maintenance decision support. The proposed framework is demonstrated in a case study of a geothermal drilling rig, where the objective function is evaluated utilizing a simulation-based model whose parameters are derived from empirical maintenance data. Availability, reliability and maintenance inventory are identified as essential objectives requiring further attention. A simulation model is developed mimicking drilling rig operations and maintenance, where the sub-systems are modelled as undergoing imperfect maintenance, corrective (CM) and preventive (PM), with the total cost as the primary performance measurement. Moreover, three maintenance spare inventory policies are considered: classical (retaining stocks for a contractual period), vendor-managed inventory with consignment stock, and a periodic-monitoring order-up-to-stock (s, S) policy. Optimization results infer that the adoption of the (s, S) inventory policy, an increased PM interval and reduced reliance on CM actions offer improved availability and total cost reduction.
Keywords: maintenance, vendor-managed, decision support, performance, optimization
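To make the simulation idea concrete, here is a highly simplified Monte Carlo sketch of a single sub-system whose corrective-maintenance demands draw spares managed under a periodic-review (s, S) policy; the failure rate, review period, lead time and cost figures are invented for illustration and do not come from the case study.

```python
import random

def simulate_sS(s=2, S=6, horizon=365.0, mtbf=20.0, review=7.0,
                lead_time=3.0, c_holding=1.0, c_stockout=200.0, c_order=50.0):
    """Daily-step simulation of spares consumed by random failures under an (s, S) policy."""
    random.seed(42)
    stock, cost, downtime = S, 0.0, 0.0
    next_failure = random.expovariate(1.0 / mtbf)
    pending = []                                    # outstanding orders: (arrival_day, qty)
    day = 0.0
    while day < horizon:
        day += 1.0
        # receive orders that have arrived
        stock += sum(q for (t, q) in pending if t <= day)
        pending = [(t, q) for (t, q) in pending if t > day]
        # failures occurring today each consume one spare (or cause a stockout)
        while next_failure <= day:
            if stock > 0:
                stock -= 1
            else:
                cost += c_stockout                  # penalty: rig waits for a spare
                downtime += 1.0
            next_failure += random.expovariate(1.0 / mtbf)
        # periodic review: order up to S whenever stock has fallen to s or below
        if day % review < 1.0 and stock <= s:
            pending.append((day + lead_time, S - stock))
            cost += c_order
        cost += c_holding * stock                   # daily holding cost
    return cost, downtime

print(simulate_sS())
```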
Procedia PDF Downloads 125
4291 On the Grid Technique by Approximating the Derivatives of the Solution of the Dirichlet Problems for (1+1) Dimensional Linear Schrodinger Equation
Authors: Lawrence A. Farinola
Abstract:
Four-point implicit schemes for the approximation of the first and pure second order derivatives of the solution of the Dirichlet problem for the one-dimensional Schrödinger equation with respect to the time variable t were constructed. Also, special four-point implicit difference boundary value problems are proposed for the first and pure second derivatives of the solution with respect to the spatial variable x. The grid method is also applied to the mixed second derivative of the solution of the linear time-dependent Schrödinger equation. It is assumed that the initial function belongs to the Hölder space C⁸⁺ᵃ, 0 < α < 1, the Schrödinger wave function given in the Schrödinger equation is from the Hölder space Cₓ,ₜ⁶⁺ᵃ, ³⁺ᵃ/², the boundary functions are from C⁴⁺ᵃ, and between the initial and the boundary functions the conjugation conditions of orders q = 0, 1, 2, 3, 4 are satisfied. It is proven that the solutions of the proposed difference schemes converge uniformly on the grids with order O(h² + k), where h is the step size in x and k is the step size in time. Numerical experiments are presented to support the analysis.
Keywords: approximation of derivatives, finite difference method, Schrödinger equation, uniform error
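For readers unfamiliar with implicit grid schemes for the time-dependent Schrödinger equation, the sketch below solves i ψ_t = −ψ_xx on a uniform grid with the standard Crank-Nicolson scheme; it is a generic illustration of an implicit difference scheme, not the four-point scheme constructed in the paper.

```python
import numpy as np

# Grid for i * psi_t = -psi_xx on [0, 1] with homogeneous Dirichlet boundaries
N, M = 200, 400                       # space intervals, time steps
h, k = 1.0 / N, 1e-4                  # step sizes in x and t
x = np.linspace(0.0, 1.0, N + 1)
psi = np.exp(-200.0 * (x - 0.5) ** 2).astype(complex)   # initial wave packet
psi[0] = psi[-1] = 0.0

# Crank-Nicolson: (I - r*D) psi^{n+1} = (I + r*D) psi^n, with r = i*k/(2h^2)
# and D the standard second-difference matrix on interior points.
r = 1j * k / (2.0 * h ** 2)
D = (np.diag(np.full(N - 1, -2.0)) + np.diag(np.ones(N - 2), 1)
     + np.diag(np.ones(N - 2), -1))
A = np.eye(N - 1) - r * D
B = np.eye(N - 1) + r * D

for _ in range(M):
    psi[1:-1] = np.linalg.solve(A, B @ psi[1:-1])

print("norm preserved:", np.sqrt(h * np.sum(np.abs(psi) ** 2)))
```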
Procedia PDF Downloads 122
4290 Evaluation of the Impact of Neuropathic Pain on the Quality of Life of Patients
Authors: A. Ibovi Mouondayi, S. Zaher, R. Assadi, K. Erraoui, S. Sboul, J. Daoudim, S. Bousselham, K. Nassar, S. Janani
Abstract:
Introduction: Neuropathic pain (NP) is a chronic pain; it can be observed in a large number of clinical situations. This pain results from a lesion of the peripheral or central nervous system. It is a frequent reason for consultation in rheumatology. Being chronic, this pain can become disabling for the patient, thereby altering their quality of life. Objective: The objective of this study was to evaluate the impact of neuropathic pain on the quality of life of patients followed up for chronic neuropathic pain. Material and Method: This is a monocentric, cross-sectional, descriptive, retrospective study conducted in our department over a period of 19 months, from October 2020 to April 2022. Missing parameters were collected through phone calls to the patients concerned. The diagnostic tool adopted was the DN4 questionnaire in its dialectal Arabic version. The impact of NP was assessed by the visual analog scale (VAS) for pain, sleep, and function. The impact of NP on mood was assessed by the hospital anxiety and depression scale (HAD) score in the validated Arabic version. The exclusion criteria were patients followed up for depression and other psychiatric pathologies. Results: Data from a total of 1528 patients were collected; the average age of the patients was 57 years (standard deviation: 13 years), with extremes ranging from 17 years to 94 years; 91% were women and 9% men, with a man/woman sex ratio equal to 0.10. 67% of our patients were married, and 63% were housewives. 43% of patients were followed up for degenerative pathology. The NP was cervical radiculopathy in 26%, lumbosacral radiculopathy in 51%, and carpal tunnel syndrome in 20%. 23% of our patients had poor sleep quality, and 54% had average sleep quality. The pain was very intense in 5% of patients; 33% had severe pain, and 58% had moderate pain. Function was limited in 55% of patients. The average HAD scores for anxiety and depression were 4.39 (standard deviation: 2.77) and 3.21 (standard deviation: 2.89), respectively. Conclusion: Our data clearly illustrate that neuropathic pain has a negative impact on the quality of sleep and function, as well as on the mood of patients, thus influencing their quality of life.
Keywords: neuropathic pain, sleep, quality of life, chronic pain
Procedia PDF Downloads 134
4289 Systematic Identification and Quantification of Substrate Specificity Determinants in Human Protein Kinases
Authors: Manuel A. Alonso-Tarajano, Roberto Mosca, Patrick Aloy
Abstract:
Protein kinases participate in a myriad of cellular processes of major biomedical interest. The in vivo substrate specificity of these enzymes is determined by several factors and, despite several years of research on the topic, is still far from being totally understood. In the present work, we have quantified the contributions to kinase substrate specificity of i) the phosphorylation sites and their surrounding residues in the sequence and of ii) the association of kinases with adaptor or scaffold proteins. We have used position-specific scoring matrices (PSSMs) to represent the stretches of sequence phosphorylated by 93 families of kinases. We have found negative correlations between the number of sequences from which a PSSM is generated and the statistical significance and performance of that PSSM. Using a subset of 22 statistically significant PSSMs, we have identified specificity determinant residues (SDRs) for 86% of the corresponding kinase families. Our results suggest that different SDRs can function as positive or negative elements of substrate recognition by the different families of kinases. Additionally, we have found that human proteins with a known function as adaptors or scaffolds (kAS) tend to interact with a significantly large fraction of the substrates of the kinases to which they associate. Based on this characteristic, we have identified a set of 279 potential adaptors/scaffolds (pAS) for human kinases, which is enriched in Pfam domains and functional terms tightly related to the proposed function. Moreover, our results show that for 74.6% of the kinase-pAS associations found, the pAS colocalize with the substrates of the kinases they are associated with. Finally, we have found evidence suggesting that the association of kinases with adaptors and scaffolds may contribute significantly to diminishing the in vivo substrate cross-specificity of protein kinases. In general, our results indicate the relevance of several SDRs for both the positive and negative selection of phosphorylation sites by kinase families and also suggest that the association of kinases with pAS proteins may be an important factor for the localization of the enzymes with their set of substrates.
Keywords: kinase, phosphorylation, substrate specificity, adaptors, scaffolds, cellular colocalization
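As a minimal illustration of how a PSSM can be built from phosphosite-centred sequence windows and used to score candidate sites, consider the Python sketch below; the pseudocount, window length and uniform background frequencies are simplifying assumptions chosen for clarity, not the values used in the study.

```python
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"
IDX = {a: i for i, a in enumerate(AA)}

def build_pssm(windows, pseudocount=1.0):
    """Build a log-odds PSSM from fixed-length sequence windows centred on the phosphosite."""
    L = len(windows[0])
    counts = np.full((L, len(AA)), pseudocount)
    for w in windows:
        for pos, aa in enumerate(w):
            counts[pos, IDX[aa]] += 1.0
    freqs = counts / counts.sum(axis=1, keepdims=True)
    background = 1.0 / len(AA)                     # uniform background (assumption)
    return np.log2(freqs / background)

def score(pssm, window):
    """Sum of per-position log-odds scores for a candidate window."""
    return sum(pssm[pos, IDX[aa]] for pos, aa in enumerate(window))

# Toy example: windows phosphorylated by a hypothetical kinase family
train = ["RRASVAG", "KRASLAG", "RRPSVAG", "RKASVLG"]
pssm = build_pssm(train)
print(score(pssm, "RRASVAG"), score(pssm, "GGGAGGG"))
```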
Procedia PDF Downloads 344
4288 The Digital Living Archive and the Construction of a Participatory Cultural Memory in the DARE-UIA Project: Digital Environment for Collaborative Alliances to Regenerate Urban Ecosystems in Middle-Sized Cities
Authors: Giulia Cardoni, Francesca Fabbrii
Abstract:
Living archives perform a function of social memory sharing, which contributes to building social bonds, communities, and identities. This potential lies in the ability of living archives to combine an archival function, which allows the conservation and transmission of memory, with an artistic, performative and creative function linked to the present. As part of the DARE-UIA (Digital environment for collaborative alliances to regenerate urban ecosystems in middle-sized cities) project, the creation of a living digital archive made it possible to build a narrative that consolidates the cultural memory of the Darsena district of the city of Ravenna. The aim of the project is to stimulate the urban regeneration of a suburban area of a city, enhancing its cultural memory and identity heritage through digital heritage tools. The methodology involves various digital storytelling actions necessary for the overall narrative, using georeferencing systems (GIS), storymaps and 3D reconstructions for a transversal narration of historical content, such as personal and institutional historical photos, and to enhance the industrial archaeology heritage of the neighborhood. The aim is the creation of an interactive narrative replicable in contexts similar to the Darsena district in Ravenna. The living archive, in which all the digital contents are inserted, manifests itself to the outside in the form of a museum spread throughout the neighborhood, making the contents usable on smartphones via QR codes and totems installed on site and creating thematic itineraries spread around the neighborhood. The construction of an interactive and engaging digital narrative has made it possible to enhance the material and immaterial heritage of the neighborhood by recreating the community that has historically always distinguished it.
Keywords: digital living archive, digital storytelling, GIS, 3D, open-air museum, urban regeneration, cultural memory
Procedia PDF Downloads 107
4287 Composing Method of Decision-Making Function for Construction Management Using Active 4D/5D/6D Objects
Authors: Hyeon-Seung Kim, Sang-Mi Park, Sun-Ju Han, Leen-Seok Kang
Abstract:
As BIM (Building Information Modeling) application continually expands, the visual simulation techniques used for facility design and construction process information are becoming increasingly advanced and diverse. For building structures, BIM application is design-oriented, utilizing 3D objects for conflict management, whereas for civil engineering structures, the usability of nD object-oriented construction stage simulation is important in construction management. Simulations of 5D and 6D objects, in which cost and resources are linked along with process simulation in 4D objects, are commonly used, but they do not provide a decision-making function for process management problems that occur on site because they mostly focus on the visual representation of the current status of process information. In this study, an nD CAD system is constructed that facilitates an optimized schedule simulation that minimizes process conflict, a construction duration reduction simulation according to execution progress status, an optimized process plan simulation according to project cost change by year, and an optimized resource simulation for field resource mobilization capability. Through this system, the usability of conventional simple simulation objects is expanded to that of active simulation objects with which decision-making is possible. Furthermore, to close the gap between field process situations and planned 4D process objects, a technique is developed to facilitate a comparative simulation through the coordinated synchronization of an actual video object acquired by an on-site web camera and a VR concept 4D object. This synchronization and simulation technique can also be applied to smartphone video objects captured in the field in order to increase the usability of the 4D object. Because yearly project costs change frequently for civil engineering construction, an annual process plan should be recomposed appropriately according to project cost decreases/increases compared with the plan. In the 5D CAD system provided in this study, an active 5D object utilization concept is introduced to perform a simulation in an optimized process planning state by finding a process optimized for the changed project cost without changing the construction duration, through a technique such as a genetic algorithm. Furthermore, in resource management, an active 6D object utilization function is introduced that can analyze and simulate an optimized process plan within the possible scope of moving resources, by considering those resources that can be moved under a given field condition, instead of using a simple resource change simulation by schedule. The introduction of an active BIM function is expected to increase the field utilization of conventional nD objects.
Keywords: 4D, 5D, 6D, active BIM
Procedia PDF Downloads 278
4286 Bright, Dark N-Soliton Solution of Fokas-Lenells Equation Using Hirota Bilinearization Method
Authors: Sagardeep Talukdar, Riki Dutta, Gautam Kumar Saharia, Sudipta Nandy
Abstract:
In non-linear optics, the Fokas-Lenells equation (FLE) is a well-known integrable equation that describes how ultrashort pulses move across an optical fiber. Like any other integrable equation, it admits localized wave solutions. We apply the Hirota bilinearization method to obtain the soliton solution of the FLE. The proposed bilinearization makes use of an auxiliary function. We apply the method to the FLE with a vanishing boundary condition, that is, to obtain a bright soliton solution. We have obtained bright 1-soliton and 2-soliton solutions and propose a scheme for obtaining an N-soliton solution. We have used an additional parameter that is responsible for the shift in the position of the soliton. Further analysis of the 2-soliton solution is carried out by asymptotic analysis. Under the non-vanishing boundary condition, we obtain the dark 1-soliton solution. We find that the suggested bilinearization approach, which makes use of the auxiliary function, greatly simplifies the process while still producing the desired outcome. We think that the current analysis will be helpful in understanding how the FLE is used in nonlinear optics and other areas of physics.
Keywords: asymptotic analysis, Fokas-Lenells equation, Hirota bilinearization method, soliton
Procedia PDF Downloads 113
4285 Cultural Transformation in Interior Design in Commercial Space in India
Authors: Siddhi Pedamkar, Reenu Singh
Abstract:
This report examines how a culture transforms from one era to another in commercial space. This transformation is observed in commercial as well as residential spaces. The spaces have specific color concepts, surface detailing, furniture, and function-specific layouts. But the cultural impact is very rarely seen in commercial spaces, mostly because the interior is driven by function to a large extent. Information was collected from books and research papers. A quantitative survey was conducted to understand people's perceptions of the impact of culture on design entities and how culture dictates the different types of space and their character. The survey also highlights the impact of types of interior lighting, colour schemes, and furniture types on the interior environment. The questionnaire survey helped in framing design parameters for contemporary interior design. These design parameters are used to propose design options for new-age furniture that can be used in co-working spaces. For new and contemporary working spaces, new-age furniture designs and interior elements such as visual partitions, semi-visual partitions, lighting, and layout can be transformed by cultural changes in the working styles of people and organizations.
Keywords: commercial space, culture, environment, furniture, interior
Procedia PDF Downloads 118
4284 Revalidation and Harmonization of Existing IFCC Standardized Hepatic, Cardiac, and Thyroid Function Tests by Precision Optimization and External Quality Assurance Programs
Authors: Junaid Mahmood Alam
Abstract:
Revalidating and harmonizing clinical chemistry analytical principles and optimizing methods through quality control programs and assessments is the preeminent means to attain optimal outcomes within clinical laboratory services. The present study reports the revalidation of our existing IFCC-standardized analytical methods, particularly hepatic and thyroid function tests, by optimization of precision analyses and processing through external and internal quality assessments and regression determination. Parametric components of hepatic (bilirubin, ALT, γGT, ALP), cardiac (LDH, AST, Trop I) and thyroid/pituitary (T3, T4, TSH, FT3, FT4) function tests were used to validate analytical techniques on automated chemistry and immunological analyzers, namely the Hitachi 912, Cobas 6000 e601, Cobas c501 and Cobas e411, with UV kinetic and colorimetric dry chemistry principles and electro-chemiluminescence immunoassay (ECLi) techniques. The process of validation and revalidation was completed by evaluating and assessing the precision-analyzed Preci-control data of the various instruments, plotted against each other with regression analysis (R²). Results showed that revalidation and optimization of the respective parameters, which were accredited through CAP, CLSI and NEQAPP assessments, depicted 99.0% to 99.8% optimization, in addition to the methodology and instruments used for the analyses. The regression R² for BilT was 0.996, whereas ALT, ALP, γGT, LDH, AST, Trop I, T3, T4, TSH, FT3, and FT4 exhibited R² values of 0.998, 0.997, 0.993, 0.967, 0.970, 0.980, 0.976, 0.996, 0.997, 0.997, and 0.990, respectively. This confirmed marked harmonization of the analytical methods and instrumentation, thus revalidating optimized precision standardization as per IFCC-recommended guidelines. It is concluded that the practice of revalidating and harmonizing existing or new services should be followed by all clinical laboratories, especially those associated with tertiary care hospitals. This will ensure the delivery of standardized, proficiency-tested, optimized services for prompt and better patient care, which will guarantee maximum patient confidence.
Keywords: revalidation, standardized, IFCC, CAP, harmonized
Procedia PDF Downloads 269
4283 A Partially Accelerated Life Test Planning with Competing Risks and Linear Degradation Path under Tampered Failure Rate Model
Authors: Fariba Azizi, Firoozeh Haghighi, Viliam Makis
Abstract:
In this paper, we propose a method to model the relationship between failure time and degradation for a simple step-stress test where the underlying degradation path is linear and different causes of failure are possible. It is assumed that the intensity function depends only on the degradation value. No assumptions are made about the distribution of the failure times. A simple step-stress test is used to shorten the failure time of products, and a tampered failure rate (TFR) model is proposed to describe the effect of the changing stress on the intensities. We assume that some of the products that fail during the test have a cause of failure that is only known to belong to a certain subset of all possible failures. This case is known as masking. In the presence of masking, the maximum likelihood estimates (MLEs) of the model parameters are obtained through an expectation-maximization (EM) algorithm by treating the causes of failure as missing values. The effect of incomplete information on the estimation of the parameters is studied through a Monte Carlo simulation. Finally, a real example is analyzed to illustrate the application of the proposed methods.
Keywords: cause of failure, linear degradation path, reliability function, expectation-maximization algorithm, intensity, masked data
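For readers unfamiliar with the tampered failure rate idea, one standard textbook formulation (given here for context and not necessarily the exact parameterisation adopted in the paper) writes the hazard under a simple step-stress test with stress-change time τ as

```latex
\lambda(t) =
\begin{cases}
\lambda_0(t), & 0 \le t < \tau, \\
\alpha\,\lambda_0(t), & t \ge \tau,
\end{cases}
```

where λ₀(t) is the baseline intensity under the initial stress level and α > 0 is the tampering factor capturing the acceleration caused by the higher stress.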
Procedia PDF Downloads 335
4282 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment
Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane
Abstract:
Digital investigators often have a hard time spotting evidence in digital information. It has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technologies, and specific procedures used in the digital investigation are not keeping up with criminal developments. Therefore, criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence (AI) is invaluable in identifying crime. It has been observed that algorithms based on AI are highly effective in detecting risks, preventing criminal activity, and forecasting illegal activity. Providing objective data and conducting an assessment is the goal of digital forensics and digital investigation, which will assist in developing a plausible theory that can be presented as evidence in court. Researchers and other authorities have used the available data as evidence in court to convict a person. This research paper aims at developing a multiagent framework for digital investigations using specific intelligent software agents (ISAs). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent are dependent on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The MADIK framework is implemented using the Java Agent Development Framework within Eclipse, a Postgres repository, and a rule engine for agent reasoning. The proposed framework was tested using the Lone Wolf image files and datasets. Experiments were conducted using various sets of ISAs and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. As a result of loading the agents, 5 percent of the time was lost; the File Path Agent prescribed deleting 1,510, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing for further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
Keywords: artificial intelligence, computer science, criminal investigation, digital forensics
Procedia PDF Downloads 213
4281 Haemobiogram after Intramuscular Administration of Amoxicillin to Sheep
Authors: Amer Elgerwi, Abdelrazzag El-Magdoub, Abubakr El-Mahmoudy
Abstract:
There are many bacterial infections affecting sheep that necessitate antibiotic intervention. Amoxicillin is among the commonly used antibiotics in such cases due to its broad spectrum of activity. However, the alterations in blood and organ function that may occur during or after treatment are questionable. Therefore, the aim of the present study was to assess the possible alterations in blood parameters and organ function biomarkers of sheep that may occur following intramuscular injection of amoxicillin. Amoxicillin was administered intramuscularly to 10 sheep at a dosage regimen of 7 mg/kg of body weight for 5 successive days. Two types of blood samples (with and without anticoagulant) were collected from the jugular vein pre- and post-administration of the drug. Amoxicillin significantly (P < 0.001) increased the total leukocyte count and (P < 0.05) the absolute eosinophil count when compared with those of the control samples. Aspartate aminotransferase, alkaline phosphatase and cholesterol were significantly (P < 0.05) higher than the corresponding control values. In addition, amoxicillin significantly (P < 0.05) increased blood urea nitrogen and creatinine but decreased the phosphorus level when compared with those of the prior-administration samples. These data may indicate that, although the side changes caused by amoxicillin in sheep are minor, liver and kidney functions should be monitored during its use in therapy, and it should be used with care for the treatment of sheep with renal and/or hepatic impairment.
Keywords: amoxicillin, biogram, haemogram, sheep
Procedia PDF Downloads 458
4280 Application of a Hybrid QFD-FEA Methodology for Nigerian Garment Designs
Authors: Adepeju A. Opaleye, Adekunle Kolawole, Muyiwa A. Opaleye
Abstract:
Consumers' perceived quality of imported products has been an impediment to business in the Nigerian garment industry. To improve patronage of made-in-Nigeria designs, the first step is to understand what the consumer expects, and then proffer ways to meet this expectation through product redesign or improvement of the garment production process. The purpose of this study is to investigate the drivers of consumers' value for typical Nigerian garment design (NGD). An integrated quality function deployment (QFD) and functional, expressive and aesthetic (FEA) consumer needs methodology helps to minimize incorrect understanding of potential consumers' requirements in mass-customized garments. Six themes emerged as drivers of consumer satisfaction: (1) style variety, (2) dimensions, (3) finishing, (4) fabric quality, (5) garment durability, and (6) aesthetics. Existing designs were found to lead foreign designs in terms of acceptance for informal events, style variety and fit; the latter may be linked to their mode of acquisition. A conceptual model of NGD acceptance in the context of the consumer's inherent characteristics, the social environment and the business environment is proposed.
Keywords: perceived quality, garment design, quality function deployment, FEA model, mass customisation
Procedia PDF Downloads 137
4279 A Theorem Related to Sample Moments and Two Types of Moment-Based Density Estimates
Authors: Serge B. Provost
Abstract:
Numerous statistical inference and modeling methodologies are based on sample moments rather than the actual observations. A result justifying the validity of this approach is introduced. More specifically, it will be established that, given the first n moments of a sample of size n, one can recover the original n sample points. This implies that a sample of size n and its first n associated moments contain precisely the same amount of information. However, it is efficient to make use of a limited number of initial moments, as most of the relevant distributional information is included in them. Two types of density estimation techniques that rely on such moments will be discussed. The first one expresses a density estimate as the product of a suitable base density and a polynomial adjustment whose coefficients are determined by equating the moments of the density estimate to the sample moments. The second one assumes that the derivative of the logarithm of a density function can be represented as a rational function. This gives rise to a system of linear equations involving sample moments; the density estimate is then obtained by solving a differential equation. Unlike kernel density estimation, these methodologies are ideally suited to modelling 'big data' as they only require a limited number of moments, irrespective of the sample size. What is more, they produce simple closed-form expressions that are amenable to algebraic manipulation. They also turn out to be more accurate, as will be shown in several illustrative examples.
Keywords: density estimation, log-density, polynomial adjustments, sample moments
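The first technique (a base density multiplied by a polynomial adjustment with moment-matched coefficients) can be sketched in a few lines of Python; the standard normal base density, the degree-4 adjustment and the standardization of the data are arbitrary illustrative choices, not prescriptions from the paper.

```python
import numpy as np
from scipy import stats, integrate

def moment_adjusted_density(sample, degree=4):
    """Density estimate f(x) = phi(x) * sum_j c_j x^j, with coefficients chosen so that
    the first `degree` moments of the estimate match the sample moments."""
    z = (sample - sample.mean()) / sample.std()                  # work on standardized data
    m = np.array([np.mean(z ** i) for i in range(degree + 1)])   # sample moments
    phi = stats.norm.pdf
    # A[i, j] = integral of x^(i+j) * phi(x) dx, so that A @ c = m
    A = np.empty((degree + 1, degree + 1))
    for i in range(degree + 1):
        for j in range(degree + 1):
            A[i, j] = integrate.quad(lambda x, p=i + j: x ** p * phi(x), -np.inf, np.inf)[0]
    c = np.linalg.solve(A, m)
    return lambda x: phi(x) * np.polynomial.polynomial.polyval(x, c)

rng = np.random.default_rng(1)
data = rng.gamma(shape=3.0, size=5000)
f = moment_adjusted_density(data)
print(f(np.linspace(-3, 3, 7)))          # estimate evaluated on the standardized scale
```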
Procedia PDF Downloads 165
4278 The Control of Wall Thickness Tolerance during Pipe Purchase Stage Based on Reliability Approach
Authors: Weichao Yu, Kai Wen, Weihe Huang, Yang Yang, Jing Gong
Abstract:
Metal-loss corrosion is a major threat to the safety and integrity of gas pipelines, as it may result in burst failures, which can cause severe consequences including enormous economic losses as well as personnel casualties. Therefore, it is important to ensure corroding pipeline integrity and efficiency, considering the value of the wall thickness, which plays an important role in the failure probability of a corroding pipeline. In practice, the wall thickness is controlled during the pipe purchase stage. For example, the API SPEC 5L standard regulates the allowable tolerance of the wall thickness from the specified value during pipe purchase. The allowable wall thickness tolerance is used to determine the wall thickness distribution characteristics, such as the mean value, standard deviation and distribution. Taking the uncertainties of the input variables in the burst limit-state function into account, the reliability approach, rather than the deterministic approach, is used to evaluate the failure probability. Moreover, the cost of pipe purchase is influenced by the allowable wall thickness tolerance: stricter control of the wall thickness usually corresponds to a higher pipe purchase cost. Therefore, changing the wall thickness tolerance will vary both the probability of a burst failure and the cost of the pipe. This paper describes an approach to optimize the wall thickness tolerance considering both the safety and economy of corroding pipelines. In this paper, the corrosion burst limit-state function in Annex O of CSA Z662-7 is employed to evaluate the failure probability using the Monte Carlo simulation technique. By changing the allowable wall thickness tolerance, the parameters of the wall thickness distribution in the limit-state function are changed. Using the reliability approach, the corresponding variations in the burst failure probability are shown. On the other hand, changing the wall thickness tolerance leads to a change in pipe purchase cost. Using the variation of the failure probability and pipe cost caused by changing the wall thickness tolerance specification, the optimal allowable tolerance can be obtained and used to define pipe purchase specifications.
Keywords: allowable tolerance, corroding pipeline segment, operation cost, production cost, reliability approach
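A generic Monte Carlo estimate of a burst failure probability from a limit-state function g = (burst pressure) − (operating pressure) is sketched below; the simplified burst model and all distribution parameters are placeholders for illustration only and are not the Annex O model of CSA Z662-7 used in the paper.

```python
import numpy as np

rng = np.random.default_rng(2024)
N = 1_000_000

# Illustrative random variables (all parameters are assumptions, not values from the paper)
D = 0.610                                         # pipe outside diameter, m (fixed)
wt = rng.normal(9.5e-3, 0.3e-3, N)                # wall thickness, m (tolerance -> std dev)
sigma_u = rng.normal(530e6, 25e6, N)              # tensile strength, Pa
d = rng.lognormal(np.log(2.0e-3), 0.4, N)         # corrosion defect depth, m
p_op = 10e6                                       # operating pressure, Pa (fixed)

# Simplified burst-pressure model for a part-wall metal-loss defect (illustrative only)
p_burst = (2 * wt * sigma_u / D) * (1 - d / wt)

g = p_burst - p_op                                # limit-state function: failure when g < 0
pf = np.mean(g < 0)
print(f"estimated burst failure probability: {pf:.2e}")
```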
Procedia PDF Downloads 396
4277 Cardiometabolic Risk Factors Responses to Supplemental High Intensity Exercise in Middle School Children
Authors: R. M. Chandler, A. J. Stringer
Abstract:
In adults, short bursts of high-intensity exercise (intensities between 80-95% of maximum heart rate) increase cardiovascular and metabolic function without the time investment of traditional aerobic training. Similar improvements in various health indices are also becoming increasingly evident in children in countries other than the United States. In the United States, physical education programs have become shorter in length and fewer in frequency. Against this background, it is imperative that health and physical educators deliver well-organized and focused fitness programs that can be tolerated across many different somatotypes. Perhaps the least effective lag-time in a US physical education (PE) class is the first 10 minutes, during which children warm up. Replacing a traditional PE warmup with a 10-minute high-intensity exercise protocol is a time-efficient method to impact health, leaving as much time as possible for other PE material such as skill development and motor behavior development. This supplemental 10-minute high-intensity exercise increases cardiovascular function and induces favorable body composition changes in as little as six weeks, with further enhancement throughout a semester of activity. The supplemental high-intensity exercise did not detract from the PE lesson outcomes.
Keywords: cardiovascular fitness, high intensity interval training, high intensity exercise, pediatric
Procedia PDF Downloads 136
4276 Discriminant Function Based on Circulating Tumor Cells for Accurate Diagnosis of Metastatic Breast Cancer
Authors: Hatem A. El-Mezayen, Ahmed Abdelmajeed, Fatehya Metwally, Usama Elsaly, Salwa Atef
Abstract:
Tumor metastasis involves the dissemination of malignant cells into the basement membrane and vascular system, which contributes to the circulating pool of these markers. In this context, our aim has been focused on the development of a non-invasive diagnostic approach. Circulating tumor cells (CTCs) represent a unique liquid biopsy carrying comprehensive biological information on the primary tumor. Herein, we sought to develop a novel score based on the combination of the most significant CTC biomarkers with routine laboratory tests for accurate detection of metastatic breast cancer. Methods: Cytokeratin 18 (CK18), cytokeratin 19 (CK19), and CA15.3 were assayed in metastatic breast cancer (MBC) patients (75), non-MBC patients (50) and healthy controls (20). Results: Areas under the receiver operating characteristic curve (AUCs) were calculated and used for the construction of the novel score. The novel score, named MBC-CTCs, is: MBC-CTCs = CA15.3 (U/L) × 0.08 + CK18 % × 2.9 + CK19 × 3.1 − 510. That function correctly classified 87% of metastatic breast cancer cases at a cut-off value of 0.55 (i.e., a value greater than 0.55 indicates a patient with metastatic breast cancer, and a value less than 0.55 indicates a patient with non-metastatic breast cancer). Conclusion: MBC-CTCs is a novel, non-invasive and simple score that can be applied to discriminate patients with metastatic breast cancer.
Keywords: metastatic breast cancer, circulating tumor cells, cytokeratin, EpiCam
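Because the published discriminant function is a simple linear combination, it can be applied directly; the sketch below implements the stated formula and cut-off in Python. The example input values are made up for demonstration, and treating CK19 as a percentage like CK18 is an assumption, since the abstract does not state its unit.

```python
def mbc_ctcs_score(ca15_3_u_per_l: float, ck18_percent: float, ck19_percent: float) -> float:
    """MBC-CTCs discriminant score as reported in the abstract."""
    return ca15_3_u_per_l * 0.08 + ck18_percent * 2.9 + ck19_percent * 3.1 - 510.0

def classify(score: float, cutoff: float = 0.55) -> str:
    """Scores above the cut-off indicate metastatic breast cancer."""
    return "metastatic breast cancer" if score > cutoff else "non-metastatic breast cancer"

# Hypothetical patient values (for demonstration only)
s = mbc_ctcs_score(ca15_3_u_per_l=120.0, ck18_percent=85.0, ck19_percent=82.0)
print(s, classify(s))
```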
Procedia PDF Downloads 214
4275 The Relationship Between Sleep Characteristics and Cognitive Impairment in Patients with Alzheimer's Disease
Authors: Peng Guo
Abstract:
Objective: This study investigates the clinical characteristics of sleep disorders (SD) in patients with Alzheimer's disease (AD) and their relationship with cognitive impairment. Methods: According to the inclusion and exclusion criteria for AD, 460 AD patients were consecutively included at Beijing Tiantan Hospital from January 2016 to April 2022. Demographic data, including gender, age, age of onset, course of disease, years of education and body mass index, were collected. The Pittsburgh sleep quality index (PSQI) scale was used to evaluate overall sleep status. AD patients with PSQI ≥ 7 were assigned to the AD with SD (AD-SD) group, and those with PSQI < 7 to the AD with no SD (AD-nSD) group. The overall cognitive function of AD patients was evaluated by the Mini-mental state examination (MMSE) and Montreal cognitive assessment (MoCA) scales, memory was evaluated by the AVLT-immediate recall, AVLT-delayed recall and CFT-delayed memory scales, language was evaluated by the BNT scale, visuospatial ability was evaluated by CFT-imitation, executive function was evaluated by the Stroop-A, Stroop-B and Stroop-C scales, and attention was evaluated by the TMT-A, TMT-B, and SDMT scales. The correlation between cognitive function and PSQI score in the AD-SD group was analyzed. Results: Among the 460 AD patients, 173 cases (37.61%) had SD. There was no significant difference in gender, age, age of onset, course of disease, years of education or body mass index between the AD-SD and AD-nSD groups (P>0.05). The PSQI factors with significant differences between the AD-SD and AD-nSD groups included sleep quality, sleep latency, sleep duration, sleep efficiency, sleep disturbance, use of sleeping medication and daytime dysfunction (P<0.05). Compared with the AD-nSD group, the total scores on the MMSE, MoCA, AVLT-immediate recall and CFT-imitation scales in the AD-SD group were significantly lower (P<0.01, P<0.01, P<0.01, P<0.05). In the AD-SD group, subjective sleep quality was significantly and negatively correlated with the scores on the MMSE, MoCA, AVLT-immediate recall and CFT-imitation scales (r=-0.277, P=0.000; r=-0.216, P=0.004; r=-0.253, P=0.001; r=-0.239, P=0.004), and daytime dysfunction was significantly and negatively correlated with the score on the AVLT-immediate recall scale (r=-0.160, P=0.043). Conclusion: The incidence of AD-SD is 37.61%. AD-SD patients have worse subjective sleep quality, a longer time to fall asleep, shorter sleep time, lower sleep efficiency, more severe nighttime SD, more use of sleep medicine, and more severe daytime dysfunction. The overall cognitive function, immediate recall and visuospatial ability of AD-SD patients are significantly impaired and are closely correlated with the decline of subjective sleep quality. The impairment of immediate recall is highly correlated with daytime dysfunction in AD-SD patients.
Keywords: Alzheimer's disease, sleep disorders, cognitive impairment, correlation
Procedia PDF Downloads 32
4274 The Design of Intelligent Passenger Organization System for Metro Stations Based on Anylogic
Authors: Cheng Zeng, Xia Luo
Abstract:
Passenger organization has always been an essential part of China's metro operation and management. Facing massive passenger flows, stations need to improve their degree of intelligence and automation through an appropriate integrated system. Based on the existing integrated supervisory control system (ISCS) and simulation software (Anylogic), this paper designs an intelligent passenger organization system (IPOS) for metro stations. Its primary functions include passenger information acquisition, data processing and computing, visualization management, decision recommendations, and decision response based on interlocking equipment. For this purpose, the logical structure and the intelligent algorithms employed are specifically devised. Besides, the structure diagram of the information acquisition and application module, the application of Anylogic, and the functional process of the case library are all given in this research. Based on the secondary development of Anylogic and existing technologies such as video recognition, the IPOS is expected to improve the response speed and handling capacity of metro stations in the face of emergent passenger flows.
Keywords: anylogic software, decision-making support system, intellectualization, ISCS, passenger organization
Procedia PDF Downloads 176
4273 Existence and Stability of Periodic Traveling Waves in a Bistable Excitable System
Authors: M. Osman Gani, M. Ferdows, Toshiyuki Ogawa
Abstract:
In this work, we propose a modified FHN-type reaction-diffusion system for a bistable excitable system by adding a scaled function obtained from a given function. We study the existence and stability of the periodic traveling waves (or wavetrains) for the FitzHugh-Nagumo (FHN) system and the modified one, and compare the results. The stability results for the periodic traveling waves (PTWs) indicate that most of the solutions in the fast family of PTWs are stable for the FitzHugh-Nagumo equations. Instability occurs only in waves having smaller periods, and the smaller-period waves are always unstable. The fast family with sufficiently large periods is always stable in the FHN model. We find that the oscillation of pulse widths is absent in the standard FHN model. This motivates us to study the PTWs in the proposed FHN-type reaction-diffusion system for bistable excitable media. A good agreement is found between the solutions of the traveling wave ODEs and the corresponding whole-PDE simulation.
Keywords: bistable system, Eckhaus bifurcation, excitable media, FitzHugh-Nagumo model, periodic traveling waves
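For reference, a commonly used form of the FitzHugh-Nagumo reaction-diffusion system (a standard textbook version given here only for context; the paper's modified system adds a further scaled term to it) is

```latex
\begin{aligned}
u_t &= D\,u_{xx} + u(u-a)(1-u) - w,\\
w_t &= \varepsilon\,(u - \gamma w),
\end{aligned}
```

where u is the activator, w the recovery variable, D the diffusion coefficient, 0 < a < 1 the excitation threshold, and ε, γ > 0 control the slow recovery dynamics.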
Procedia PDF Downloads 187
4272 Implementing Quality Function Deployment Tool for a Customer Driven New Product Development in a Kuwait SME
Authors: Asma AlQahtani, Jumana AlHadad, Maryam AlQallaf, Shoug AlHasan
Abstract:
New product development (NPD) is the complete process of bringing a new product to the customer by integrating two broad divisions: one involving idea generation, product design and detail engineering, and the other involving market research and marketing analysis. It is common practice for companies to undertake some of these tasks simultaneously (concurrent engineering) and also to consider them as an ongoing process (continuous development). The current study explores the framework and methodology for a new product development process utilizing the Quality Function Deployment (QFD) tool for bringing the customer's opinion into the product development process. An elaborate customer survey with focus groups in the region was carried out to ensure that customer requirements are integrated into new products as early as the design stage, including identifying the recognition of need for the new product. A QFD matrix (House of Quality) was prepared that links customer requirements to product engineering requirements, and a feasibility study and risk assessment exercise was carried out for a Small and Medium Enterprise (SME) in Kuwait for the development of the new product. SMEs in Kuwait, particularly in the manufacturing sector, are mainly focused on serving local demand, and a lack of product quality often adversely affects the ability of these companies to compete on a regional/global basis. Further, a lack of focus on identifying customer requirements often deters SMEs from envisaging the idea of a new product development. The current study therefore focuses on utilizing the QFD matrix from the conceptual design to the detail design and, to some extent, extending this link to the design of the manufacturing system. The outcome of the project was the development of a prototype for a new molded product which can ensure consistency between the customer's requirements and the measurable characteristics of the product. Engineering economics and cost studies were also undertaken to analyse the viability of the new product, the results of which were also linked to the successful implementation of the initial QFD matrix.
Keywords: Quality Function Deployment, QFD matrix, new product development, NPD, Kuwait SMEs, prototype development
Procedia PDF Downloads 416