Search results for: form function
9402 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions
Authors: Valerii Dashuk
Abstract:
The use of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with this technique allows the shift and the probability of that shift (i.e., portfolio risk) to be checked simultaneously. Another application concerns the normal distribution, which is fully defined by its mean and variance and can therefore be tested with the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters that are to be compared (i.e., whether they may be considered identical at a given significance level). The absolute difference in probabilities at each point of the domain of these distributions is then calculated. This measure is transformed into a function of the cumulative distribution functions and compared with critical values. The table of critical values was built from simulations. The approach was compared with other techniques for the univariate case; it differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical versus real significance level). Stable results when working with outliers and non-normal distributions, as well as scalability, are further strengths of the method. Its main advantage is that it can be extended to the infinite-dimensional case, which was not possible in most previous works. At present, the extension to the two-dimensional case is complete, allowing up to five parameters to be tested jointly. The derived technique is therefore equivalent to classic tests in standard situations but offers a more efficient alternative in non-standard problems and on large amounts of data.
Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function
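A minimal numerical sketch of the kind of CDF-based comparison described above is given below; the specific statistic (the supremum of the absolute difference between the two normal CDFs) and the simulation-based critical value are illustrative assumptions, not the author's exact measure or critical-value table.

```python
# Illustrative sketch (not the author's code): compare two normal parameter sets
# via a CDF-based distance and a simulated critical value.
import numpy as np
from scipy.stats import norm

def cdf_distance(mu1, s1, mu2, s2, grid):
    """Sup of the absolute difference between the two normal CDFs (assumed measure)."""
    return np.max(np.abs(norm.cdf(grid, mu1, s1) - norm.cdf(grid, mu2, s2)))

def critical_value(mu, s, n, alpha=0.05, n_sim=2000, rng=np.random.default_rng(0)):
    """Simulate the statistic under H0 (both samples from N(mu, s^2)) and take a quantile."""
    grid = np.linspace(mu - 6 * s, mu + 6 * s, 801)
    stats = []
    for _ in range(n_sim):
        a, b = rng.normal(mu, s, n), rng.normal(mu, s, n)
        stats.append(cdf_distance(a.mean(), a.std(ddof=1), b.mean(), b.std(ddof=1), grid))
    return np.quantile(stats, 1 - alpha)

# Example: joint test of (mean, variance) equality for two samples of size 100
grid = np.linspace(-6, 6, 801)
d = cdf_distance(0.0, 1.0, 0.3, 1.4, grid)
print(d > critical_value(0.0, 1.0, n=100))  # True -> reject joint equality
```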
Procedia PDF Downloads 173
9401 Back Extraction and Isolation of Alkaloids from Ionic Liquid-Based Extracts
Authors: Rozalina Keremedchieva, Ivan Svinyarov, Milen G. Bogdanov
Abstract:
In continuation of a research project on the application of ionic liquids (ILs) as an alternative to the conventional organic solvents used in the recovery of value-added chemicals of industrial interest [1-3], we developed a procedure for the back extraction and isolation, in pure form, of the biologically active alkaloid glaucine from IL-based aqueous solutions. One of the approaches applied was the formation of two-phase systems (IL-ATPS) by the addition of kosmotropic salts to the plant extract. The ability of the salts (Na2CO3, MgSO4, (NH4)2SO4, NaH2PO4) to induce the formation of two-phase systems and the influence of the pH value on the partition coefficients of glaucine were comprehensively studied. As a result, it was found that the target alkaloid preferentially partitions into the IL-rich phase regardless of the pH value of the medium, which shows that this approach is inapplicable for isolating the target compound from the ionic liquid. However, the results obtained can be used as a platform for the development of an analytical method for the quantitative determination of low concentrations of glaucine in biological samples. We further examined the ability of a series of organic solvents, such as diethyl ether, tert-butyl methyl ether, ethyl acetate, butyl acetate, toluene, chloroform and dichloromethane, to recover glaucine from raw IL-based aqueous extracts. Optimal conditions for the quantitative extraction of glaucine into chloroform were found, from which, after removal of the solvent and subsequent recrystallization from ethanol, the target compound was isolated in high purity as a hydrobromide salt – the form in which it enters various medicines as an active ingredient.
Keywords: natural products, ionic liquids, solid-liquid extraction, liquid-liquid extraction
Procedia PDF Downloads 476
9400 Modelling and Simulation Efforts in Scale-Up and Characterization of Semi-Solid Dosage Forms
Authors: Saurav S. Rath, Birendra K. David
Abstract:
The generic pharmaceutical industry has to operate under strict timelines for product development and scale-up from lab to plant. Hence, detailed product and process understanding and the implementation of appropriate mechanistic modelling and Quality-by-Design (QbD) approaches are imperative in the product life cycle. This work provides example cases of such efforts in topical dosage products. Topical products are typically in the form of emulsions, gels, thick suspensions or even simple solutions. The efficacy of such products is determined by characteristics like rheology and morphology. Defining, and scaling up, the right manufacturing process with a given set of ingredients to achieve the right product characteristics presents a challenge to the process engineer. For example, the non-Newtonian rheology varies not only with critical process parameters (CPPs) and critical material attributes (CMAs) but is also an implicit function of globule size, a critical quality attribute (CQA). Hence, this calls for various mechanistic models to help predict the product behaviour. This paper focuses on such models obtained from computational fluid dynamics (CFD) coupled with population balance modelling (PBM) and constitutive models (like shear and energy density). In the special case of the use of high-shear homogenisers (HSHs) for the manufacture of thick emulsions/gels, this work presents findings on (i) a scale-up algorithm for HSHs using shear strain, a novel scale-up parameter, for estimating mixing parameters, (ii) the non-linear relationship between viscosity and the shear imparted into the system, and (iii) the effect of hold time on the rheology of the product. Specific examples of how this approach enabled scale-up across the 1 L, 10 L, 200 L, 500 L and 1000 L scales will be discussed.
Keywords: computational fluid dynamics, morphology, quality-by-design, rheology
Procedia PDF Downloads 268
9399 Improvement of Process Competitiveness Using Intelligent Reference Models
Authors: Julio Macedo
Abstract:
Several methodologies are now available for conceiving the improvements that make a process competitive, for example total quality, process reengineering, Six Sigma, and the define-measure-analyze-improve-control (DMAIC) method. These improvements are of different natures and can be external to the process, which is represented by an optimization model or a discrete simulation model. In addition, there are several process stakeholders, each with different desired performances for the process. Hence, the methodologies above do not offer a tool to aid in the conception of the required improvements. In order to fill this void, we suggest the use of intelligent reference models. A reference model is a set of qualitative differential equations and an objective function that minimizes the gap between the current and the desired performance indexes of the process. The reference models are intelligent: when they receive the current state of the problematic process and the desired performance indexes, they generate the required improvements for the problematic process. The reference models are fuzzy cognitive maps augmented with an objective function and trained using the improvements implemented by high-performance firms. Experiments with a group of students show that the reference models allow them to conceive more improvements than students who do not use these models.
Keywords: continuous improvement, fuzzy cognitive maps, process competitiveness, qualitative simulation, system dynamics
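The following is a minimal sketch of a fuzzy cognitive map iterated to a steady state with a gap-based objective, in the spirit of the reference models described above; the concept names, weight matrix, and sigmoid update rule are assumptions for illustration, not the author's trained maps.

```python
# Illustrative sketch (assumed structure, not the author's model): a small fuzzy
# cognitive map iterated to a steady state, with an objective that measures the
# gap between simulated and desired performance indexes.
import numpy as np

def fcm_run(W, state, n_iter=50):
    """Iterate concept activations with a sigmoid squashing function."""
    for _ in range(n_iter):
        state = 1.0 / (1.0 + np.exp(-(state + W.T @ state)))
    return state

# Hypothetical concepts: [training, automation, defect rate, throughput]
W = np.array([[0.0, 0.2, -0.5, 0.3],
              [0.0, 0.0, -0.4, 0.6],
              [0.0, 0.0,  0.0, -0.7],
              [0.0, 0.0,  0.0,  0.0]])
current = np.array([0.2, 0.1, 0.8, 0.4])
desired = np.array([np.nan, np.nan, 0.2, 0.8])   # targets only for performance indexes

simulated = fcm_run(W, current)
mask = ~np.isnan(desired)
gap = np.sum((simulated[mask] - desired[mask]) ** 2)  # objective to minimize
print(simulated.round(2), gap.round(3))
```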
Procedia PDF Downloads 86
9398 Comparative Analysis of Islamic Bank in Indonesia and Malaysia with Risk Profile, Good Corporate Governance, Earnings, and Capital Method: Performance of Business Function and Social Function Perspective
Authors: Achsania Hendratmi, Nisful Laila, Fatin Fadhilah Hasib, Puji Sucia Sukmaningrum
Abstract:
This study aims to compare Islamic banks in Indonesia and Malaysia and to examine the differences between them using the RGEC method (Risk Profile, Good Corporate Governance, Earnings, and Capital). The study examines the comparison of the business and social performance of eleven Islamic banks in Indonesia and fifteen Islamic banks in Malaysia. This research used a quantitative approach, and the data were collected from the annual reports of the sampled banks over the period 2011-2015. The results of the independent-samples t-test and the Mann-Whitney test showed that there were differences in the business performance of Islamic banks in Indonesia and Malaysia in terms of risk profile (FDR), GCG, and earnings (ROA). There were also differences in business and social performance in terms of earnings (ROE), capital (CAR), and the sharia conformity indicators (PSR and ZR).
Keywords: business performance, Islamic banks, RGEC, social performance
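A short sketch of the two comparison tests used in the study is given below; the ROA values are invented for illustration and are not the study's data.

```python
# Illustrative sketch of the two comparison tests used in the study (the ROA
# values below are made up, not the study's data).
import numpy as np
from scipy import stats

roa_indonesia = np.array([1.2, 0.8, 1.5, 0.9, 1.1, 1.4, 0.7, 1.0, 1.3, 0.6, 1.1])
roa_malaysia  = np.array([0.9, 1.1, 1.0, 1.2, 0.8, 1.0, 1.1, 0.9, 1.3, 1.0, 1.2,
                          0.9, 1.1, 1.0, 1.2])

# Independent-samples t-test (parametric) and Mann-Whitney U test (non-parametric)
t_stat, t_p = stats.ttest_ind(roa_indonesia, roa_malaysia, equal_var=False)
u_stat, u_p = stats.mannwhitneyu(roa_indonesia, roa_malaysia, alternative="two-sided")
print(f"t-test p={t_p:.3f}, Mann-Whitney p={u_p:.3f}")  # p < 0.05 -> performance differs
```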
Procedia PDF Downloads 293
9397 Parameters of Main Stage of Discharge between Artificial Charged Aerosol Cloud and Ground in Presence of Model Hydrometeor Arrays
Authors: D. S. Zhuravkova, A. G. Temnikov, O. S. Belova, L. L. Chernensky, T. K. Gerastenok, I. Y. Kalugina, N. Y. Lysov, A.V. Orlov
Abstract:
Investigation of discharges from artificial charged water aerosol clouds in the presence of arrays of model hydrometeors could help obtain new data about the peculiarities of return stroke formation between a thundercloud and the ground when large volumes of hail particles participate in lightning discharge initiation and propagation. Artificial charged water aerosol clouds of negative or positive polarity with a potential of up to one million volts have been used. Hail has been simulated by groups of conductive model hydrometeors of different forms. The parameters of the impulse current of the main stage of the discharge between the artificial positively or negatively charged water aerosol cloud and the ground in the presence of the model hydrometeor array, and of its corresponding electromagnetic radiation, have been determined. It was established that the parameters of the model hydrometeor array influence the parameters of the main stage of the discharge between the artificial thundercloud cell and the ground. The maximal values of the main stage current impulse parameters and of the electromagnetic radiation registered by the plate antennas were found for the array of model hydrometeors of cylinder-of-revolution form for the negatively charged aerosol cloud and for the array of hydrometeors of plate rhombus form for the positively charged aerosol cloud, correspondingly. It was found that the parameters of the main stage of the discharge between the artificial charged water aerosol cloud and the ground in the presence of a model hydrometeor array of the considered forms depend on the polarity of the artificial charged aerosol cloud. On average, for all forms of the investigated model hydrometeor arrays, the amplitude and current rise of the main stage impulse current and the amplitude of the corresponding electromagnetic radiation for the artificial charged aerosol cloud of positive polarity were 1.1-1.9 times higher than for the charged aerosol cloud of negative polarity. Thus, the obtained results could indicate a possibly more important role of large volumes of hail in the thundercloud for the return stroke parameters of positive lightning.
Keywords: main stage of discharge, hydrometeor form, lightning parameters, negative and positive artificial charged aerosol cloud
Procedia PDF Downloads 255
9396 Intensity-Enhanced Super-Resolution Amplitude Apodization Effect on the Non-Spherical Near-Field Particle-Lenses
Authors: Liyang Yue, Bing Yan, James N. Monks, Rakesh Dhama, Zengbo Wang, Oleg V. Minin, Igor V. Minin
Abstract:
A particle can function as a refractive lens to focus a plane wave, generating a narrow, highly intense, weakly diverging beam within a sub-wavelength volume, known as the 'photonic jet'. The refractive index contrast (particle to background medium) and the scaling effect of the dielectric particle (its size relative to the wavelength) play key roles in photonic jet formation, rather than the shape of the particle-lens. The waist (full width at half maximum, FWHM) of a photonic jet can be beyond the diffraction limit and smaller than the Airy disk, which defines the minimum distance between two objects to be imaged as two instead of one. Many important applications for imaging and sensing have been enabled by the super-resolution characteristic of the photonic jet. It is known that the apodization method, in the form of an amplitude pupil mask centrally situated on a particle-lens, can further reduce the waist of a photonic nanojet but usually lowers its intensity at the focus due to the blocking of incident light. In this paper, an anomalously intensity-enhanced apodization effect was discovered in the near field via numerical simulation. It was also experimentally verified by a scale model using a copper-masked Teflon cuboid solid immersion lens (SIL) with a 22 mm side length under illumination by a plane wave with an 8 mm wavelength. The peak intensity and the lateral resolution of the produced photonic jet were enhanced by about 36.0% and 36.4%, respectively, with this approach. This phenomenon may exhibit a scale effect and be valid in multiple frequency bands.
Keywords: apodization, particle-lens, scattering, near-field optics
Procedia PDF Downloads 187
9395 Local Spectrum Feature Extraction for Face Recognition
Authors: Muhammad Imran Ahmad, Ruzelita Ngadiran, Mohd Nazrin Md Isa, Nor Ashidi Mat Isa, Mohd Zaizu Ilyas, Raja Abdullah Raja Ahmad, Said Amirul Anwar Ab Hamid, Muzammil Jusoh
Abstract:
This paper presents two techniques, local feature extraction using the image spectrum and low-frequency spectrum modelling using Gaussian mixture models (GMMs), to capture the underlying statistical information and improve the performance of a face recognition system. Local spectrum features are extracted using an overlapping sub-block window that is mapped onto the face image. For each block, the spatial domain is transformed to the frequency domain using the discrete Fourier transform (DFT). The low-frequency coefficients are preserved by discarding the high-frequency coefficients, applying a rectangular mask to the spectrum of the facial image. The low-frequency information is non-Gaussian in the feature space, and by using a combination of several Gaussian functions with different statistical properties, the best feature representation can be modelled with a probability density function. The recognition process is performed using the maximum likelihood value computed from the pre-calculated GMM components. The method is tested on the FERET data sets and achieves a 92% recognition rate.
Keywords: local features modelling, face recognition system, Gaussian mixture models, FERET
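The sketch below illustrates the described pipeline of overlapping block DFTs, a rectangular low-frequency mask, and GMM modelling; the block size, mask size, and number of Gaussian components are assumptions, not the paper's settings.

```python
# Minimal sketch of the described pipeline (block sizes, mask size and number of
# Gaussian components are assumptions, not the paper's settings).
import numpy as np
from sklearn.mixture import GaussianMixture

def local_spectrum_features(img, block=16, step=8, keep=4):
    """Slide an overlapping window, take the 2-D DFT and keep a low-frequency patch."""
    feats = []
    for i in range(0, img.shape[0] - block + 1, step):
        for j in range(0, img.shape[1] - block + 1, step):
            spec = np.fft.fft2(img[i:i + block, j:j + block])
            low = np.abs(spec[:keep, :keep])          # rectangular low-frequency mask
            feats.append(low.ravel())
    return np.vstack(feats)

rng = np.random.default_rng(0)
face = rng.random((64, 64))                           # placeholder for a face image
X = local_spectrum_features(face)

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(X)
print(gmm.score(X))   # log-likelihood used for the maximum-likelihood decision
```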
Procedia PDF Downloads 665
9394 Deepnic, A Method to Transform Each Variable into Image for Deep Learning
Authors: Nguyen J. M., Lucas G., Brunner M., Ruan S., Antonioli D.
Abstract:
Deep learning based on convolutional neural networks (CNNs) is a very powerful technique for classifying information from an image. We propose a new method, DeepNic, to transform each variable of a tabular dataset into an image where each pixel represents a set of conditions that allow the variable to make an error-free prediction. The contrast of each pixel is proportional to its prediction performance, and the color of each pixel corresponds to a sub-family of NICs. NICs are probabilities that depend on the number of inputs to each neuron and the range of coefficients of the inputs. Each variable can therefore be expressed as a function of a matrix of two vectors corresponding to an image whose pixels express predictive capabilities. Our objective is to transform each variable of tabular data into an image that can be analysed by CNNs, unlike other methods, which use all the variables to construct a single image. We analyse the NIC information of each variable and express it as a function of the number of neurons and the range of coefficients used. The predictive value and the category of the NIC are expressed by the contrast and the color of the pixel. We have developed a pipeline to implement this technology and have successfully applied it to genomic expression data from an Affymetrix chip.
Keywords: tabular data, deep learning, perfect trees, NICs
Procedia PDF Downloads 89
9393 Graphic Narratives: Representations of Refugeehood in the Form of Illustration
Authors: Pauline Blanchet
Abstract:
In a world where images are a prominent part of our daily lives and a way of absorbing information, the analysis of the representation of migration narratives is vital. This thesis raises questions concerning the power of illustrations, drawings and visual culture in representing migration narratives in the age of Instagram. The rise of graphic novels and comics has come about in the last fifteen years, specifically with contemporary authors engaging with complex social issues such as migration and refugeehood. As a result, refugee subjects often appear in these narratives, whether as autobiographical stories or through the inclusion of the subject in the creative process. Growth in discourse around migration has also been present in other art forms; in 2018 there were dedicated exhibitions around migration, such as Tania Bruguera at the Tate (2018-2019) and 'Journeys Drawn' at the House of Illustration (2018-2019), and dedicated film festivals (the 2018 Migration Film Festival), which show the recent interest in using the arts as a medium of expression for themes of refugeehood and migration. Graphic visuals are fast becoming a key instrument for representing migration, and the central thesis of this paper is to show the strengths and limitations of this form as well as the methodology used by the actors in the production process. Recent works released in the last ten years have not been analysed in the same context as previous graphic novels such as Palestine and Persepolis. While a lot of research has been done on mass media portrayals of refugees in photography and journalism, there is a lack of literature on their representation in illustration. There is also little research on the accessibility of graphic novels, such as where they can be found and what the intentions are in writing them. It is interesting to see why these authors, NGOs, and curators have decided to highlight migrant narratives at a time when the mainstream media has covered the 'refugee crisis' extensively. Using primary data from one-on-one interviews with artists, curators, and NGOs, this paper investigates the effectiveness of graphic novels for depicting refugee stories as a viable alternative to other mass media forms. The paper is divided into two distinct sections. The first part is concerned with the form of the comic itself and how it either limits or strengthens the representation of migrant narratives. This involves analysing the layered and complex forms that comics allow, such as multimedia pieces, the use of photography and forms of symbolism. It will also show how illustration allows for the anonymity of refugees, the empathetic aspect of the form, and how the history of the graphic novel form has allowed space for positive representations of women in the last decade. The second section analyses the creative and methodological process undertaken by the actors and their involvement in the production of the works.
Keywords: graphic novel, refugee, communication, media, migration
Procedia PDF Downloads 112
9392 Perceptions and Spatial Realities: Women and the City of Limassol
Authors: Anna Papadopoulou
Abstract:
Women's relationship to the post-industrial city has been defined by a reciprocal relationship between women's identity and urban form. Women's place within the social structure has been influenced by often limiting conditions set by the built environment, and, concurrently, women's active role in social processes has definitively impacted urban development. Cities in Cyprus present unique locations for urban investigations pertaining to gender because of the country's particular urban history: unlike most prominent European cities, which have experienced approximately five hundred years of urban growth spurred by industrial development, Cypriot cities did not begin to form until the end of the Ottoman occupation in the last quarter of the nineteenth century. Consequently, Cyprus' urban history is distinctive in that it coincides with international awakenings towards gender equality. This paper is drawn from a study of a contemporary urban narrative of Limassolian women and aims to elucidate spatial and perceptual boundaries that are inherent, constructed and implied. Within the context of this study, gender - in its socially constructed form - becomes a tool for reading and understanding the urban landscape, as well as a vehicle to impact the production and consumption of space. The investigation evaluates urban changes through the lens of women's entry into the workforce, a profound event in the social process, and consequently explores issues of space and time, connectivity and access, and perceptions and awareness. A narrative of gendered urbanism has been derived from semi-structured interviews, with the findings studied, organised, analysed and synthesised through a grounded theory approach. These qualitative findings have been complemented and refined by a series of informal observations and mappings.
Keywords: boundaries, gender, Limassol, urbanism
Procedia PDF Downloads 235
9391 Holographic Visualisation of 3D Point Clouds in Real-time Measurements: A Proof of Concept Study
Authors: Henrique Fernandes, Sofia Catalucci, Richard Leach, Kapil Sugand
Abstract:
Background: Holograms are 3D images formed by the interference of light beams from a laser or other coherent light source. Pepper's ghost is a form of hologram conceptualised in the 19th century. Combining holographic visualisation with metrology measurement techniques, by displaying measurements taken in real time in holographic form, can assist research and education. New structural designs such as the Plexiglass Stand and the Hologram Box can optimise the holographic experience. Method: The equipment used included: (i) Zeiss's ATOS Core 300 optical coordinate measuring instrument, which scanned real-world objects; (ii) Cloud Compare, open-source software used for point cloud processing; and (iii) the Hologram Box, designed and manufactured during this research to provide the blackout environment needed to display 3D point clouds from real-time measurements in holographic format, and to add portability to the holograms. The equipment was tailored to realise the goal of displaying measurements with an innovative technique and to improve on conventional methods. Three test scans were completed before the holographic conversion. Results: The outcome was a precise recreation of the original object in holographic form, presented as dense point clouds with surface density features in a colour map. Conclusion: This work establishes a way to visualise data in a point cloud system. To our knowledge, this has not been attempted before. This achievement provides an advancement in holographic visualisation. The Hologram Box could be used as a feedback tool for measurement quality control and verification in future smart factories.
Keywords: holography, 3D scans, hologram box, metrology, point cloud
Procedia PDF Downloads 88
9390 Generalized Extreme Value Regression with Binary Dependent Variable: An Application for Predicting Meteorological Drought Probabilities
Authors: Retius Chifurira
Abstract:
The logistic regression model is the most widely used regression model for predicting meteorological drought probabilities. When the dependent variable is extreme, however, the logistic model fails to adequately capture drought probabilities. In order to predict drought probabilities adequately, we use the generalized linear model (GLM) with the quantile function of the generalized extreme value distribution (GEVD) as the link function. The method of maximum likelihood estimation is used to estimate the parameters of the generalized extreme value (GEV) regression model. We compare the performance of the logistic and GEV regression models in predicting drought probabilities for Zimbabwe. The performance of the regression models is assessed using goodness-of-fit measures, namely the relative root mean square error (RRMSE) and the relative mean absolute error (RMAE). Results show that the GEV regression model performs better than the logistic model, thereby providing a good alternative candidate for predicting drought probabilities. This paper provides the first application of a GLM derived from extreme value theory to predicting drought probabilities for a drought-prone country such as Zimbabwe.
Keywords: generalized extreme value distribution, general linear model, mean annual rainfall, meteorological drought probabilities
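A minimal sketch of a binary GLM whose inverse link is the GEV cumulative distribution function is shown below; the simulated rainfall covariate, the fixed shape parameter, and the use of a general-purpose optimizer are illustrative assumptions, not the paper's data or estimation code.

```python
# Minimal sketch (not the paper's code) of a binary GLM whose inverse link is the
# GEV cumulative distribution function; the simulated data and fixed shape
# parameter below are assumptions for illustration.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(1)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + standardized rainfall
true_beta = np.array([-0.8, -1.2])
y = (rng.random(n) < genextreme.cdf(X @ true_beta, c=-0.2)).astype(float)  # drought indicator

def negloglik(beta, shape=-0.2):
    p = np.clip(genextreme.cdf(X @ beta, c=shape), 1e-9, 1 - 1e-9)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

fit = minimize(negloglik, x0=np.zeros(2), method="Nelder-Mead")
p_hat = genextreme.cdf(X @ fit.x, c=-0.2)                # predicted drought probabilities
print(fit.x, np.sqrt(np.mean((p_hat - y) ** 2)))         # coefficients and an RMSE-type check
```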
Procedia PDF Downloads 198
9389 Cloning and Expression of Azurin: A Protein Having Antitumor and Cell Penetrating Ability
Authors: Mohsina Akhter
Abstract:
Cancer has become a widespread disease around the globe and takes many lives every year. Different treatments are practiced, but all have potential side effects and somewhat limited specificity towards target sites. Pseudomonas aeruginosa is known to secrete a protein, azurin, with a special anti-cancer function. It has a unique cell-penetrating peptide comprising 18 amino acids that has the ability to enter cancer cells specifically. The reported function of azurin is to stabilize p53 inside tumor cells and induce apoptosis through Bax-mediated cytochrome c release from mitochondria. At laboratory scale, we have made recombinant azurin by cloning the rpTZ57R/T-azu vector into E. coli strain DH-5α and subcloning the rpET28-azu vector into E. coli BL21-CodonPlus (DE3). High expression was ensured with IPTG induction at different concentrations, and a high expression level was then optimized at a 1 mM IPTG concentration for 5 hours. Purification was done using Ni²⁺ affinity chromatography. We conclude that azurin could be a remarkable improvement in cancer therapeutics if it is produced on a large scale. Azurin does not enter normal cells, so it should prove a safe and secure treatment for patients and spare them hazardous side effects.
Keywords: azurin, Pseudomonas aeruginosa, cancer, therapeutics
Procedia PDF Downloads 310
9388 The Synthesis, Structure and Catalytic Activity of Iron(II) Complex with New N2O2 Donor Schiff Base Ligand
Authors: Neslihan Beyazit, Sahin Bayraktar, Cahit Demetgul
Abstract:
Transition metal ions have an important role in biochemistry and biomimetic systems and may provide the basis of models for the active sites of biological targets. The presence of copper(II), iron(II) and zinc(II) is crucial in many biological processes. Tetradentate N2O2 donor Schiff base ligands are well known to form stable transition metal complexes, and these complexes also have applications in clinical and analytical fields. In this study, we present the salient structural features and the details of the catecholase activity of the Fe(II) complex of a new Schiff base ligand. A new asymmetrical N2O2 donor Schiff base ligand and its Fe(II) complex were synthesized by condensation of 4-nitro-1,2-phenylenediamine with 6-formyl-7-hydroxy-5-methoxy-2-methylbenzopyran-4-one and by using an appropriate Fe(II) salt, respectively. The Schiff base ligand and its metal complex were characterized using FT-IR, 1H NMR, 13C NMR, UV-Vis, elemental analysis and magnetic susceptibility. In order to determine the kinetic parameters of the catechol oxidase-like activity of the Schiff base Fe(II) complex, the oxidation of 3,5-di-tert-butylcatechol (3,5-DTBC) was measured at 25°C by monitoring the increase of the absorption band at 390-400 nm of the product 3,5-di-tert-butylquinone (3,5-DTBQ). The compatibility of the catalytic reaction with Michaelis-Menten kinetics was also investigated by the method of initial rates, monitoring the growth of the 390-400 nm band of 3,5-DTBQ as a function of time. Kinetic studies showed that the Fe(II) complex of the new N2O2 donor Schiff base ligand is capable of acting as a model compound for simulating the catecholase properties of type-3 copper proteins.
Keywords: catecholase activity, Michaelis-Menten kinetics, Schiff base, transition metals
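The following sketch shows a Michaelis-Menten fit by the method of initial rates, as described above; the substrate concentrations and rates are invented for illustration, not the measured kinetic data.

```python
# Illustrative sketch of a Michaelis-Menten fit by the method of initial rates
# (the substrate concentrations and rates below are made up, not the measured data).
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    return vmax * s / (km + s)

conc = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0])          # [3,5-DTBC] in mM (hypothetical)
rate = np.array([0.8, 1.4, 2.6, 3.6, 4.4, 5.0]) * 1e-3   # initial rates dA/dt at ~400 nm

(vmax, km), _ = curve_fit(michaelis_menten, conc, rate, p0=[5e-3, 0.5])
print(f"Vmax = {vmax:.2e}, Km = {km:.2f} mM")             # kcat follows from Vmax/[catalyst]
```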
Procedia PDF Downloads 392
9387 Ultra-High Precision Diamond Turning of Infrared Lenses
Authors: Khaled Abou-El-Hossein
Abstract:
The presentation will address the features of two IR convex lenses that have been manufactured using an ultra-high precision machining centre based on single-point diamond turning. The lenses are made from silicon and germanium with a radius of curvature of 500 mm. Because of the brittle nature of silicon and germanium, the machining parameters were selected in such a way that a ductile regime was achieved. The cutting speed was 800 rpm, while the feed rate and depth of cut were 20 mm/min and 20 µm, respectively. Although both materials have a mono-crystalline microstructure and are quite similar in terms of optical properties, the machining of silicon was accompanied by more difficulties in terms of form accuracy compared with germanium machining. The P-V error of the silicon profile was 0.222 µm, while it was only 0.055 µm for the germanium lens. This could be attributed to the accelerated wear that takes place on the tool edge when turning mono-crystalline silicon. Currently, we are using other ranges of the machining parameters in order to determine the optimal range that yields satisfactory performance in terms of form accuracy when fabricating silicon lenses.
Keywords: diamond turning, optical surfaces, precision machining, surface roughness
Procedia PDF Downloads 315
9386 Hybrid Gravity Gradient Inversion-Ant Colony Optimization Algorithm for Motion Planning of Mobile Robots
Authors: Meng Wu
Abstract:
Motion planning is a common task that robots are required to fulfil. A strategy combining Ant Colony Optimization (ACO) and a gravity gradient inversion algorithm is proposed for the motion planning of mobile robots. In this paper, in order to realize an optimal motion planning strategy, the cost function in ACO is designed based on the gravity gradient inversion algorithm. The obstacles around a mobile robot cause gravity gradient anomalies; a gradiometer is installed on the mobile robot to detect these anomalies. After the anomalies are obtained, the gravity gradient inversion algorithm is employed to calculate the relative distance and orientation between the mobile robot and the obstacles. The relative distance and orientation deduced from the gravity gradient inversion algorithm are used in the ACO cost function to realize motion planning. The proposed strategy is validated by simulation and experimental results.
Keywords: motion planning, gravity gradient inversion algorithm, ant colony optimization
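A minimal sketch of how an inversion-derived obstacle distance can enter the ACO transition rule as the heuristic term is given below; the weighting of obstacle and goal distances, the exponents, and the candidate-cell values are assumptions, not the authors' cost function.

```python
# Minimal sketch (assumed form, not the authors' implementation) of how an
# inversion-derived obstacle distance can enter the ACO transition rule as the
# heuristic term eta, alongside the pheromone term tau.
import numpy as np

rng = np.random.default_rng(0)
ALPHA, BETA = 1.0, 2.0                    # pheromone and heuristic exponents

def transition_probs(tau, obstacle_dist, goal_dist):
    """Probability of moving to each neighbouring cell."""
    eta = obstacle_dist / (goal_dist + 1e-6)   # prefer cells far from obstacles, near goal
    weights = (tau ** ALPHA) * (eta ** BETA)
    return weights / weights.sum()

# Hypothetical values for four candidate neighbour cells
tau = np.array([0.5, 0.8, 0.6, 0.4])                 # pheromone levels
obstacle_dist = np.array([2.0, 0.3, 1.5, 2.5])       # distances from gravity gradient inversion
goal_dist = np.array([4.0, 3.0, 3.5, 5.0])           # straight-line distances to the goal

p = transition_probs(tau, obstacle_dist, goal_dist)
next_cell = rng.choice(4, p=p)                       # stochastic cell selection
print(p.round(3), next_cell)
```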
Procedia PDF Downloads 136
9385 Optimizing the Public Policy Information System under the Environment of E-Government
Authors: Qian Zaijian
Abstract:
E-government is one of the hot issues in current academic research on public policy and management. As the organic integration of information and communication technology (ICT) and public administration, e-government is one of the most important areas in the contemporary information society. The policy information system is a basic subsystem of the public policy system; its operation affects the overall effect of the policy process and can even exert a direct impact on the operation of a public policy and its success or failure. The basic principle of its operation is information collection, processing, analysis and release for a specific purpose. The function of e-government for the public policy information system lies in promoting public access to policy information resources, transmitting information through e-participation and e-consultation in the process of policy analysis and information processing, and providing electronic services for stored policy information, so as to promote the optimization of policy information systems. However, due to many factors, the capacity of e-government to promote policy information system optimization has practical limits. In building e-government in China, we should take such paths as adhering to the principle of freedom of information, eliminating the information divide (gap), expanding e-consultation, and breaking down information silos, so as to promote the optimization of public policy information systems.
Keywords: China, e-consultation, e-democracy, e-government, e-participation, ICTs, public policy information systems
Procedia PDF Downloads 862
9384 Multiscale Syntheses of Knee Collateral Ligament Stresses: Aggregate Mechanics as a Function of Molecular Properties
Authors: Raouf Mbarki, Fadi Al Khatib, Malek Adouni
Abstract:
Knee collateral ligaments play a significant role in restraining excessive frontal motion (varus/valgus rotations). In this investigation, a multiscale framework was developed based on the structural hierarchies of the collateral ligaments, starting from the bottom (the tropocollagen molecule) up to the fibre-reinforced structure. Experimental data from failure tensile tests were the principal driver of the developed model. The model was calibrated statistically using Bayesian calibration due to the high number of unknown parameters. The model was then scaled up to fit the real structure of the collateral ligaments and simulated under realistic boundary conditions. Predictions were successful in describing the observed transient response of the collateral ligaments during tensile testing under pre- and post-damage loading conditions. Maximum collateral ligament stresses and strengths were observed near the femoral insertions, a result that is in good agreement with experimental investigations. Also, for the first time, damage initiation and propagation were documented with this model as a function of the cross-link density between tropocollagen molecules.
Keywords: multiscale model, tropocollagen, fibrils, ligaments
Procedia PDF Downloads 157
9383 Transfer Function Model-Based Predictive Control for Nuclear Core Power Control in PUSPATI TRIGA Reactor
Authors: Mohd Sabri Minhat, Nurul Adilla Mohd Subha
Abstract:
The 1 MWth PUSPATI TRIGA Reactor (RTP) at the Malaysian Nuclear Agency has been operating for more than 35 years. The existing core power control uses a conventional controller known as the Feedback Control Algorithm (FCA). It is technically challenging to keep the core power output stable and operating within acceptable error bands to meet the safety demands of the RTP. Currently, the system's power tracking performance could be considered unsatisfactory, yet there is still significant room for improvement. Hence, a new core power control design is very important for improving the current performance in tracking and regulating reactor power by controlling the movement of the control rods to suit the demands of highly sensitive nuclear reactor power control. In this paper, a proposed Model Predictive Control (MPC) law is applied to control the core power. The model for core power control was based on mathematical models of the reactor core, MPC, and a control rod selection algorithm. The mathematical models of the reactor core were based on the point kinetics model, thermal-hydraulic models, and reactivity models. The proposed MPC is presented as a transfer function model of the reactor core according to perturbation theory. The transfer function model-based predictive control (TFMPC) was developed to design the core power control with predictions based on a T-filter, towards the real-time implementation of MPC on hardware. This paper introduces the sensitivity functions for the TFMPC feedback loop to reduce the impact on the input actuation signal and demonstrates the behaviour of TFMPC in terms of disturbance and noise rejection. Comparisons of both the tracking and regulating performance of the conventional controller and TFMPC were made using MATLAB and analysed. In conclusion, the proposed TFMPC has satisfactory performance in tracking and regulating core power for controlling the nuclear reactor with high reliability and safety.
Keywords: core power control, model predictive control, PUSPATI TRIGA reactor, TFMPC
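The sketch below integrates the one-delayed-group point kinetics model mentioned in the abstract, the kind of plant model from which a transfer function for TFMPC can be derived; the kinetic parameters and the step reactivity are illustrative assumptions, not RTP data.

```python
# Minimal sketch of the one-delayed-group point kinetics model mentioned in the
# abstract (the parameter values and step reactivity below are illustrative
# assumptions, not RTP data).
import numpy as np
from scipy.integrate import solve_ivp

BETA, LAMBDA_GEN, LAM = 0.007, 1.0e-4, 0.08   # delayed fraction, generation time (s), decay const (1/s)

def point_kinetics(t, y, rho):
    n, c = y                                   # relative power, precursor concentration
    dn = (rho(t) - BETA) / LAMBDA_GEN * n + LAM * c
    dc = BETA / LAMBDA_GEN * n - LAM * c
    return [dn, dc]

rho_step = lambda t: 0.001 if t > 1.0 else 0.0           # +0.1% reactivity step at t = 1 s
y0 = [1.0, BETA / (LAMBDA_GEN * LAM)]                    # steady-state initial condition
sol = solve_ivp(point_kinetics, (0, 10), y0, args=(rho_step,), max_step=0.01)
print(sol.y[0, -1])   # relative core power after the step; input to the MPC plant model
```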
Procedia PDF Downloads 240
9382 Development and Obtaining of Solid Dispersions to Increase the Solubility of Efavirenz in Anti-HIV Therapy
Authors: Salvana P. M. Costa, Tarcyla A. Gomes, Giovanna C. R. M. Schver, Leslie R. M. Ferraz, Cristovão R. Silva, Magaly A. M. Lyra, Danilo A. F. Fonte, Larissa A. Rolim, Amanda C. Q. M. Vieira, Miracy M. Albuquerque, Pedro J. Rolim-neto
Abstract:
Efavirenz (EFV) is considered one of the most widely used anti-HIV drugs. However, it is classified as a class II drug (poorly soluble, highly permeable) according to the Biopharmaceutical Classification System, presenting absorption problems in the gastrointestinal tract and thereby inadequate bioavailability for its therapeutic action. This study aimed to overcome these barriers by developing and obtaining solid dispersions (SD) in order to increase EFV bioavailability. For the development of SD with EFV, theoretical and practical studies were initially performed. First, a carrier had to be chosen. For this, various criteria were analyzed, such as the glass transition temperature of the polymer, the intra- and intermolecular hydrogen-bond interactions between drug and polymer, and the miscibility between the polymer and EFV. The method for obtaining the SD was chosen by analyzing which method is the most established in both industry and the literature. Subsequently, the drug and carrier concentrations in the dispersions were chosen. In order to obtain SD presenting the drug in its amorphous form, the SD were analyzed by X-ray diffraction (XRD) as they were obtained. SD are more stable the higher the amount of polymer present in the formulation. With this assumption, an SD containing 10% of drug was initially prepared, and this proportion was then increased until the XRD showed the presence of EFV in its crystalline form. From this point onward, no SD with a higher drug concentration was produced. Thus, it was possible to select PVP-K30, PVPVA 64 and the SOLUPLUS formulation as carriers, since hydrogen bonds can form between EFV and these polymers: they have hydrogen-acceptor groups capable of interacting with the hydrogen-donor group of the drug. It is also worth mentioning that the films obtained, regardless of the concentration used, were homogeneous and transparent. Thus, it can be said that EFV is miscible in the three polymers used in the study. The SD and physical mixtures (PM) with these polymers were prepared by the solvent method. The EFV diffraction profile showed a main peak at around 2θ of 6.24°, in addition to other minor peaks at 14.34°, 17.08°, 20.3°, 21.36° and 25.06°, evidencing its crystalline character. Furthermore, the polymers showed an amorphous nature, as evidenced by the absence of peaks in their XRD patterns. The XRD patterns of the PM showed the profile of the drug overlapping that of the polymer, indicating the presence of EFV in its crystalline form. Regardless of the proportion of drug used in the SD, all samples showed the same characteristics, with no EFV diffraction peaks, demonstrating the amorphous behavior of the products. Thus, the polymers effectively enabled the formation of amorphous SD, probably due to the potential hydrogen bonds between them and the drug. Moreover, the XRD analysis showed that the polymers were able to maintain the amorphous form at concentrations of up to 80% drug.
Keywords: amorphous form, Efavirenz, solid dispersions, solubility
Procedia PDF Downloads 568
9381 On the Effect of Carbon on the Efficiency of Titanium as a Hydrogen Storage Material
Authors: Ghazi R. Reda Mahmoud Reda
Abstract:
Among the metals that form hydrides, Mg and Ti are known as the most lightweight materials; however, they are covered with a passive layer of oxides and hydroxides and require an activation treatment at high temperature (> 300 °C) and hydrogen pressure (> 3 MPa) before being used for storage and transport applications. It is well known that a small graphite addition to Ti or Mg leads to a dramatic change in the kinetics of mechanically induced hydrogen sorption (uptake) and significantly stimulates the Ti-hydrogen interaction. Many explanations were given by different authors for the effect of graphite addition on the performance of Ti as a material for hydrogen storage. Not only graphite but also the addition of polycyclic aromatic compounds will improve the hydrogen absorption kinetics. It will be shown that the function of the carbon addition is two-fold. First, carbon acts as a vacuum cleaner that scavenges out all the interstitial oxygen that can poison or slow down hydrogen absorption. It is also important to note that oxygen favors the chemisorption of hydrogen, which is not desirable for hydrogen storage. Second, while scavenging the interstitial oxygen, the carbon reacts with oxygen in the nano- and microchannels through a highly exothermic reaction to produce carbon dioxide and carbon monoxide, which provide the necessary heat for activation; thus, in the presence of carbon, the lower heat of activation for hydrogen absorption that is observed experimentally can be explained. Furthermore, the reaction of hydrogen with the carbon oxides produces water, which, due to ball milling, hydrolyzes to produce the linear H5O2+; this reconstructs the primary structure of the nanocarbon to form a secondary structure in which the primary structures (sheets of carbon) are connected through hydrogen bonding. It is the space between these sheets where physisorption or defect-mediated sorption occurs.
Keywords: metal forming hydrides, polar molecule impurities, titanium, phase diagram, hydrogen absorption
Procedia PDF Downloads 360
9380 On the Grid Technique by Approximating the Derivatives of the Solution of the Dirichlet Problems for (1+1) Dimensional Linear Schrodinger Equation
Authors: Lawrence A. Farinola
Abstract:
Four-point implicit schemes were constructed for approximating the first- and pure second-order derivatives, with respect to the time variable t, of the solution of the Dirichlet problem for the one-dimensional Schrödinger equation. Special four-point implicit difference boundary value problems are also proposed for the first and pure second derivatives of the solution with respect to the spatial variable x. The grid method is applied to the mixed second derivative of the solution of the linear time-dependent Schrödinger equation as well. It is assumed that the initial function belongs to the Hölder space C^(8+α), 0 < α < 1, that the Schrödinger wave function given in the Schrödinger equation is from the Hölder space C_(x,t)^(6+α, 3+α/2), that the boundary functions are from C^(4+α), and that the conjugation conditions of orders q = 0, 1, 2, 3, 4 are satisfied between the initial and the boundary functions. It is proven that the solutions of the proposed difference schemes converge uniformly on the grids at the rate O(h² + k), where h is the step size in x and k is the step size in time. Numerical experiments are presented to support the analysis.
Keywords: approximation of derivatives, finite difference method, Schrödinger equation, uniform error
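For orientation, the sketch below implements a basic implicit finite-difference scheme of accuracy O(h² + k) for the one-dimensional time-dependent Schrödinger equation with Dirichlet boundaries and then approximates a first spatial derivative of the computed solution; it is a generic illustration, not the paper's four-point schemes.

```python
# A basic implicit finite-difference scheme of accuracy O(h^2 + k) for the 1-D
# time-dependent Schrödinger equation i u_t = -u_xx with Dirichlet boundaries.
# This is a generic illustration, not the paper's four-point schemes for the
# derivatives of the solution.
import numpy as np

L, T, nx, nt = 1.0, 0.1, 101, 200
h, k = L / (nx - 1), T / nt
x = np.linspace(0.0, L, nx)
u = np.sin(np.pi * x).astype(complex)          # initial function (u = 0 at both ends)

# (i/k) u^{n+1} + u_xx^{n+1} = (i/k) u^n  ->  A u^{n+1} = (i/k) u^n on interior nodes
main = (1j / k - 2.0 / h**2) * np.ones(nx - 2)
off = (1.0 / h**2) * np.ones(nx - 3)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

for _ in range(nt):
    u[1:-1] = np.linalg.solve(A, (1j / k) * u[1:-1])   # boundary values stay zero

# First spatial derivative approximated by finite differences, consistent with O(h^2)
du_dx = np.gradient(u, h)
print(abs(u[nx // 2]), abs(du_dx[nx // 4]))
```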
Procedia PDF Downloads 119
9379 Evaluation of the Impact of Neuropathic Pain on the Quality of Life of Patients
Authors: A. Ibovi Mouondayi, S. Zaher, R. Assadi, K. Erraoui, S. Sboul, J. Daoudim, S. Bousselham, K. Nassar, S. Janani
Abstract:
Introduction: Neuropathic pain (NP) is a chronic pain that can be observed in a large number of clinical situations. It results from a lesion of the peripheral or central nervous system and is a frequent reason for consultations in rheumatology. Because this pain is chronic, it can become disabling for the patient, thereby altering their quality of life. Objective: The objective of this study was to evaluate the impact of neuropathic pain on the quality of life of patients followed up for chronic neuropathic pain. Material and Method: This is a monocentric, cross-sectional, descriptive, retrospective study conducted in our department over a period of 19 months, from October 2020 to April 2022. Missing parameters were collected through phone calls to the patients concerned. The diagnostic tool adopted was the DN4 questionnaire in its dialectal Arabic version. The impact of NP was assessed by the visual analog scale (VAS) for pain, sleep, and function. The impact of NP on mood was assessed by the Hospital Anxiety and Depression Scale (HAD) score in its validated Arabic version. The exclusion criteria were patients followed up for depression and other psychiatric pathologies. Results: Data from a total of 1528 patients were collected. The average age of the patients was 57 years (standard deviation: 13 years), with extremes ranging from 17 to 94 years; 91% were women and 9% men, a man/woman sex ratio of 0.10. 67% of our patients were married, and 63% were housewives. 43% of patients were followed up for degenerative pathology. The NP was cervical radiculopathy in 26%, lumbosacral radiculopathy in 51%, and carpal tunnel syndrome in 20%. 23% of our patients had poor sleep quality, and 54% had average sleep quality. The pain was very intense in 5% of patients, 33% had severe pain, and 58% had moderate pain. Function was limited in 55% of patients. The average HAD scores for anxiety and depression were 4.39 (standard deviation: 2.77) and 3.21 (standard deviation: 2.89), respectively. Conclusion: Our data clearly illustrate that neuropathic pain has a negative impact on the quality of sleep and function, as well as on the mood of patients, thus influencing their quality of life.
Keywords: neuropathic pain, sleep, quality of life, chronic pain
Procedia PDF Downloads 128
9378 Systematic Identification and Quantification of Substrate Specificity Determinants in Human Protein Kinases
Authors: Manuel A. Alonso-Tarajano, Roberto Mosca, Patrick Aloy
Abstract:
Protein kinases participate in a myriad of cellular processes of major biomedical interest. The in vivo substrate specificity of these enzymes is determined by several factors and, despite several years of research on the topic, is still far from being totally understood. In the present work, we have quantified the contributions to kinase substrate specificity of (i) the phosphorylation sites and their surrounding residues in the sequence and (ii) the association of kinases with adaptor or scaffold proteins. We have used position-specific scoring matrices (PSSMs) to represent the stretches of sequence phosphorylated by 93 families of kinases. We have found negative correlations between the number of sequences from which a PSSM is generated and the statistical significance and performance of that PSSM. Using a subset of 22 statistically significant PSSMs, we have identified specificity determinant residues (SDRs) for 86% of the corresponding kinase families. Our results suggest that different SDRs can function as positive or negative elements of substrate recognition by the different families of kinases. Additionally, we have found that human proteins with a known function as adaptors or scaffolds (kAS) tend to interact with a significantly large fraction of the substrates of the kinases with which they associate. Based on this characteristic, we have identified a set of 279 potential adaptors/scaffolds (pAS) for human kinases, which is enriched in Pfam domains and functional terms tightly related to the proposed function. Moreover, our results show that for 74.6% of the kinase-pAS associations found, the pAS colocalize with the substrates of the kinases they are associated with. Finally, we have found evidence suggesting that the association of kinases with adaptors and scaffolds may contribute significantly to diminishing the in vivo substrate cross-specificity of protein kinases. In general, our results indicate the relevance of several SDRs for both the positive and negative selection of phosphorylation sites by kinase families and also suggest that the association of kinases with pAS proteins may be an important factor for the localization of the enzymes with their set of substrates.
Keywords: kinase, phosphorylation, substrate specificity, adaptors, scaffolds, cellular colocalization
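A small sketch of building a PSSM from aligned phosphosite windows and scoring a candidate site is shown below; the sequences and the uniform background model are assumptions for illustration, not data or settings from the study.

```python
# Illustrative sketch of building a PSSM from aligned phosphosite windows and
# scoring a candidate site (the sequences below are made up, not data from the study).
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"
sites = ["RRASVAG", "KRASLAG", "RRPSVGG", "RKASIAG"]   # hypothetical ±3 windows around S

counts = np.ones((len(AA), len(sites[0])))             # pseudocount of 1 per cell
for seq in sites:
    for pos, aa in enumerate(seq):
        counts[AA.index(aa), pos] += 1
freqs = counts / counts.sum(axis=0)
pssm = np.log2(freqs / (1.0 / len(AA)))                # log-odds against a uniform background

def score(seq):
    return sum(pssm[AA.index(aa), pos] for pos, aa in enumerate(seq))

print(score("RRASVAG"), score("DEGTPLE"))              # consensus-like site scores higher
```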
Procedia PDF Downloads 339
9377 Composing Method of Decision-Making Function for Construction Management Using Active 4D/5D/6D Objects
Authors: Hyeon-Seung Kim, Sang-Mi Park, Sun-Ju Han, Leen-Seok Kang
Abstract:
As the application of BIM (Building Information Modeling) continually expands, the visual simulation techniques used for facility design and construction process information are becoming increasingly advanced and diverse. For building structures, BIM application is design-oriented, utilizing 3D objects for conflict management, whereas for civil engineering structures, the usability of nD object-oriented construction stage simulation is important in construction management. Simulations of 5D and 6D objects, in which cost and resources are linked along with the process simulation of 4D objects, are commonly used, but they do not provide a decision-making function for process management problems that occur on site because they mostly focus on the visual representation of the current status of process information. In this study, an nD CAD system is constructed that facilitates an optimized schedule simulation that minimizes process conflict, a construction duration reduction simulation according to execution progress status, an optimized process plan simulation according to project cost changes by year, and an optimized resource simulation for field resource mobilization capability. Through this system, the usability of conventional simple simulation objects is expanded to that of active simulation objects with which decision-making is possible. Furthermore, to close the gap between field process situations and planned 4D process objects, a technique is developed to facilitate a comparative simulation through the coordinated synchronization of an actual video object acquired by an on-site web camera and a VR-concept 4D object. This synchronization and simulation technique can also be applied to smartphone video objects captured in the field in order to increase the usability of the 4D object. Because yearly project costs change frequently in civil engineering construction, the annual process plan should be recomposed appropriately according to project cost decreases or increases compared with the plan. In the 5D CAD system provided in this study, an active 5D object utilization concept is introduced to perform a simulation in an optimized process planning state by finding a process optimized for the changed project cost without changing the construction duration, through a technique such as a genetic algorithm. Furthermore, in resource management, an active 6D object utilization function is introduced that can analyze and simulate an optimized process plan within the possible scope of moving resources, by considering those resources that can be moved under a given field condition instead of using a simple resource change simulation by schedule. The introduction of an active BIM function is expected to increase the field utilization of conventional nD objects.
Keywords: 4D, 5D, 6D, active BIM
Procedia PDF Downloads 275
9376 Bright, Dark N-Soliton Solution of Fokas-Lenells Equation Using Hirota Bilinearization Method
Authors: Sagardeep Talukdar, Riki Dutta, Gautam Kumar Saharia, Sudipta Nandy
Abstract:
In non-linear optics, the Fokas-Lenells equation (FLE) is a well-known integrable equation that describes how ultrashort pulses move through an optical fiber. Like any other integrable equation, it admits localized wave solutions. We apply the Hirota bilinearization method to obtain soliton solutions of the FLE. The proposed bilinearization makes use of an auxiliary function. We apply the method to the FLE with a vanishing boundary condition, that is, to obtain a bright soliton solution. We have obtained bright 1-soliton and 2-soliton solutions and propose a scheme for obtaining the N-soliton solution. We have used an additional parameter that is responsible for the shift in the position of the soliton. Further analysis of the 2-soliton solution is done by asymptotic analysis. For the non-vanishing boundary condition, we obtain the dark 1-soliton solution. We find that the suggested bilinearization approach, which makes use of the auxiliary function, greatly simplifies the process while still producing the desired outcome. We think that the current analysis will be helpful in understanding how the FLE is used in nonlinear optics and other areas of physics.
Keywords: asymptotic analysis, Fokas-Lenells equation, Hirota bilinearization method, soliton
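For reference, the display below outlines the generic Hirota scheme (dependent-variable ratio, formal ε-expansion, and the bilinear D-operator); it is not the paper's exact bilinear forms or its auxiliary function.

```latex
% Generic outline of the Hirota scheme (not the paper's exact bilinear forms or
% auxiliary function): the dependent variable is written as a ratio of functions,
% which are expanded in a formal parameter \epsilon.
\begin{align}
  u(x,t) &= \frac{g(x,t)}{f(x,t)}, \qquad
  g = \epsilon g_1 + \epsilon^3 g_3 + \cdots, \qquad
  f = 1 + \epsilon^2 f_2 + \epsilon^4 f_4 + \cdots, \\[4pt]
  D_x^m D_t^n (f\cdot g) &= \left(\partial_x-\partial_{x'}\right)^m
  \left(\partial_t-\partial_{t'}\right)^n
  f(x,t)\,g(x',t')\Big|_{x'=x,\; t'=t}.
\end{align}
% For a bright 1-soliton one typically takes g_1 = e^{\eta},
% \eta = k x + \omega t + \eta_0, and truncates the series.
```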
Procedia PDF Downloads 111
9375 Cultural Transformation in Interior Design in Commercial Space in India
Authors: Siddhi Pedamkar, Reenu Singh
Abstract:
This report is based on how culture transforms from one era to another in commercial spaces. This transformation is observed in commercial as well as residential spaces. The spaces have specific colour concepts, surface detailing, furniture, and function-specific layouts. However, the cultural impact is rarely seen in commercial spaces, mostly because the interior is driven by function to a large extent. Information was collected from books and research papers. A quantitative survey was conducted to understand people's perceptions of the impact of culture on design entities and how culture dictates the different types of space and their character. The survey also highlights the impact of types of interior lighting, colour schemes, and furniture types on the interior environment. The questionnaire survey helped in framing design parameters for contemporary interior design. The design parameters are used to propose design options for new-age furniture that can be used in co-working spaces. For new and contemporary working spaces, new-age furniture and interior elements such as visual partitions, semi-visual partitions, lighting, and layout can be transformed by cultural changes in the working style of people and organizations.
Keywords: commercial space, culture, environment, furniture, interior
Procedia PDF Downloads 116
9374 Improving the Genetic Diversity of Soybean Seeds and Tolerance to Drought Irradiated with Gamma Rays
Authors: Aminah Muchdar
Abstract:
To increase the genetic diversity of soybean so that it can adapt to the agroecology of Indonesia, several approaches are used, including introduction, crossing, mutation, and genetic transformation. The purpose of this research is to obtain early-maturing, large-seeded soybean mutant lines that are tolerant to drought and have high yield potential. This study consisted of two stages. The first was a gamma-ray sensitivity test carried out in the BATAN laboratory. The variety used was Anjasmoro. The seeds were irradiated with gamma rays at a source activity of 1046.16976 Ci, with irradiation times of 0-71 minutes and irradiation doses of 0, 100, 200, 300, 400, 500, 600, 700, 800, 900 and 1000 Gy. The results indicated that, of all the seeds irradiated with doses of 0-1000 Gy, only the doses of 200 and 300 Gy gave the best germination percentage, plant height, number of leaves, number of normal sprouts, and green leaves, and these could be carried forward to a second trial in order to obtain the expected mutants. In the second stage, the diversity of the M2 soybean population from gamma irradiation was evaluated: the seeds planted were the first derivative of the irradiated M2 seeds. By 30 days after planting, the plants already showed growth and development that varied compared with the parent, in terms of plant height, number of leaves, leaf shape, and foliage level. In the generative phase, among the plants irradiated with 200 and 300 Gy, some plants formed flowers but did not form pods, while others formed flowers but few pods; the soybean morphological characters recorded included plant height, number of branches, number of pods, days to flowering, days to harvesting, seed weight, and seed number.
Keywords: gamma ray, genetic mutation, irradiation, soybean
Procedia PDF Downloads 399
9373 Revalidation and Harmonization of Existing IFCC Standardized Hepatic, Cardiac, and Thyroid Function Tests by Precision Optimization and External Quality Assurance Programs
Authors: Junaid Mahmood Alam
Abstract:
Revalidating and harmonizing clinical chemistry analytical principles and optimizing methods through quality control programs and assessments is the preeminent means of attaining optimal outcomes within clinical laboratory services. The present study reports the revalidation of our existing IFCC-standardized analytical methods, particularly hepatic and thyroid function tests, by optimizing precision analyses and processing them through external and internal quality assessments and regression determination. Parametric components of hepatic (bilirubin, ALT, γGT, ALP), cardiac (LDH, AST, Trop I) and thyroid/pituitary (T3, T4, TSH, FT3, FT4) function tests were used to validate analytical techniques on automated chemistry and immunological analyzers, namely the Hitachi 912, Cobas 6000 e601, Cobas c501 and Cobas e411, with UV kinetic and colorimetric dry chemistry principles and electrochemiluminescence immunoassay (ECLi) techniques. The process of validation and revalidation was completed by evaluating and assessing the precision-analyzed Preci-control data of the various instruments, plotted against each other with regression analysis (R²). Results showed that revalidation and optimization of the respective parameters, accredited through CAP, CLSI and NEQAPP assessments, depicted 99.0% to 99.8% optimization, in addition to the methodology and instruments used for the analyses. The regression R² for BilT was 0.996, whereas ALT, ALP, γGT, LDH, AST, Trop I, T3, T4, TSH, FT3, and FT4 exhibited R² values of 0.998, 0.997, 0.993, 0.967, 0.970, 0.980, 0.976, 0.996, 0.997, 0.997, and 0.990, respectively. This confirmed the marked harmonization of the analytical methods and instrumentation, thus revalidating optimized precision standardization as per IFCC-recommended guidelines. It is concluded that the practice of revalidating and harmonizing existing or new services should be followed by all clinical laboratories, especially those associated with tertiary care hospitals. This will ensure the delivery of standardized, proficiency-tested, optimized services for prompt and better patient care, which will guarantee maximum patient confidence.
Keywords: revalidation, standardized, IFCC, CAP, harmonized
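A short sketch of the instrument-versus-instrument regression comparison (R²) is given below; the paired ALT quality-control values are invented for illustration, not the study's Preci-control data.

```python
# Illustrative sketch of the regression comparison between two analyzers using
# paired quality-control results (the ALT values below are invented, not the
# study's Preci-control data).
import numpy as np
from scipy.stats import linregress

alt_hitachi_912 = np.array([30.5, 45.2, 61.8, 80.1, 99.7, 120.3, 151.0, 179.6])  # U/L
alt_cobas_c501  = np.array([31.0, 44.8, 62.5, 79.4, 101.2, 119.0, 152.8, 178.1]) # U/L

fit = linregress(alt_hitachi_912, alt_cobas_c501)
print(f"slope={fit.slope:.3f}, intercept={fit.intercept:.2f}, R^2={fit.rvalue**2:.3f}")
# R^2 close to 1 (e.g. > 0.99) indicates the two methods are harmonized for ALT.
```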
Procedia PDF Downloads 268