Search results for: condensation function

4275 Synthesis, Characterization and Catecholase Study of Novel Bidentate Schiff Base Derived from Dehydroacetic Acid

Authors: Salima Tabti, Chaima Maouche, Tinhinene Louaileche, Amel Djedouani, Ismail Warad

Abstract:

Novel Schiff base ligand HL has been synthesized by condensation of an aromatic amine and dehydroacetic acid (DHA). It was characterized by UV-Vis, FT-IR, MS, NMR (1H, 13C) and also by single-crystal X-ray diffraction. The crystal structure shows that the compound crystallizes in a triclinic system, in the P-1 space group, with two units per cell (Z = 2). The asymmetric unit contains one independent molecule; the conformation is determined by an intermolecular N-H…O hydrogen bond with an S(6) ring motif. The molecule has an (E) conformation about the C=N bond. The dihedral angle between the phenyl and pyran ring planes is 89.37 (1)°, so the two planes are approximately perpendicular. The catecholase activity of in situ copper complexes of this ligand has been investigated against catechol. The progress of the oxidation reactions was closely monitored over time by following the strong absorption peak of catechol using UV-Vis spectroscopy. Oxidation rates were determined from the initial slopes of absorbance-time plots and then analyzed with the Michaelis-Menten equation. Catechol oxidation reactions were carried out using different concentrations of copper acetate and ligand (L/Cu: 1/1, 1/2, 2/1). The results show that all complexes were able to catalyze the oxidation of catechol, with the acetate complexes showing the highest activity. Catalysis is a branch of chemical kinetics that, more generally, studies the influence of all physical or chemical factors determining reaction rates. It solves many problems in chemical reaction processes, especially for a greener, more economical and less polluting chemistry. For this reason, the search for new catalysts for known organic reactions occupies a prominent place among the themes pursued by chemists.
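
To make the rate analysis concrete, the minimal sketch below (Python, not part of the paper; all concentrations and rates are illustrative) fits initial rates, such as those taken from the initial slopes of absorbance-time plots, to the Michaelis-Menten equation v = Vmax[S]/(Km + [S]).

```python
# Hedged sketch: fitting Michaelis-Menten kinetics to illustrative initial-rate data.
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    """v = Vmax * [S] / (Km + [S])"""
    return vmax * s / (km + s)

# Illustrative data: catechol concentrations (mM) and initial oxidation rates
# (absorbance units per minute), e.g. taken from the initial slope of each
# absorbance-time curve.
substrate = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
rates = np.array([0.021, 0.037, 0.058, 0.082, 0.101, 0.112])

(vmax, km), _ = curve_fit(michaelis_menten, substrate, rates, p0=[0.1, 2.0])
print(f"Vmax = {vmax:.3f} a.u./min, Km = {km:.2f} mM")
```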

Keywords: dehydroacetic acid, catechol, copper, catecholase activity, x-ray

Procedia PDF Downloads 86
4274 Health Assessment and Disorders of External Respiration Function among Physicians

Authors: A. G. Margaryan

Abstract:

Aims and Objectives: Assessment of health status and detection of disorders of external respiration function (ERF) during preventive medical examinations among physicians in Armenia. Subjects and Methods: Overall, fifty-nine physicians (17 men and 42 women) were examined and spirometry was carried out. The average age of the physicians was 50 years. The studies were conducted on a Micromedical MicroLab 3500 spirometer. Results: Of the 59 examined physicians, 25.4% were overweight and 22.0% suffered from obesity. Two physicians were current smokers. About half of the examined physicians (50.8%) had been diagnosed with some disease or had other health-related problems at the time of examination (excluding problems related to vision and hearing). Mean FVC was 2.94±0.1, FEV1 2.64±0.1, PEF 329.7±19.9, and FEV1/FVC 89.7±1.3%. Pathological changes of ERF were identified in 23 (39.0%) cases: 28.8% of physicians had first-degree restrictive disorders, 3.4% first-degree combined obstructive/restrictive disorders, and 6.8% second-degree combined obstructive/restrictive disorders. Only three physicians with ERF disorders had been diagnosed with chronic bronchitis or bronchial asthma. There were no statistically significant changes in ERF depending on the severity of obesity (P > 0.05). Conclusion: The study showed the prevalence of ERF disorders among physicians, observing mainly mild and moderate changes in ERF parameters.

Keywords: Armenia, external respiration function, health status, physicians

Procedia PDF Downloads 184
4273 Bayes Estimation of Parameters of Binomial Type Rayleigh Class Software Reliability Growth Model using Non-informative Priors

Authors: Rajesh Singh, Kailash Kale

Abstract:

In this paper, a Binomial-process-type occurrence of software failures is considered, and the failure intensity is characterized by a one-parameter Rayleigh class Software Reliability Growth Model (SRGM). The proposed SRGM is a mathematical function of two parameters, namely the total number of failures η₀ and the scale parameter η₁. It is assumed that very little or no information is available about these parameters; considering non-informative priors for both, the Bayes estimators of η₀ and η₁ are obtained under the squared error loss function. The proposed Bayes estimators are compared with the corresponding maximum likelihood estimators on the basis of risk efficiencies obtained by the Monte Carlo simulation technique. It is concluded that both proposed Bayes estimators, of the total number of failures and of the scale parameter, perform well for a proper choice of execution time.
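
As a simplified, self-contained illustration of the comparison described above (not the paper's SRGM), the sketch below estimates, by Monte Carlo simulation, the risks of the maximum likelihood estimator and of a Bayes estimator under a non-informative (Jeffreys-type) prior and squared error loss for the scale-type parameter of a plain Rayleigh model; the sample size, true parameter and prior are assumptions made for the example.

```python
# Hedged sketch: Monte Carlo comparison of the MLE and a Bayes estimator (Jeffreys-type
# non-informative prior, squared error loss) for theta = sigma^2 of a plain Rayleigh
# model. This is a simplified stand-in for the paper's SRGM setting.
import numpy as np

rng = np.random.default_rng(0)
theta_true = 4.0          # true sigma^2 (illustrative)
n, n_rep = 30, 20000      # sample size and Monte Carlo replications

mle = np.empty(n_rep)
bayes = np.empty(n_rep)
for r in range(n_rep):
    x = rng.rayleigh(scale=np.sqrt(theta_true), size=n)
    s = np.sum(x**2) / 2.0
    mle[r] = s / n          # maximum likelihood estimator of theta
    bayes[r] = s / (n - 1)  # posterior mean: Inverse-Gamma(n, s) under prior 1/theta

risk_mle = np.mean((mle - theta_true) ** 2)
risk_bayes = np.mean((bayes - theta_true) ** 2)
print(f"risk(MLE) = {risk_mle:.4f}, risk(Bayes) = {risk_bayes:.4f}, "
      f"risk efficiency = {risk_mle / risk_bayes:.3f}")
```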

Keywords: binomial process, non-informative prior, maximum likelihood estimator (MLE), rayleigh class, software reliability growth model (SRGM)

Procedia PDF Downloads 376
4272 A Game-Based Methodology to Discriminate Executive Function – A Pilot Study with Institutionalized Elderly People

Authors: Marlene Rosa, Susana Lopes

Abstract:

Few studies have explored the potential of board games as a performance measure, although they can be an interesting strategy in the context of frail populations. In fact, board games are immersive strategies that can reduce the pressure of being evaluated. This study aimed to test the ability of game-based strategies to assess executive function in an elderly population. Sixteen older participants were included: 10 with affected executive functions (G1 – 85.30±6.00 years old; 10 male) and 6 with executive functions showing no clinically important impairment (G2 – 76.30±5.19 years old; 6 male). Executive function was assessed using the Frontal Assessment Battery (FAB), a quickly applicable cognitive screening test (score < 12 indicates impairment). The board game used in this study was the TATI Hand Game, designed specifically for training rhythmic coordination of the upper limbs with multiple cognitive stimuli. This game features 1 table grid, 1 set of Single Game cards (played with one hand), Double Game cards (played simultaneously with two hands), 1 die to plan the Single Game mode, cards to plan the Double Game mode, 1 bell, and 2 cups. Each participant played 3 Single Game cards, and the following data were collected: (i) variability in execution time across board game challenges (SD); (ii) number of errors; (iii) execution time (s). G1 demonstrated higher variability in execution time across board game challenges (G1 – 13.0 s vs G2 – 0.5 s), a higher number of errors (1.40 vs 0.67), and longer execution times (607.80 s vs 281.83 s). These results demonstrate the potential of implementing board games as a functional assessment strategy in geriatric care. Future studies might include larger samples and statistical methodologies to find cut-off values for impairment in executive functions during performance in the TATI game.

Keywords: board game, aging, executive function, evaluation

Procedia PDF Downloads 127
4271 Topology Optimization of the Interior Structures of Beams under Various Load and Support Conditions with Solid Isotropic Material with Penalization Method

Authors: Omer Oral, Y. Emre Yilmaz

Abstract:

Topology optimization is an approach that optimizes the material distribution within a given design space for specified loads and boundary conditions, in order to meet prescribed performance goals. It uses various restrictions, such as boundary conditions, sets of loads, and constraints, to maximize the performance of the system. It differs from size and shape optimization methods but retains some features of both. In this study, the interior structures of parts were optimized using the SIMP (Solid Isotropic Material with Penalization) method. The volume of the part was a preassigned parameter, and minimum deflection was the objective function. The basic idea behind the theory was reviewed, and different methods were discussed. The Rhinoceros 3D design tool was used with the Grasshopper and TopOpt plugins to create and optimize parts. A Grasshopper algorithm was designed and tested for different beams, sets of arbitrarily located forces, and support types such as pinned and fixed. Finally, 2.5D shapes were obtained and verified by observing the changes in the density function.
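
The heart of the SIMP method is its power-law interpolation between void and solid material; the short sketch below (Python, with illustrative values rather than the Grasshopper/TopOpt setup used in the study) shows how element stiffness is penalized as a function of the density design variable.

```python
# Hedged sketch of the SIMP material interpolation: element stiffness is scaled by the
# design density raised to a penalization power p, which pushes intermediate densities
# toward 0 or 1. Values are illustrative, not taken from the study.
import numpy as np

def simp_young_modulus(rho, E0=1.0, Emin=1e-9, p=3.0):
    """E(rho) = Emin + rho^p * (E0 - Emin), with rho in [0, 1]."""
    rho = np.clip(rho, 0.0, 1.0)
    return Emin + rho**p * (E0 - Emin)

for rho in np.linspace(0.0, 1.0, 6):
    print(f"rho = {rho:.1f} -> E = {simp_young_modulus(rho):.4f}")
# With p = 3, a half-dense element (rho = 0.5) contributes only ~12.5% of the full
# stiffness, which is why the optimizer is driven toward near solid/void designs.
```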

Keywords: Grasshopper, lattice structure, microstructures, Rhinoceros, solid isotropic material with penalization method, TopOpt, topology optimization

Procedia PDF Downloads 114
4270 Investigation into the Optimum Hydraulic Loading Rate for Selected Filter Media Packed in a Continuous Upflow Filter

Authors: A. Alzeyadi, E. Loffill, R. Alkhaddar

Abstract:

Continuous upflow filters can combine nutrient (nitrogen and phosphate) and suspended solids removal in one unit process. The contaminant removal can be achieved chemically or biologically; in both processes the filter removal efficiency depends on the interaction between the packed filter media and the influent. In this paper, a residence time distribution (RTD) study was carried out to understand and compare the transfer behaviour of contaminants through selected filter media packed in a laboratory-scale continuous upflow filter; the selected filter media are limestone and white dolomite. The experimental work was conducted by injecting a tracer (red drain dye, RDD) into the filtration system and then measuring the tracer concentration at the outflow as a function of time; the tracer injection was applied at hydraulic loading rates (HLRs) of 3.8 to 15.2 m h⁻¹. The results were analysed using the cumulative distribution function F(t) to estimate the residence time of the tracer molecules inside the filter media. The mean residence time (MRT) and variance σ², two moments of the RTD, were calculated to compare the RTD characteristics of limestone with those of white dolomite. The results showed that the exit-age distribution of the tracer was most favourable at HLRs of 3.8 to 7.6 m h⁻¹ for limestone and at 3.8 m h⁻¹ for white dolomite. At these HLRs, the cumulative distribution function F(t) revealed that the residence time of the tracer inside the limestone was longer than in the white dolomite: all of the tracer took 8 minutes to leave the white dolomite at 3.8 m h⁻¹, whereas the same amount of tracer took 10 minutes to leave the limestone at the same HLR. In conclusion, determining the optimal hydraulic loading rate, which achieves the best influent distribution over the filtration system, helps to identify the applicability of a material as filter media. Further work will examine the efficiency of limestone and white dolomite for phosphate removal by pumping a phosphate solution into the filter at HLRs of 3.8 to 7.6 m h⁻¹.
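
For reference, the two RTD moments mentioned above follow directly from the measured tracer curve; a minimal sketch with made-up outlet concentrations (not the RDD measurements) is given below.

```python
# Hedged sketch: mean residence time (MRT) and variance of the RTD computed from a
# tracer concentration curve C(t) measured at the filter outlet. The concentrations
# below are made up for illustration, not the experimental data from the study.
import numpy as np

t = np.arange(0.0, 11.0)                                              # time, min
c = np.array([0, 0.1, 0.6, 1.4, 1.8, 1.5, 1.0, 0.6, 0.3, 0.1, 0.0])  # tracer conc.
dt = 1.0                                                              # sampling step, min

e = c / np.sum(c * dt)                   # exit-age distribution E(t)
f = np.cumsum(e * dt)                    # cumulative distribution F(t)
mrt = np.sum(t * e * dt)                 # first moment: mean residence time
var = np.sum((t - mrt) ** 2 * e * dt)    # second central moment: variance

print(f"MRT = {mrt:.2f} min, variance = {var:.2f} min^2")
print("F(t) =", np.round(f, 3))
```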

Keywords: filter media, hydraulic loading rate, residence time distribution, tracer

Procedia PDF Downloads 262
4269 Oat Beta Glucan Attenuates the Development of Atherosclerosis and Improves the Intestinal Barrier Function by Reducing Bacterial Endotoxin Translocation in ApoE-/- Mice

Authors: Dalal Alghawas, Jetty Lee, Kaisa Poutanen, Hani El-Nezami

Abstract:

Oat β-glucan, a water-soluble non-starch linear polysaccharide, has been approved as a cholesterol-lowering agent by various food safety administrations and is commonly used to reduce the risk of heart disease. The molecular weight of oat β-glucan can vary depending on the extraction and fractionation methods. It is not clear whether the molecular weight has a significant impact on reducing the acceleration of atherosclerosis. The aim of this study was to investigate three different oat β-glucan fractions on the development of atherosclerosis in vivo, with special focus on plaque stability and intestinal barrier function. To test this, ApoE-/- female mice were fed a high-fat diet supplemented with oat bran, a high molecular weight (HMW) oat β-glucan fraction, or a low molecular weight (LMW) oat β-glucan fraction for 16 weeks. Atherosclerosis risk markers were measured in the plasma, heart, and aortic tree. Plaque size was measured in the aortic root and aortic tree. ICAM-1, VCAM-1, E-selectin, and P-selectin protein levels were assessed in the aortic tree to determine plaque stability at 16 weeks. The expression of p22phox at the aortic root was evaluated to study the NADPH oxidase complex involved in nitric oxide bioavailability and vascular elasticity. The tight junction proteins E-cadherin and beta-catenin were analysed by western blot as a test of intestinal barrier function. Plasma LPS, intestinal D-lactate levels, and hepatic FMO gene expression were measured to confirm whether a compromised intestinal barrier led to endotoxemia. The oat bran and HMW oat β-glucan diet groups were more effective than the LMW β-glucan diet group at reducing plaque size and showed marked improvements in plaque stability. The intestinal barrier was compromised in all experimental groups; however, endotoxemia levels were higher in the LMW β-glucan diet group. The oat bran and HMW oat β-glucan diet groups were more effective at attenuating the development of atherosclerosis. A possible reason is the low viscosity of the LMW oat β-glucan in the gut and its inability to block the reabsorption of cholesterol; the low viscosity may also allow more bacterial endotoxin translocation through the impaired intestinal barrier. In the future, food technologists should carefully consider how to incorporate LMW oat β-glucan as a health-promoting food ingredient.

Keywords: atherosclerosis, beta glucan, endotoxemia, intestinal barrier function

Procedia PDF Downloads 399
4268 Executive Deficits in Non-Clinical Hoarders

Authors: Thomas Heffernan, Nick Neave, Colin Hamilton, Gill Case

Abstract:

Hoarding is the acquisition of and failure to discard possessions, leading to excessive clutter and significant psychological/emotional distress. From a cognitive-behavioural approach, excessive hoarding arises from information-processing deficits, as well as from problems with emotional attachment to possessions and beliefs about the nature of possessions. In terms of information processing, hoarders have shown deficits in executive functions, including working memory, planning, inhibitory control, and cognitive flexibility. However, previous research is often confounded by co-morbid factors such as anxiety, depression, or obsessive-compulsive disorder. The current study adopted a cognitive-behavioural approach, specifically assessing executive deficits and working memory in a non-clinical sample of hoarders compared with non-hoarders. In this study, a non-clinical sample of 40 hoarders and 73 non-hoarders (classified using the Saving Inventory-Revised) completed the Adult Executive Functioning Inventory, which measures working memory and inhibition; the Dysexecutive Questionnaire-Revised, which measures general executive function; and the Hospital Anxiety and Depression Scale, which measures mood. The participant sample was made up of unpaid young adult volunteers, all undergraduate students, who completed the questionnaires on a university campus. The results revealed that, after observing no differences between hoarders and non-hoarders in age, sex, and mood, hoarders reported significantly more deficits in inhibitory control and general executive function than non-hoarders. There was no between-group difference in general working memory. This suggests that non-clinical hoarders have a specific difficulty with inhibitory control, the function that enables one to resist repeated, unwanted urges. This might explain the hoarder's inability to resist urges to buy and keep items that are no longer of any practical use. These deficits may be underpinned by general executive function deficiencies.

Keywords: hoarding, memory, executive, deficits

Procedia PDF Downloads 175
4267 Designing Functional Knitted and Woven Upholstery Textiles with SCOBY Film

Authors: Manar Y. Abd El-Aziz, Alyaa E. Morgham, Amira A. El-Fallal, Heba Tolla E. Abo El Naga

Abstract:

Different textile materials are usually used in upholstery. However, upholstered parts may become unhealthy when dust accumulates and bacteria grow on the surface, which negatively affects the user's health. Leather and artificial leather have also been used in upholstery, but leather has a high cost and artificial leather poses a potential chemical risk to users. Researchers have advanced vegan leather made from bacterial cellulose produced by a symbiotic culture of bacteria and yeast (SCOBY); the SCOBY forms a gelatinous cellulose biofilm floating at the air-liquid interface of the culture container. This leather, however, still needs some enhancement of its mechanical properties. This study aimed to prepare SCOBY, produce bamboo rib-knitted fabrics with two different stitch densities and a cotton woven fabric, and then laminate these fabrics with the prepared SCOBY film to enhance the mechanical properties of the SCOBY leather while adding an anti-microbial function to the prepared fabrics. Laboratory tests were conducted on the produced samples, including functional properties (anti-microbial activity, thermal conductivity, and light transparency), physical properties (thickness and mass per unit area), and mechanical properties (elongation, tensile strength, Young's modulus, and peel force). The results showed that the type of fabric significantly affected the SCOBY properties. According to the test results, the bamboo knitted fabric with the higher stitch density laminated with SCOBY was chosen, for its tensile strength and elongation, as the upholstery of a bed model with anti-microbial properties and comfort in the headrest design. Also, a single layer of SCOBY was chosen, on account of its light transparency and lower thermal conductivity, for a lighting unit built into the bed headboard.

Keywords: anti-microbial, bamboo, rib, SCOBY, upholstery

Procedia PDF Downloads 49
4266 An Evolutionary Approach for Automated Optimization and Design of Vivaldi Antennas

Authors: Sahithi Yarlagadda

Abstract:

The design of an antenna is constrained by mathematical and geometrical parameters. Though there are diverse antenna structures with a wide range of feeds, there are many geometries to be tried that cannot be accommodated by predefined computational methods. Antenna design and optimization are well suited to an evolutionary algorithmic approach, since the antenna parameters depend directly on geometric characteristics. The evolutionary algorithm can be explained simply for a given quality function to be maximized: we randomly create a set of candidate solutions, elements of the function's domain, and apply the quality function as an abstract fitness measure. Based on this fitness, some of the better candidates are chosen to seed the next generation by applying recombination and mutation to them. In the conventional approach, the quality function is unaltered across iterations. But the antenna parameters and geometries are too varied to fit into a single function. So, weight coefficients are obtained for all possible antenna electrical parameters and geometries, and the variation is learnt by mining the data obtained, yielding an optimized algorithm. The weight and covariant coefficients of the corresponding parameters are logged for learning and future use as datasets. This paper drafts an approach to obtain the requirements to study and methodize the evolutionary approach to automated antenna design, using our past work on the Vivaldi antenna as a test candidate. Antenna parameters such as gain and directivity are directly governed by geometries, materials, and dimensions. The design equations are noted and evaluated for all possible conditions to obtain maxima and minima for a given frequency band. The boundary conditions are thus obtained prior to implementation, easing the optimization. The implementation mainly aimed to study the practical computational, processing, and design complexities incurred during simulations. HFSS is chosen for simulations and results. MATLAB is used to generate the computations and combinations and to log the data. MATLAB is also used to apply machine learning algorithms and to plot the data to design the algorithm. The number of combinations is too large to be tested manually, so the HFSS API is used to call HFSS functions from MATLAB itself. The MATLAB parallel processing toolbox is used to run multiple simulations in parallel. The aim is to develop an add-in to antenna design software such as HFSS or CST, or a standalone application, to optimize pre-identified common parameters of a wide range of available antennas. In this paper, we have used MATLAB to calculate Vivaldi antenna parameters such as the slot line characteristic impedance, stripline impedance, slot line width, flare aperture size, and dielectric; K-means and a Hamming window are applied to obtain the best test parameters. The HFSS API is used to calculate the radiation, bandwidth, directivity, and efficiency, and the data are logged for applying the evolutionary genetic algorithm in MATLAB. The paper demonstrates the computational weights and the machine learning approach for automated antenna optimization for the Vivaldi antenna.
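
To ground the evolutionary loop sketched above (random candidates, fitness evaluation, selection, recombination, mutation), a minimal genetic algorithm on a toy quality function follows; it is a sketch only, not the HFSS/MATLAB pipeline, and the fitness function is a placeholder for the simulated antenna figure of merit.

```python
# Hedged sketch of a minimal genetic algorithm: random candidates are scored by a
# quality (fitness) function, and the better ones seed the next generation through
# recombination (crossover) and mutation. The fitness here is a toy placeholder.
import numpy as np

rng = np.random.default_rng(1)

def fitness(x):
    # Toy quality function to maximize; in the paper's setting this would be a
    # figure of merit returned by the EM simulation (e.g. gain or bandwidth).
    return -np.sum((x - 0.7) ** 2, axis=1)

pop_size, n_genes, n_gen, mut_sigma = 40, 5, 60, 0.1
pop = rng.random((pop_size, n_genes))          # candidates in [0, 1]^n

for gen in range(n_gen):
    fit = fitness(pop)
    parents = pop[np.argsort(fit)[::-1][: pop_size // 2]]   # truncation selection
    # Recombination: uniform crossover between random parent pairs.
    idx_a = rng.integers(0, len(parents), pop_size)
    idx_b = rng.integers(0, len(parents), pop_size)
    mask = rng.random((pop_size, n_genes)) < 0.5
    children = np.where(mask, parents[idx_a], parents[idx_b])
    # Mutation: small Gaussian perturbation, clipped to the search box.
    children += rng.normal(0.0, mut_sigma, children.shape)
    pop = np.clip(children, 0.0, 1.0)

best = pop[np.argmax(fitness(pop))]
print("best candidate:", np.round(best, 3), "fitness:", fitness(best[None, :])[0])
```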

Keywords: machine learning, Vivaldi, evolutionary algorithm, genetic algorithm

Procedia PDF Downloads 94
4265 Role of Zinc Administration in Improvement of Faltering Growth in Egyptian Children at Risk of Environmental Enteric Dysfunction

Authors: Ghada Mahmoud El Kassas, Maged Atta El Wakeel

Abstract:

Background: Environmental enteric dysfunction (EED) is an emerging problem that has flared up in recent decades and become pervasive in infants and children. EED is an asymptomatic villous atrophy of the small bowel that is prevalent in the developing world and is associated with altered intestinal function and integrity. Evidence has suggested that supplementary zinc might ameliorate this damage by reducing gastrointestinal inflammation and may also benefit cognitive development. Objective: We tested whether zinc supplementation improves intestinal integrity, growth, and cognitive function in stunted children predicted to have EED. Methodology: This case-control prospective interventional study was conducted on 120 Egyptian stunted children aged 1-10 years, recruited from the nutrition clinic of the National Research Centre, and 100 age- and gender-matched healthy children as controls. In the primary phase of the study, full history taking, clinical examination, and anthropometric measurements were performed, and standard deviation scores (SDS) for all measurements were calculated. Serum markers such as zonulin, endotoxin core antibody (EndoCab), highly sensitive C-reactive protein (hsCRP), alpha-1-acid glycoprotein (AGP), and tumor necrosis factor (TNF), and fecal markers such as myeloperoxidase (MPO), neopterin (NEO), and alpha-1-antitrypsin (AAT) (as predictors of EED), were measured. Cognitive development was assessed (Bayley or Wechsler scores). Oral zinc at a dose of 20 mg/day was given to all cases, who were followed up for 6 months, after which the second phase of the study repeated the clinical, laboratory, and cognitive assessments. Results: Serum and fecal inflammatory markers were significantly higher in cases than in controls. Zonulin (P < 0.01), EndoCab (P < 0.001), and AGP (P < 0.03) decreased markedly in cases by the end of the second phase. MPO, NEO, and AAT also showed a significant decline in cases at the end of the study (P < 0.001 for all). A significant increase in mid-upper arm circumference (MUAC) (P < 0.01), weight-for-age z-score, and skinfold thicknesses (P < 0.05 for both) was detected at the end of the study, while height was not significantly affected. Cases also showed significant improvement in cognitive function in phase 2 of the study. Conclusion: The intestinal inflammatory state related to EED showed marked recovery after zinc supplementation. As a result, anthropometric and cognitive parameters showed obvious improvement with zinc supplementation.

Keywords: stunting, cognitive function, environmental enteric dysfunction, zinc

Procedia PDF Downloads 175
4264 Shuffled Structure for 4.225 GHz Antireflective Plates: A Proposal Proven by Numerical Simulation

Authors: Shin-Ku Lee, Ming-Tsu Ho

Abstract:

A newly proposed antireflective selector with a shuffled structure is reported in this paper. The proposed design is made of two different quarter-wavelength (QW) slabs and is numerically shown, by one-dimensional simulation results obtained with the method of characteristics (MOC), to function as an antireflective selector. The two QW slabs are characterized by dielectric constants εᵣA and εᵣB and are uniformly divided into N and N+1 pieces, respectively, which are then shuffled to form an antireflective plate with a B(AB)ᴺ structure such that there is always one εᵣA piece between two εᵣB pieces. The other arrangement is an A(BA)ᴺ structure, in which every εᵣB piece is sandwiched between two εᵣA pieces. Both proposed structures are numerically shown to function as QW plates. In order to allow maximum transmission through the proposed structures, the two dielectric constants are chosen to satisfy the relation (εᵣA)² = εᵣB > 1. The advantages of the proposed structures over traditional anti-reflection coating techniques are that only two materials with two thicknesses are required and that they can be shuffled to form new QW structures. The design wavelength used to validate the proposed idea is 71 mm, corresponding to a frequency of about 4.225 GHz. The computational results are shown in both the time and frequency domains, revealing that the proposed structures produce minimum reflection around the frequency of interest.
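
To make the geometry concrete, the short sketch below computes the quarter-wave slab thicknesses and assembles the shuffled B(AB)ᴺ piece sequence; εᵣA = 2 and εᵣB = 4 are assumed example values satisfying (εᵣA)² = εᵣB, and N = 4 is arbitrary, so the numbers are illustrative rather than taken from the paper.

```python
# Hedged sketch: piece thicknesses and layer sequence for the shuffled B(AB)^N plate.
# eps_A and eps_B are assumed example values with (eps_A)^2 = eps_B > 1; the design
# wavelength of 71 mm follows the abstract.
c0 = 299_792_458.0          # speed of light, m/s
lam0 = 0.071                # design wavelength, m
print(f"design frequency ~ {c0 / lam0 / 1e9:.3f} GHz")

eps_A, eps_B, N = 2.0, 4.0, 4
d_A = lam0 / (4 * eps_A**0.5)   # total thickness of quarter-wave slab A
d_B = lam0 / (4 * eps_B**0.5)   # total thickness of quarter-wave slab B

# Slab A is cut into N pieces and slab B into N+1 pieces, then interleaved as B(AB)^N.
pieces = [("B", d_B / (N + 1))] + [("A", d_A / N), ("B", d_B / (N + 1))] * N
for name, d in pieces:
    print(f"{name}: {d * 1e3:.2f} mm")
print(f"total thickness = {sum(d for _, d in pieces) * 1e3:.2f} mm")
```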

Keywords: method of characteristics, quarter wavelength, anti-reflective plate, propagation of electromagnetic fields

Procedia PDF Downloads 132
4263 The Co-Simulation Interface SystemC/MATLAB Applied to JPEG and SDR Applications

Authors: Walid Hassairi, Moncef Bousselmi, Mohamed Abid

Abstract:

Functional verification is a major part of today's system design task. Several approaches are available for verification at a high abstraction level, where designs are often modeled using MATLAB/Simulink. However, the diversity of approaches is a barrier to a unified verification flow. In this paper, we propose a co-simulation interface between SystemC and MATLAB/Simulink to enable functional verification of designs across multiple abstraction levels. The resulting verification flow is tested on a JPEG compression algorithm. The required synchronization of both simulation environments, as well as data type conversion, is solved using the proposed co-simulation flow. We divided the JPEG encoder into two parts: the first, the DCT, is implemented in SystemC and represents the hardware (HW) part; the second, consisting of quantization and entropy encoding, is implemented in MATLAB and represents the software (SW) part. For communication and synchronization between these two parts, we use an S-Function and the MATLAB engine in Simulink. With this research premise, this study introduces a new hardware SystemC implementation of the DCT. We compare the results of our co-simulation with a pure SW/SW implementation and observe a reduction in simulation time of 88.15% for JPEG, while the design efficiency of the resulting design is 90% in the SDR application.
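
To make the HW/SW split concrete, the sketch below (a Python stand-in, not the SystemC or MATLAB code from the paper) performs the 8×8 two-dimensional DCT that plays the hardware role and then the quantization step that opens the software side; the input block is random and the use of the standard JPEG luminance table is an illustrative choice.

```python
# Hedged sketch of the JPEG encoder split described above: the 2-D DCT of an 8x8 block
# (the part implemented in SystemC/hardware) followed by quantization (the first
# software-side step before entropy coding). Input data are illustrative.
import numpy as np
from scipy.fft import dctn

# Standard JPEG luminance quantization table.
Q_LUMA = np.array([
    [16, 11, 10, 16, 24, 40, 51, 61],
    [12, 12, 14, 19, 26, 58, 60, 55],
    [14, 13, 16, 24, 40, 57, 69, 56],
    [14, 17, 22, 29, 51, 87, 80, 62],
    [18, 22, 37, 56, 68, 109, 103, 77],
    [24, 35, 55, 64, 81, 104, 113, 92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103, 99]])

block = np.random.default_rng(0).integers(0, 256, size=(8, 8)).astype(np.int16)

# "HW" stage: level shift and type-II 2-D DCT (orthonormal), as a DCT core would compute.
coeffs = dctn(block - 128, type=2, norm="ortho")

# "SW" stage: quantization; entropy (Huffman) coding would follow in the MATLAB part.
quantized = np.round(coeffs / Q_LUMA).astype(np.int32)
print(quantized)
```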

Keywords: hardware/software co-design, co-simulation, SystemC, MATLAB, S-function, communication, synchronization

Procedia PDF Downloads 381
4262 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions

Authors: Valerii Dashuk

Abstract:

The use of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with the help of this technique allows checking simultaneously the shift and the probability of that shift (i.e., portfolio risks). Another application concerns the normal distribution, which is fully defined by its mean and variance and can therefore be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters to be compared (to decide whether they may be considered identical at a given significance level). Then the absolute difference in probabilities at each 'point' of the domain of these distributions is calculated. This measure is transformed into a function of cumulative distribution functions and compared with critical values. The table of critical values was designed from simulations. The approach was compared with other techniques for the univariate case; it differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are further strengths of the method. The main advantage of this approach is the possibility of extending it to the infinite-dimensional case, which was not possible in most previous works. At the moment, the extension to the two-dimensional case has been completed, allowing up to five parameters to be tested jointly. The derived technique is therefore equivalent to classic tests in standard situations but provides more efficient alternatives in non-standard problems and on large amounts of data.
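
The sketch below illustrates the kind of measure described above for two univariate normal distributions: the total absolute difference between the densities, computed both by direct numerical integration and, equivalently, through the cumulative distribution functions evaluated between the sign changes of the density difference. The parameter values are illustrative, and the paper's exact statistic and critical-value table are not reproduced.

```python
# Hedged sketch: total absolute difference between two normal densities, computed on a
# grid and re-expressed through CDFs at the sign changes of f1 - f2 (between consecutive
# sign changes the integrand has constant sign). Parameters are illustrative.
import numpy as np
from scipy.stats import norm

mu1, s1 = 0.0, 1.0
mu2, s2 = 0.4, 1.3

x = np.linspace(-10, 10, 200001)
diff = norm.pdf(x, mu1, s1) - norm.pdf(x, mu2, s2)

# Direct numerical integration of |f1 - f2|.
measure_grid = np.sum(np.abs(diff)) * (x[1] - x[0])

# Equivalent CDF form: each piece equals |(F1 - F2)(b) - (F1 - F2)(a)|.
sign_change = np.where(np.diff(np.sign(diff)) != 0)[0]
cuts = np.concatenate(([-np.inf], x[sign_change], [np.inf]))
gap = norm.cdf(cuts, mu1, s1) - norm.cdf(cuts, mu2, s2)
measure_cdf = np.sum(np.abs(np.diff(gap)))

print(f"grid integration: {measure_grid:.5f}, CDF form: {measure_cdf:.5f}")
```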

Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function

Procedia PDF Downloads 161
4261 Improvement of Process Competitiveness Using Intelligent Reference Models

Authors: Julio Macedo

Abstract:

Several methodologies are now available for conceiving the improvements of a process so that it becomes competitive, for example total quality, process reengineering, Six Sigma, and the define-measure-analyze-improve-control (DMAIC) method. These improvements are of different natures and can be external to the process, which is represented by an optimization model or a discrete simulation model. In addition, the process stakeholders are several and have different desired performances for the process. Hence, the methodologies above lack a tool to aid in the conception of the required improvements. In order to fill this void, we suggest the use of intelligent reference models. A reference model is a set of qualitative differential equations and an objective function that minimizes the gap between the current and the desired performance indexes of the process. The reference models are intelligent in that, when they receive the current state of the problematic process and the desired performance indexes, they generate the required improvements for the problematic process. The reference models are fuzzy cognitive maps augmented with an objective function and trained using the improvements implemented by high-performance firms. Experiments with a group of students show that the reference models allow them to conceive more improvements than students who do not use these models.
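
As a concrete (and deliberately tiny) illustration of the fuzzy-cognitive-map machinery behind such reference models, the sketch below iterates concept activations through a weight matrix with a sigmoid squashing function until they settle; the concepts, weights and initial state are illustrative assumptions, not a trained reference model from the paper.

```python
# Hedged sketch of a fuzzy cognitive map (FCM) update: concept activations are
# propagated through a causal weight matrix and squashed with a sigmoid until the map
# settles. Weights and initial state are illustrative, not a trained reference model.
import numpy as np

def sigmoid(z, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * z))

# W[i, j]: causal influence of concept i on concept j (e.g. a process improvement
# acting on a performance index).
W = np.array([
    [0.0, 0.6, 0.3],
    [0.0, 0.0, 0.7],
    [0.2, 0.0, 0.0]])

state = np.array([0.8, 0.2, 0.1])            # current state of the problematic process
for _ in range(50):
    new_state = sigmoid(state @ W + state)   # keep a memory term for each concept
    if np.max(np.abs(new_state - state)) < 1e-6:
        break
    state = new_state

print("steady-state activations:", np.round(state, 4))
# An objective function, as in the paper, would then measure the gap between these
# activations and the desired performance indexes.
```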

Keywords: continuous improvement, fuzzy cognitive maps, process competitiveness, qualitative simulation, system dynamics

Procedia PDF Downloads 71
4260 Comparative Analysis of Islamic Bank in Indonesia and Malaysia with Risk Profile, Good Corporate Governance, Earnings, and Capital Method: Performance of Business Function and Social Function Perspective

Authors: Achsania Hendratmi, Nisful Laila, Fatin Fadhilah Hasib, Puji Sucia Sukmaningrum

Abstract:

This study aims to compare Islamic banks in Indonesia and Islamic banks in Malaysia and identify the differences between them using the RGEC method (Risk Profile, Good Corporate Governance, Earnings, and Capital). The study examines the comparison of the business and social performance of eleven Islamic banks in Indonesia and fifteen Islamic banks in Malaysia. The research used a quantitative approach, and data were collected from the annual reports of the sampled banks over the period 2011-2015. The results of the independent samples t-test and Mann-Whitney test showed that there were differences in the business performance of Islamic banks in Indonesia and Malaysia in terms of risk profile (FDR), GCG, and earnings (ROA). There were also differences in business and social performance in terms of earnings (ROE), capital (CAR), and sharia conformity indicators (PSR and ZR).
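
For readers unfamiliar with the two tests named above, the minimal sketch below runs an independent samples t-test and a Mann-Whitney U test on illustrative ROA figures for two groups of banks; the numbers are invented, not taken from the 2011-2015 annual reports.

```python
# Hedged sketch of the two tests used in the study, applied to illustrative ROA
# figures (percent) for eleven Indonesian and fifteen Malaysian Islamic banks.
import numpy as np
from scipy import stats

roa_indonesia = np.array([1.2, 0.8, 1.5, 0.6, 1.1, 0.9, 1.3, 0.7, 1.0, 1.4, 0.5])
roa_malaysia = np.array([0.9, 1.1, 1.0, 1.2, 0.8, 1.3, 1.0, 1.1,
                         0.9, 1.2, 1.0, 1.1, 0.95, 1.05, 1.15])

t_stat, t_p = stats.ttest_ind(roa_indonesia, roa_malaysia, equal_var=False)
u_stat, u_p = stats.mannwhitneyu(roa_indonesia, roa_malaysia, alternative="two-sided")

print(f"Independent samples t-test: t = {t_stat:.3f}, p = {t_p:.3f}")
print(f"Mann-Whitney U test:        U = {u_stat:.1f}, p = {u_p:.3f}")
```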

Keywords: business performance, Islamic banks, RGEC, social performance

Procedia PDF Downloads 279
4259 Local Spectrum Feature Extraction for Face Recognition

Authors: Muhammad Imran Ahmad, Ruzelita Ngadiran, Mohd Nazrin Md Isa, Nor Ashidi Mat Isa, Mohd ZaizuIlyas, Raja Abdullah Raja Ahmad, Said Amirul Anwar Ab Hamid, Muzammil Jusoh

Abstract:

This paper presents two techniques, local feature extraction using the image spectrum and low-frequency spectrum modelling using GMMs, to capture the underlying statistical information and improve the performance of a face recognition system. Local spectrum features are extracted using overlapping sub-block windows mapped onto the face image. For each block, the spatial domain is transformed to the frequency domain using the DFT. The low-frequency coefficients are preserved, and the high-frequency coefficients discarded, by applying a rectangular mask to the spectrum of the facial image. The low-frequency information is non-Gaussian in the feature space, and by using a combination of several Gaussian functions with different statistical properties, the best feature representation can be modelled using a probability density function. The recognition process is performed using the maximum likelihood value computed from pre-calculated GMM components. The method is tested using the FERET data set and achieves a 92% recognition rate.
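
The sketch below walks through the same pipeline in miniature: overlapping sub-blocks, a 2-D DFT per block, a rectangular low-frequency mask, and a GMM fitted to the resulting features. The "image" is random noise, and the block, step and mask sizes are illustrative assumptions, not the settings used in the paper.

```python
# Hedged sketch of the feature pipeline: overlapping sub-blocks of an image are
# transformed with the 2-D DFT, a rectangular mask keeps low-frequency coefficients,
# and a Gaussian mixture model (GMM) is fitted to the resulting features.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
image = rng.random((64, 64))            # stand-in for a face image
block, step, keep = 16, 8, 4            # block size, overlap step, low-freq mask size

features = []
for r in range(0, image.shape[0] - block + 1, step):
    for c in range(0, image.shape[1] - block + 1, step):
        spectrum = np.fft.fft2(image[r:r + block, c:c + block])
        low = spectrum[:keep, :keep]    # rectangular low-frequency mask (simplified)
        features.append(np.abs(low).ravel())
features = np.array(features)

gmm = GaussianMixture(n_components=3, covariance_type="diag", random_state=0).fit(features)
print("per-block average log-likelihood:", round(gmm.score(features), 3))
# At recognition time, the identity whose pre-trained GMM gives the maximum
# likelihood for the probe's features would be selected.
```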

Keywords: local features modelling, face recognition system, Gaussian mixture models, FERET

Procedia PDF Downloads 644
4258 DeepNic, a Method to Transform Each Variable into an Image for Deep Learning

Authors: Nguyen J. M., Lucas G., Brunner M., Ruan S., Antonioli D.

Abstract:

Deep learning based on convolutional neural networks (CNNs) is a very powerful technique for classifying information from an image. We propose a new method, DeepNic, to transform each variable of a tabular dataset into an image in which each pixel represents a set of conditions that allow the variable to make an error-free prediction. The contrast of each pixel is proportional to its prediction performance, and the color of each pixel corresponds to a sub-family of NICs. NICs are probabilities that depend on the number of inputs to each neuron and the range of the coefficients of the inputs. Each variable can therefore be expressed as a function of a matrix of two vectors, corresponding to an image whose pixels express predictive capabilities. Our objective is to transform each variable of tabular data into an image that can be analysed by CNNs, unlike other methods, which use all the variables to construct a single image. We analyse the NIC information of each variable and express it as a function of the number of neurons and the range of coefficients used. The predictive value and the category of the NIC are expressed by the contrast and the color of the pixel. We have developed a pipeline to implement this technology and have successfully applied it to genomic expression data from an Affymetrix chip.

Keywords: tabular data, deep learning, perfect trees, NICs

Procedia PDF Downloads 73
4257 Generalized Extreme Value Regression with Binary Dependent Variable: An Application for Predicting Meteorological Drought Probabilities

Authors: Retius Chifurira

Abstract:

The logistic regression model is the regression model most commonly used to predict meteorological drought probabilities. When the dependent variable is extreme, however, the logistic model fails to adequately capture drought probabilities. In order to predict drought probabilities adequately, we use the generalized linear model (GLM) with the quantile function of the generalized extreme value distribution (GEVD) as the link function. Maximum likelihood estimation is used to estimate the parameters of the generalized extreme value (GEV) regression model. We compare the performance of the logistic and GEV regression models in predicting drought probabilities for Zimbabwe. The performance of the regression models is assessed using goodness-of-fit measures, namely the relative root mean square error (RRMSE) and the relative mean absolute error (RMAE). Results show that the GEV regression model performs better than the logistic model, thereby providing a good alternative candidate for predicting drought probabilities. This paper provides the first application of a GLM derived from extreme value theory to predict drought probabilities for a drought-prone country such as Zimbabwe.
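
The two goodness-of-fit measures named above can be computed in a few lines; the sketch below applies them to illustrative observed versus predicted drought probabilities (not the Zimbabwe data), normalising each error by the corresponding observed value, which is one common convention and an assumption here.

```python
# Hedged sketch of RRMSE and RMAE on illustrative observed vs. predicted probabilities.
import numpy as np

def rrmse(obs, pred):
    return np.sqrt(np.mean(((obs - pred) / obs) ** 2))

def rmae(obs, pred):
    return np.mean(np.abs((obs - pred) / obs))

observed = np.array([0.10, 0.25, 0.40, 0.55, 0.70, 0.85])
pred_logistic = np.array([0.18, 0.30, 0.42, 0.50, 0.62, 0.74])
pred_gev = np.array([0.12, 0.26, 0.41, 0.53, 0.68, 0.83])

for name, pred in [("logistic", pred_logistic), ("GEV", pred_gev)]:
    print(f"{name:8s}  RRMSE = {rrmse(observed, pred):.3f}  RMAE = {rmae(observed, pred):.3f}")
```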

Keywords: generalized extreme value distribution, general linear model, mean annual rainfall, meteorological drought probabilities

Procedia PDF Downloads 182
4256 Cloning and Expression of Azurin: A Protein Having Antitumor and Cell Penetrating Ability

Authors: Mohsina Akhter

Abstract:

Cancer has become a widespread disease around the globe and takes many lives every year. Different treatments are practiced, but all have potential side effects and somewhat limited specificity towards target sites. Pseudomonas aeruginosa is known to secrete a protein, azurin, with a special anti-cancer function. It has a unique cell-penetrating peptide comprising 18 amino acids that has the ability to enter cancer cells specifically. The reported function of azurin is to stabilize p53 inside tumor cells and to induce apoptosis through Bax-mediated cytochrome c release from mitochondria. At laboratory scale, we have produced recombinant azurin by cloning the rpTZ57R/T-azu vector into E. coli strain DH-5α and subcloning the rpET28-azu vector into E. coli BL21-CodonPlus (DE3). High expression was ensured by IPTG induction at different concentrations, and the expression level was optimized at 1 mM IPTG for 5 hours. Purification was performed using Ni²⁺ affinity chromatography. We conclude that azurin could be a remarkable improvement in cancer therapeutics if it is produced on a large scale. Azurin does not enter normal cells, so it should prove a safe and secure treatment for patients and spare them hazardous side effects.

Keywords: azurin, pseudomonas aeruginosa, cancer, therapeutics

Procedia PDF Downloads 295
4255 Hybrid Gravity Gradient Inversion-Ant Colony Optimization Algorithm for Motion Planning of Mobile Robots

Authors: Meng Wu

Abstract:

Motion planning is a common task that robots are required to fulfil. A strategy combining ant colony optimization (ACO) and a gravity gradient inversion algorithm is proposed for the motion planning of mobile robots. In this paper, in order to realize an optimal motion planning strategy, the cost function in ACO is designed based on the gravity gradient inversion algorithm. The obstacles around a mobile robot cause gravity gradient anomalies, and a gradiometer installed on the mobile robot detects these anomalies. After the anomalies are obtained, the gravity gradient inversion algorithm is employed to calculate the relative distance and orientation between the mobile robot and the obstacles. The relative distance and orientation deduced from the gravity gradient inversion algorithm are then used in the ACO cost function to realize motion planning. The proposed strategy is validated by simulation and experimental results.
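
The sketch below shows the ACO ingredients named above on a toy problem: ants pick among a few candidate routes with probability proportional to pheromone^α · heuristic^β, and pheromone is evaporated and reinforced in proportion to route quality. The route costs stand in for the distance and orientation information that the gravity gradient inversion step would supply; all parameter values are illustrative.

```python
# Hedged sketch of ant colony optimization on a toy route-choice problem. The route
# costs are placeholders for a cost built from inversion-estimated obstacle distances.
import numpy as np

rng = np.random.default_rng(3)
route_cost = np.array([7.0, 5.0, 9.0, 6.0])   # e.g. path length + obstacle penalty
heuristic = 1.0 / route_cost                  # eta: prefer cheaper routes
pheromone = np.ones_like(route_cost)          # tau: initial pheromone
alpha, beta, rho, Q = 1.0, 2.0, 0.3, 10.0     # ACO parameters (assumed)

for iteration in range(30):
    weights = pheromone**alpha * heuristic**beta
    prob = weights / weights.sum()
    choices = rng.choice(len(route_cost), size=20, p=prob)   # 20 ants per iteration
    pheromone *= (1.0 - rho)                                 # evaporation
    for c in choices:
        pheromone[c] += Q / route_cost[c]                    # deposit on chosen route

print("final selection probabilities:", np.round(prob, 3))
print("best route index:", int(np.argmax(pheromone)), "(lowest-cost route is 1)")
```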

Keywords: motion planning, gravity gradient inversion algorithm, ant colony optimization

Procedia PDF Downloads 126
4254 Optimizing the Public Policy Information System under the Environment of E-Government

Authors: Qian Zaijian

Abstract:

E-government is one of the hot issues in current academic research on public policy and management. As the organic integration of information and communication technology (ICT) with public administration, e-government is one of the most important areas of the contemporary information society. The policy information system is a basic subsystem of the public policy system; its operation affects the overall effect of the policy process and can even exert a direct impact on the operation of a public policy and on its success or failure. The basic principle of its operation is information collection, processing, analysis, and release for a specific purpose. The function of e-government for the public policy information system lies in promoting public access to policy information resources, information transmission through e-participation, e-consultation in the process of policy analysis and information processing, and electronic services for stored policy information, so as to promote the optimization of policy information systems. However, due to many factors, the capacity of e-government to promote the optimization of policy information systems has practical limits. In building e-government in China, we should adhere to the principle of freedom of information, eliminate the information divide (gap), expand e-consultation, and break down information silos, among other major paths, so as to promote the optimization of public policy information systems.

Keywords: China, e-consultation, e-democracy, e-government, e-participation, ICTs, public policy information systems

Procedia PDF Downloads 838
4253 Multiscale Syntheses of Knee Collateral Ligament Stresses: Aggregate Mechanics as a Function of Molecular Properties

Authors: Raouf Mbarki, Fadi Al Khatib, Malek Adouni

Abstract:

Knee collateral ligaments play a significant role in restraining excessive frontal motion (varus/valgus rotations). In this investigation, a multiscale framework was developed based on the structural hierarchies of the collateral ligaments, starting from the bottom (the tropocollagen molecule) up to the level at which the fibre-reinforced structure is established. Experimental data from failure tensile tests were the principal driver of the developed model. The model was calibrated statistically using Bayesian calibration because of the high number of unknown parameters. It was then scaled up to fit the real structure of the collateral ligaments and simulated under realistic boundary conditions. The predictions successfully describe the observed transient response of the collateral ligaments during tensile testing under pre- and post-damage loading conditions. The maximum stresses and strengths of the collateral ligaments were observed near the femoral insertions, a result that is in good agreement with experimental investigations. Also, for the first time, damage initiation and propagation were documented with this model as a function of the cross-link density between tropocollagen molecules.

Keywords: multiscale model, tropocollagen, fibrils, ligaments

Procedia PDF Downloads 142
4252 Transfer Function Model-Based Predictive Control for Nuclear Core Power Control in PUSPATI TRIGA Reactor

Authors: Mohd Sabri Minhat, Nurul Adilla Mohd Subha

Abstract:

The 1 MWth PUSPATI TRIGA Reactor (RTP) at the Malaysian Nuclear Agency has been operating for more than 35 years. The existing core power control uses a conventional controller known as the Feedback Control Algorithm (FCA). It is technically challenging to keep the core power output stable and operating within acceptable error bands to meet the safety demands of the RTP. Currently, the power tracking performance of the system could be considered unsatisfactory, and there is significant room for improvement. Hence, a new core power control design is very important to improve the current performance in tracking and regulating reactor power by controlling the movement of the control rods, in a way that suits the demands of highly sensitive nuclear reactor power control. In this paper, the proposed Model Predictive Control (MPC) law was applied to control the core power. The model for core power control was based on mathematical models of the reactor core, MPC, and a control rod selection algorithm. The mathematical models of the reactor core were based on the point kinetics model, thermal hydraulic models, and reactivity models. The proposed MPC was formulated on a transfer function model of the reactor core obtained according to perturbation theory. The transfer function model-based predictive control (TFMPC) was developed to design the core power control with predictions based on a T-filter, with a view towards real-time implementation of MPC on hardware. This paper introduces the sensitivity functions for the TFMPC feedback loop to reduce the impact on the input actuation signal and demonstrates the behaviour of TFMPC in terms of disturbance and noise rejection. Comparisons of both the tracking and regulating performance of the conventional controller and TFMPC were made using MATLAB and analysed. In conclusion, the proposed TFMPC shows satisfactory performance in tracking and regulating core power for controlling a nuclear reactor with high reliability and safety.
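
To illustrate the receding-horizon idea behind transfer-function-based predictive control, the sketch below controls a simple first-order discrete model y(k+1) = a·y(k) + b·u(k) by solving a small least-squares problem over a prediction horizon at every step and applying only the first move. The plant coefficients, horizons, weight and set-point are assumptions for the example, not the RTP reactor model or the paper's TFMPC with T-filter.

```python
# Hedged sketch of receding-horizon MPC on a first-order discrete transfer-function
# model. All parameters are illustrative assumptions.
import numpy as np

a, b = 0.9, 0.1          # discrete plant model (assumed)
P, lam = 15, 0.05        # prediction horizon and control-move weight
setpoint = 1.0           # desired (normalised) output

# Prediction matrices: Y = F*y(k) + Phi*U over the horizon.
F = np.array([a ** (i + 1) for i in range(P)])
Phi = np.zeros((P, P))
for i in range(P):
    for j in range(i + 1):
        Phi[i, j] = a ** (i - j) * b
D = np.eye(P) - np.eye(P, k=-1)          # maps U to control moves Delta-u

y, u_prev = 0.0, 0.0
for k in range(40):
    r = np.full(P, setpoint)
    rhs = Phi.T @ (r - F * y) + lam * D.T @ (np.eye(P)[:, 0] * u_prev)
    U = np.linalg.solve(Phi.T @ Phi + lam * D.T @ D, rhs)
    u = U[0]                              # apply only the first move (receding horizon)
    y = a * y + b * u                     # plant update
    u_prev = u
    if k % 10 == 0:
        print(f"step {k:2d}: u = {u:.3f}, y = {y:.3f}")
```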

Keywords: core power control, model predictive control, PUSPATI TRIGA reactor, TFMPC

Procedia PDF Downloads 220
4251 Similarity Solutions of Nonlinear Stretched Biomagnetic Flow and Heat Transfer with Signum Function and Temperature Power Law Geometries

Authors: M. G. Murtaza, E. E. Tzirtzilakis, M. Ferdows

Abstract:

Biomagnetic fluid dynamics is an interdisciplinary field comprising engineering, medicine, and biology. Biofluid dynamics is directed towards finding and developing solutions to some diseases and disorders of the human body. This article describes the flow and heat transfer of a two-dimensional, steady, laminar, viscous, and incompressible biomagnetic fluid over a nonlinear stretching sheet in the presence of a magnetic dipole. Our model treats blood as a biomagnetic fluid, in the framework of biomagnetic fluid dynamics (BFD), and is based on the principles of ferrohydrodynamics (FHD). The temperature at the stretching surface is assumed to follow a power-law variation, and the stretching velocity is assumed to have a nonlinear form involving the signum (sign) function. The governing boundary layer equations with their boundary conditions are reduced to coupled higher-order equations using the usual transformations. Numerical solutions of the governing momentum and energy equations are obtained by efficient numerical techniques based on the common finite difference method with central differencing, tridiagonal matrix manipulation, and an iterative procedure. Computations are performed for a wide range of the governing parameters, such as the magnetic field parameter, the power-law temperature exponent, and the other parameters involved, and the effect of these parameters on the velocity and temperature fields is presented. It is observed that for increasing values of the magnetic parameter, the velocity distribution decreases while the temperature distribution increases. In addition, the finite difference results for the skin-friction coefficient and the rate of heat transfer are discussed. This study has an important bearing on high targeting efficiency: a high magnetic field is required in the targeted body compartment.
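
The tridiagonal solves mentioned above are the computational core of such central-difference schemes; the sketch below implements the standard Thomas algorithm and applies it to a generic two-point problem with Dirichlet conditions, as an illustration only rather than the paper's coupled momentum and energy system.

```python
# Hedged sketch of the tridiagonal (Thomas) solve behind central-difference schemes;
# the example system is generic and illustrative.
import numpy as np

def thomas(lower, diag, upper, rhs):
    """Solve a tridiagonal system with sub-diagonal `lower`, diagonal `diag`,
    super-diagonal `upper` and right-hand side `rhs` (all 1-D arrays)."""
    n = len(diag)
    c = np.array(upper, dtype=float)
    d = np.array(rhs, dtype=float)
    bb = np.array(diag, dtype=float)
    for i in range(1, n):                      # forward elimination
        m = lower[i - 1] / bb[i - 1]
        bb[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    x = np.empty(n)
    x[-1] = d[-1] / bb[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = (d[i] - c[i] * x[i + 1]) / bb[i]
    return x

# Example: central differences for f'' = -1 on [0, 1] with f(0) = f(1) = 0.
n, h = 49, 1.0 / 50
diag = np.full(n, -2.0)
off = np.full(n - 1, 1.0)
rhs = np.full(n, -1.0) * h**2
f = thomas(off, diag, off, rhs)
print("max of numerical solution:", round(f.max(), 5), "(exact 1/8 = 0.125)")
```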

Keywords: biomagnetic fluid, FHD, MHD, nonlinear stretching sheet

Procedia PDF Downloads 145
4250 Maintenance Performance Measurement Derived Optimization: A Case Study

Authors: James M. Wakiru, Liliane Pintelon, Peter Muchiri, Stanley Mburu

Abstract:

Maintenance performance measurement (MPM) is an integrated activity that considers both operational and maintenance-related aspects while evaluating the effectiveness and efficiency of maintenance, to ensure that assets are working as they should. Three salient issues must be addressed for an asset-intensive organization to employ an MPM-based framework to optimize maintenance. Firstly, the organization should establish the important performance metric(s), in this case the maintenance objective(s), on which it will focus. The second issue entails aligning the maintenance objective(s) with maintenance optimization; this is achieved by deriving maintenance performance indicators that subsequently form an objective function for the optimization program. Lastly, the objective function is employed in an optimization program to derive maintenance decision support. In this study, we develop a framework that first identifies the crucial maintenance performance measures and then employs them to derive maintenance decision support. The proposed framework is demonstrated in a case study of a geothermal drilling rig, where the objective function is evaluated using a simulation-based model whose parameters are derived from empirical maintenance data. Availability, reliability, and maintenance inventory emerge as essential objectives requiring further attention. A simulation model is developed that mimics drilling rig operations and maintenance, where the sub-systems are modelled as undergoing imperfect corrective (CM) and preventive (PM) maintenance, with the total cost as the primary performance measure. Moreover, three maintenance spare inventory policies are considered: a classical policy (retaining stocks for a contractual period), vendor-managed inventory with consignment stock, and a periodic-monitoring order-up-to-stock (s, S) policy. The optimization results indicate that adopting the (s, S) inventory policy, increasing the PM interval, and reducing reliance on CM actions offer improved availability and a reduction in total costs.
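
To make the (s, S) policy concrete, the sketch below simulates a periodic-review spare-parts stock for a maintained asset: whenever a review finds the stock at or below s, an order brings it back up to S. The demand process, costs and review period are illustrative assumptions, not the drilling-rig data, and lead time is neglected.

```python
# Hedged sketch of a periodic-review (s, S) spare-parts policy with illustrative
# demand and cost figures.
import numpy as np

rng = np.random.default_rng(2)
s, S = 4, 12                   # reorder point and order-up-to level
holding_cost, order_cost, shortage_cost = 1.0, 50.0, 20.0
weeks, demand_rate = 520, 1.5  # ten years of weekly reviews, mean spares demand/week

stock, total_cost, stockouts = S, 0.0, 0
for week in range(weeks):
    demand = rng.poisson(demand_rate)          # spares consumed by CM/PM work orders
    shortage = max(0, demand - stock)
    stock = max(0, stock - demand)
    total_cost += holding_cost * stock + shortage_cost * shortage
    stockouts += shortage
    if stock <= s:                             # periodic review: order up to S
        total_cost += order_cost
        stock = S                              # lead time neglected in this sketch

print(f"average weekly cost = {total_cost / weeks:.2f}, total shortages = {stockouts}")
```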

Keywords: maintenance, vendor-managed, decision support, performance, optimization

Procedia PDF Downloads 107
4249 On the Grid Technique by Approximating the Derivatives of the Solution of the Dirichlet Problems for (1+1) Dimensional Linear Schrodinger Equation

Authors: Lawrence A. Farinola

Abstract:

Four-point implicit schemes are constructed for approximating the first and pure second-order derivatives, with respect to the time variable t, of the solution of the Dirichlet problem for the one-dimensional Schrödinger equation. Special four-point implicit difference boundary value problems are also proposed for the first and pure second derivatives of the solution with respect to the spatial variable x. The grid method is further applied to the mixed second derivative of the solution of the linear time-dependent Schrödinger equation. It is assumed that the initial function belongs to the Hölder space C⁸⁺ᵃ, 0 < α < 1, that the Schrödinger wave function given in the Schrödinger equation belongs to the Hölder space Cₓ,ₜ⁶⁺ᵃ, ³⁺ᵃ/², that the boundary functions are from C⁴⁺ᵃ, and that between the initial and the boundary functions the conjugation conditions of orders q = 0, 1, 2, 3, 4 are satisfied. It is proven that the solutions of the proposed difference schemes converge uniformly on the grid with order O(h² + k), where h is the step size in x and k is the step size in time. Numerical experiments are presented to support the analysis.
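
For orientation, the sketch below advances the one-dimensional linear Schrödinger equation with a standard Crank-Nicolson implicit scheme on a grid with spatial step h and time step k; it is not the paper's special four-point schemes for the derivatives, just an illustration of the kind of implicit grid computation involved, with an assumed Gaussian initial condition and homogeneous Dirichlet boundaries.

```python
# Hedged sketch: Crank-Nicolson time stepping for i*u_t = -u_xx on [0, 1] with
# homogeneous Dirichlet boundary conditions and an illustrative initial wave packet.
import numpy as np

nx, nt = 200, 200
h, k = 1.0 / nx, 1e-4
x = np.linspace(0.0, 1.0, nx + 1)

# Initial condition: a Gaussian wave packet (illustrative choice).
u = np.exp(-200 * (x - 0.5) ** 2) * np.exp(1j * 30 * x)
u[0] = u[-1] = 0.0

# Second-difference operator on the interior points.
L = (np.diag(-2.0 * np.ones(nx - 1)) + np.diag(np.ones(nx - 2), 1)
     + np.diag(np.ones(nx - 2), -1)) / h**2
I = np.eye(nx - 1)
A = I - 0.5j * k * L          # implicit side
B = I + 0.5j * k * L          # explicit side

for _ in range(nt):
    u[1:-1] = np.linalg.solve(A, B @ u[1:-1])

print("norm after time stepping:", round(float(np.sum(np.abs(u) ** 2) * h), 6))
# Crank-Nicolson conserves the discrete norm for this problem, which gives a quick
# sanity check on the scheme.
```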

Keywords: approximation of derivatives, finite difference method, Schrödinger equation, uniform error

Procedia PDF Downloads 105
4248 Evaluation of the Impact of Neuropathic Pain on the Quality of Life of Patients

Authors: A. Ibovi Mouondayi, S. Zaher, R. Assadi, K. Erraoui, S. Sboul, J. Daoudim, S. Bousselham, K. Nassar, S. Janani

Abstract:

Introduction: Neuropathic pain (NP) is a chronic pain that can be observed in a large number of clinical situations. It results from a lesion of the peripheral or central nervous system and is a frequent reason for consultation in rheumatology. Being chronic, this pain can become disabling for the patient, thereby altering their quality of life. Objective: The objective of this study was to evaluate the impact of neuropathic pain on the quality of life of patients followed up for chronic neuropathic pain. Material and Method: This is a monocentric, cross-sectional, descriptive, retrospective study conducted in our department over a period of 19 months, from October 2020 to April 2022. Missing parameters were collected through telephone calls to the patients concerned. The diagnostic tool adopted was the DN4 questionnaire in its dialectal Arabic version. The impact of NP was assessed with the visual analogue scale (VAS) for pain, sleep, and function. The impact of NP on mood was assessed with the Hospital Anxiety and Depression (HAD) scale in its validated Arabic version. The exclusion criteria were patients followed up for depression or other psychiatric pathologies. Results: Data from a total of 1,528 patients were collected. The average age of the patients was 57 years (standard deviation: 13 years), with extremes ranging from 17 to 94 years; 91% were women and 9% men, with a male/female sex ratio of 0.10. 67% of our patients were married, and 63% were housewives. 43% of patients were followed up for a degenerative pathology. The NP was cervical radiculopathy in 26%, lumbosacral radiculopathy in 51%, and carpal tunnel syndrome in 20%. 23% of our patients had poor sleep quality, and 54% had average sleep quality. The pain was very intense in 5% of patients, 33% had severe pain, and 58% had moderate pain. Function was limited in 55% of patients. The average HAD scores for anxiety and depression were 4.39 (standard deviation: 2.77) and 3.21 (standard deviation: 2.89), respectively. Conclusion: Our data clearly illustrate that neuropathic pain has a negative impact on the quality of sleep, function, and mood of patients, thus influencing their quality of life.

Keywords: neuropathic pain, sleep, quality of life, chronic pain

Procedia PDF Downloads 116
4247 Systematic Identification and Quantification of Substrate Specificity Determinants in Human Protein Kinases

Authors: Manuel A. Alonso-Tarajano, Roberto Mosca, Patrick Aloy

Abstract:

Protein kinases participate in a myriad of cellular processes of major biomedical interest. The in vivo substrate specificity of these enzymes is determined by several factors and, despite several years of research on the topic, is still far from being fully understood. In the present work, we have quantified the contributions to kinase substrate specificity of (i) the phosphorylation sites and their surrounding residues in the sequence and (ii) the association of kinases with adaptor or scaffold proteins. We have used position-specific scoring matrices (PSSMs) to represent the stretches of sequence phosphorylated by 93 families of kinases. We found negative correlations between the number of sequences from which a PSSM is generated and both the statistical significance and the performance of that PSSM. Using a subset of 22 statistically significant PSSMs, we identified specificity determinant residues (SDRs) for 86% of the corresponding kinase families. Our results suggest that different SDRs can function as positive or negative elements of substrate recognition by the different families of kinases. Additionally, we found that human proteins with a known function as adaptors or scaffolds (kAS) tend to interact with a significantly large fraction of the substrates of the kinases with which they associate. Based on this characteristic, we identified a set of 279 potential adaptors/scaffolds (pAS) for human kinases, which is enriched in Pfam domains and functional terms tightly related to the proposed function. Moreover, our results show that for 74.6% of the kinase-pAS associations found, the pAS colocalises with the substrates of the kinase it is associated with. Finally, we found evidence suggesting that the association of kinases with adaptors and scaffolds may contribute significantly to diminishing the in vivo substrate cross-specificity of protein kinases. In general, our results indicate the relevance of several SDRs for both the positive and negative selection of phosphorylation sites by kinase families, and they also suggest that the association of kinases with pAS proteins may be an important factor in the colocalisation of the enzymes with their sets of substrates.
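
To show what a PSSM of the kind used above looks like in practice, the sketch below builds a small log-odds matrix from three toy phosphosite sequences (phospho-acceptor-centred windows) with a uniform background and a pseudocount of one, and scores two peptides against it; the sequences, window length and background model are illustrative assumptions, not the curated data behind the paper's 93 kinase-family PSSMs.

```python
# Hedged sketch: building a position-specific scoring matrix (PSSM) from aligned
# phosphosite windows and scoring candidate peptides. All sequences are toy examples.
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
sites = ["RRASVAGLS", "RKPSLRAKL", "RRRSLLELH"]   # toy basophilic-like motifs
length = len(sites[0])

counts = np.ones((len(AMINO_ACIDS), length))      # pseudocount of 1 per cell
for seq in sites:
    for pos, aa in enumerate(seq):
        counts[AMINO_ACIDS.index(aa), pos] += 1

freqs = counts / counts.sum(axis=0)               # per-position frequencies
background = 1.0 / len(AMINO_ACIDS)               # uniform background (assumption)
pssm = np.log2(freqs / background)                # log-odds scores

def score(peptide):
    return sum(pssm[AMINO_ACIDS.index(aa), i] for i, aa in enumerate(peptide))

print("score of RRASVAGLS:", round(score("RRASVAGLS"), 2))
print("score of AAAAAAAAA:", round(score("AAAAAAAAA"), 2))
```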

Keywords: kinase, phosphorylation, substrate specificity, adaptors, scaffolds, cellular colocalization

Procedia PDF Downloads 329
4246 The Digital Living Archive and the Construction of a Participatory Cultural Memory in the DARE-UIA Project: Digital Environment for Collaborative Alliances to Regenerate Urban Ecosystems in Middle-Sized Cities

Authors: Giulia Cardoni, Francesca Fabbrii

Abstract:

Living archives perform a function of social memory sharing, which contributes to building social bonds, communities, and identities. This potential lies in the ability of living archives to combine an archival function, which allows the conservation and transmission of memory, with an artistic, performative, and creative function linked to the present. As part of the DARE-UIA (Digital environment for collaborative alliances to regenerate urban ecosystems in middle-sized cities) project, the creation of a living digital archive made it possible to build a narrative that consolidates the cultural memory of the Darsena district of the city of Ravenna. The aim of the project is to stimulate the urban regeneration of a suburban area of the city, enhancing its cultural memory and identity heritage through digital heritage tools. The methodology involves various digital storytelling actions necessary for the overall narrative, using georeferencing systems (GIS), storymaps, and 3D reconstructions for a transversal narration of historical content, such as personal and institutional historical photographs, and to enhance the industrial archaeology heritage of the neighborhood. The aim is the creation of an interactive narrative that is replicable in contexts similar to the Darsena district in Ravenna. The living archive, in which all the digital contents are inserted, manifests itself outwardly in the form of a museum spread throughout the neighborhood: the contents are made usable on smartphones via QR codes and on-site totems, creating thematic itineraries across the area. The construction of an interactive and engaging digital narrative has made it possible to enhance the material and immaterial heritage of the neighborhood by recreating the community that has historically always distinguished it.

Keywords: digital living archive, digital storytelling, GIS, 3D, open-air museum, urban regeneration, cultural memory

Procedia PDF Downloads 89