Search results for: objective function
10613 Price Promotions and Inventory Decisions
Authors: George Hadjinicola, Andreas Soteriou
Abstract:
This paper examines the relationship between the number of price promotions that a firm should conduct per year and the level of safety stocks that the firm should maintain. Price promotions result in temporary sales increases, which affect the operations function through (1) an increase in the quantities demanded and (2) an increase in safety stocks required to maintain the desired service level. We propose a modeling framework where both price promotions and improved service levels, operationalized through higher safety stocks, can affect sales. We treat the annual number of promotions as a decision variable. We identify market conditions where the operations function, through improved safety stocks, can complement price promotions or even play the leading role in sales increases.
Keywords: price promotions, safety stocks, marketing/operations interface, mathematical model
Procedia PDF Downloads 95
10612 Prospective Study of the Evaluation of Autologous Blood Injection in the Treatment of Lateral Epicondylitis
Authors: Bheeshma B., Mathivanan N., Manoj Deepak M., Prabhu Thangaraju, K. Venkatachalam
Abstract:
This study examines the effect of autologous blood injection in patients with degeneration of the origin of extensor carpi radialis brevis, confirmed radiologically and by ultrasound examination, whose lateral epicondylitis had failed to respond to cortisone injections. This prospective longitudinal series involved pre-injection assessment of grip strength, pain, and function using the patient-rated tennis elbow evaluation. Blood taken from the contralateral limb was injected into the affected limb under ultrasound guidance, after which the patient wore a customized wrist support for five days and then commenced a stretching, strengthening, and massage programme with an occupational therapist. Patients were reassessed at six months and finally at 12 months after injection using the patient-rated tennis elbow evaluation. Fifty patients completed the study, showing significant improvement in pain; the worst pain decreased by two to five points on a 10-point visual analogue scale for pain. Self-perceived function improved by 11–25 points out of 100. Women showed a significant increase in grip strength, but men did not. Our study thus concludes that autologous blood injection produces significant improvement in pain and function in patients with chronic lateral epicondylitis who did not obtain relief with cortisone injection.
Keywords: lateral epicondylitis, autologous blood injection, conservative treatment, plasma-rich proteins (PRPs)
Procedia PDF Downloads 428
10611 Patent on Brian: Brain Waves Stimulation
Authors: Jalil Qoulizadeh, Hasan Sadeghi
Abstract:
Brain waves are electrical wave patterns produced in the human brain. Knowing these waves and activating them can have a positive effect on brain function and ultimately help create an ideal life. The brain is able to produce waves from 0.1 Hz to above 65 Hz, and the Beta One device is said to produce waves that exactly match those produced by the brain. The function and method of this device are based on magnetic stimulation of the brain. The technology used in the design and production of this device strengthens and improves the frequencies of brain waves with a pre-defined algorithm, according to the type of requested function, so that the person can access the expected functions in daily activities and perform better. To evaluate the effect of the field created by the device on neurons and their stimulation, electroencephalography was conducted before and after stimulation, and the two baselines were compared by quantitative electroencephalography (qEEG) using a paired t-test in 39 subjects. The comparison confirms a significant change in the recorded electrical activity after 30 minutes of stimulation in all subjects. The Beta One device is thus able to induce the appropriate pattern of the expected functions in a soft, healthy, and effective way (exactly in accordance with the harmony of brain waves), moving brain activity first to a normal state and then to a more powerful one. The work also aims at the production of inexpensive neuroscience equipment (compared to existing rTMS equipment) for magnetic brain stimulation in clinics, homes, factories and companies, and professional sports clubs.
Keywords: stimulation, brain, waves, betaOne
Procedia PDF Downloads 81
10610 Image Compression Using Block Power Method for SVD Decomposition
Authors: El Asnaoui Khalid, Chawki Youness, Aksasse Brahim, Ouanan Mohammed
Abstract:
In recent decades, the rapid growth in the development of and demand for multimedia products has contributed to a shortage of device bandwidth and network storage memory. Consequently, the theory of data compression has become more significant for reducing data redundancy in order to save transfer and storage capacity. In this context, this paper addresses the problem of lossless and near-lossless compression of images. The proposed method is based on the block SVD power method, which overcomes the disadvantages of Matlab's SVD function. The experimental results show that the proposed algorithm has better compression performance compared with existing compression algorithms that use Matlab's SVD function. In addition, the proposed approach is simple and can provide different degrees of error resilience, giving, in a short execution time, better image compression.
Keywords: image compression, SVD, block SVD power method, lossless compression, near lossless
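As an illustration of the underlying idea, the sketch below uses a plain power iteration with deflation to extract the leading singular triplets of an image block and rebuild a low-rank approximation. It is a minimal NumPy sketch, not the authors' block SVD power method, and the block size and rank are assumed for the example.

```python
import numpy as np

def power_method_svd(A, k, n_iter=100, seed=0):
    """Approximate the top-k singular triplets of A by power iteration with deflation."""
    rng = np.random.default_rng(seed)
    A_work = A.astype(float).copy()
    U, S, V = [], [], []
    for _ in range(k):
        v = rng.standard_normal(A_work.shape[1])
        v /= np.linalg.norm(v)
        for _ in range(n_iter):
            u = A_work @ v
            u /= np.linalg.norm(u)
            v = A_work.T @ u
            v /= np.linalg.norm(v)
        sigma = u @ A_work @ v              # Rayleigh quotient gives the singular value
        U.append(u)
        S.append(sigma)
        V.append(v)
        A_work -= sigma * np.outer(u, v)    # deflate so the next pass finds the next triplet
    return np.column_stack(U), np.array(S), np.vstack(V)

# Toy usage: compress a random 64x64 "image block" to rank 8
block = np.random.rand(64, 64)
U, S, Vt = power_method_svd(block, k=8)
approx = U @ np.diag(S) @ Vt
print("relative error:", np.linalg.norm(block - approx) / np.linalg.norm(block))
```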
Procedia PDF Downloads 387
10609 Multi-Objective Optimization of Run-of-River Small-Hydropower Plants Considering Both Investment Cost and Annual Energy Generation
Authors: Amèdédjihundé H. J. Hounnou, Frédéric Dubas, François-Xavier Fifatin, Didier Chamagne, Antoine Vianou
Abstract:
This paper presents the techno-economic evaluation of run-of-river small-hydropower plants. In this regard, a multi-objective optimization procedure is proposed for the optimal sizing of the hydropower plants, and NSGA-II is employed as the optimization algorithm. Annual generated energy and investment cost are considered as the objective functions, while the number of generator units (n) and the nominal turbine flow rate (QT) constitute the decision variables. The site of Yeripao in Benin is considered as the case study. We have characterized the river of this site using its environmental characteristics: gross head, and the first quartile, median, third quartile and mean of flow. The effects of each decision variable on the objective functions are analysed. The results give a Pareto front that represents the trade-offs between annual energy generation and the investment cost of hydropower plants, as well as the recommended optimal solutions. We note that as the annual energy generation increases, the investment cost rises; thus, maximizing energy generation conflicts with minimizing the investment cost. Moreover, the solutions on the Pareto front are grouped according to the number of generator units (n). The results also illustrate that the costs per kWh are grouped according to n and rise with increasing nominal turbine flow rate. The lowest investment costs per kWh are obtained for n equal to one and lie between 0.065 and 0.180 €/kWh. For each value of n (equal to 1, 2, 3 or 4), the investment cost and the investment cost per kWh increase almost linearly with the nominal turbine flow rate, while the annual generated energy increases logarithmically with the nominal turbine flow rate. This study, carried out for the Yeripao river, can be applied to other rivers with their own characteristics.
Keywords: hydropower plant, investment cost, multi-objective optimization, number of generator units
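The sketch below illustrates the kind of trade-off search described above: candidate designs over (n, QT) are evaluated with placeholder cost and energy models and filtered to a Pareto front by simple non-dominated sorting. The cost and energy formulas are invented for illustration only; they are not the paper's models, and a full NSGA-II (e.g., via a library such as pymoo) would replace the exhaustive grid used here.

```python
import numpy as np

# Placeholder techno-economic models (assumed forms, not the paper's):
def annual_energy(n, qt):          # kWh/year, grows logarithmically with flow rate
    return n * 1.5e6 * np.log1p(qt)

def investment_cost(n, qt):        # euros, grows roughly linearly with flow rate
    return n * (2.0e5 + 1.2e5 * qt)

# Candidate designs: 1-4 generator units, nominal turbine flow rate 0.5-5 m^3/s
designs = [(n, qt) for n in range(1, 5) for qt in np.linspace(0.5, 5.0, 40)]
objs = np.array([[investment_cost(n, qt), -annual_energy(n, qt)] for n, qt in designs])

def pareto_front(points):
    """Return indices of non-dominated points (both objectives minimized)."""
    keep = []
    for i, p in enumerate(points):
        dominated = np.any(np.all(points <= p, axis=1) & np.any(points < p, axis=1))
        if not dominated:
            keep.append(i)
    return keep

front = pareto_front(objs)
for i in front[:5]:
    n, qt = designs[i]
    print(f"n={n}, QT={qt:.2f} m^3/s, cost={objs[i,0]:.0f} EUR, energy={-objs[i,1]:.0f} kWh/yr")
```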
Procedia PDF Downloads 157
10608 The Role Collagen VI Plays in Heart Failure: A Tale Untold
Authors: Summer Hassan, David Crossman
Abstract:
Myocardial fibrosis (MF) has been loosely defined as the process occurring in the pathological remodeling of the myocardium due to excessive production and deposition of extracellular matrix (ECM) proteins, including collagen. This reduces tissue compliance and accelerates progression to heart failure, as well as affecting the electrical properties of the myocytes, resulting in arrhythmias. Microscopic interrogation of MF is key to understanding the molecular orchestrators of disease. It is well established that recruitment and stimulation of myofibroblasts result in collagen deposition and the resulting expansion of the ECM. Many types of collagen have been identified and implicated in scarring of tissue. In a series of experiments conducted at our lab, we aim to elucidate the role collagen VI plays in the development of myocardial fibrosis and its direct impact on myocardial function. This was investigated in a rat experiment comparing collagen VI knockout animals, both diseased and healthy, with wild-type diseased and healthy rats. Echocardiographic assessments of these rats were performed at four time points, followed by microscopic interrogation of the myocardium, aiming to relate collagen VI to myocardial function. Our results demonstrate a deterioration in cardiac function, as represented by the ejection fraction, in the knockout healthy and diseased rats. This points to a potential protective role that collagen VI plays following a myocardial insult. Current work is dedicated to the microscopic characterisation of the fibrotic process in all rat groups, with the results to follow.
Keywords: heart failure, myocardial fibrosis, collagen, echocardiogram, confocal microscopy
Procedia PDF Downloads 82
10607 Screening for Non-hallucinogenic Neuroplastogens as Drug Candidates for the Treatment of Anxiety, Depression, and Posttraumatic Stress Disorder
Authors: Jillian M. Hagel, Joseph E. Tucker, Peter J. Facchini
Abstract:
With the aim of establishing a holistic approach for the treatment of central nervous system (CNS) disorders, we are pursuing a drug development program rapidly progressing through discovery and characterization phases. The drug candidates identified in this program are referred to as neuroplastogens owing to their ability to mediate neuroplasticity, which can be beneficial to patients suffering from anxiety, depression, or posttraumatic stress disorder. These and other related neuropsychiatric conditions are associated with the onset of neuronal atrophy, which is defined as a reduction in the number and/or productivity of neurons. The stimulation of neuroplasticity results in an increase in the connectivity between neurons and promotes the restoration of healthy brain function. We have synthesized a substantial catalogue of proprietary indolethylamine derivatives based on the general structures of serotonin (5-hydroxytryptamine) and psychedelic molecules such as N,N-dimethyltryptamine (DMT) and psilocin (4-hydroxy-DMT) that function as neuroplastogens. A primary objective in our screening protocol is the identification of derivatives associated with a significant reduction in hallucination, which will allow administration of the drug at a dose that induces neuroplasticity and triggers other efficacious outcomes in the treatment of targeted CNS disorders but which does not cause a psychedelic response in the patient. Both neuroplasticity and hallucination are associated with engagement of the 5-HT2A receptor, requiring drug candidates differentially coupled to these two outcomes at a molecular level. We use novel and proprietary artificial intelligence algorithms to predict the mode of binding to the 5-HT2A receptor, which has been shown to correlate with the hallucinogenic response. Hallucination is tested using the mouse head-twitch response model, whereas mouse marble-burying and sucrose preference assays are used to evaluate anxiolytic and anti-depressive potential. Neuroplasticity is assayed using dendritic outgrowth assays and cell-based ELISA analysis. Pharmacokinetics and additional receptor-binding analyses also contribute to the selection of lead candidates. A summary of the program is presented.
Keywords: neuroplastogen, non-hallucinogenic, drug development, anxiety, depression, PTSD, indolethylamine derivatives, psychedelic-inspired, 5-HT2A receptor, computational chemistry, head-twitch response behavioural model, neurite outgrowth assay
Procedia PDF Downloads 138
10606 Hepatological Alterations in Market Gardeners Occupationally Exposed to Pesticides in the Western Highlands of Cameroon
Authors: M. G. Tanga, P. B. Telefo, D. N. Tarla
Abstract:
Even though the WHO, the EPA, and other regulatory bodies have recognized the effects of acute pesticide poisoning, little data exist on health effects after long-term low-dose exposures, especially in Africa and Cameroon. The aim of this study was to evaluate the impact of pesticides on the hepatic function of market gardeners in the Western Region of Cameroon by studying some biochemical parameters. Sixty-six male market gardeners in Foumbot, Massangam, and Bantoum were interviewed on their health status, habits, and pesticide use in agriculture, including the spray frequency, application method, and pesticide dosage. Thirty men with no history of pesticide exposure were recruited as a control group. Thereafter, their blood samples were collected for assessment of hepatic function biomarkers (ALT, AST, and albumin). The results showed that 56 pesticides containing 25 active ingredients were currently used by the market gardeners enrolled in our study, and most of their symptoms (headache, fatigue, skin rashes, eye irritation, and nausea) were related to the use of these chemicals. Compared to the control subjects, market gardeners' ALT levels (32.9 ± 7.19 U L-1 vs. 82.11 ± 35.40 U L-1; P < 0.001) and AST levels (40.63 ± 6.52 U L-1 vs. 112.11 ± 47.15 U L-1; P < 0.001) were significantly increased. These results suggest that liver function tests can be used as biomarkers to indicate toxicity before overt clinical signs occur. The market gardeners' chronic exposure to pesticides due to poor application practices could lead to hepatic function impairment. Further research on a larger scale is needed to confirm these findings and to establish a mechanism of toxicity.
Keywords: biomarkers, liver, pesticides, occupational exposure
Procedia PDF Downloads 320
10605 Degree of Approximation of Functions Conjugate to Periodic Functions Belonging to Lipschitz Classes by Product Matrix Means
Authors: Smita Sonker
Abstract:
Various investigators have determined the degree of approximation of conjugate signals (functions) of functions belonging to the different classes Lipα, Lip(α,p), Lip(ξ(t),p), and W(L^r, ξ(t)) (β ≥ 0) by matrix summability means, lower triangular matrix operators, and product means (i.e. (C,1)(E,1), (C,1)(E,q), (E,q)(C,1), (N,p,q)(E,1), and (E,q)(N,p_n)) of their conjugate trigonometric Fourier series. In this paper, we determine the degree of approximation of functions conjugate to 2π-periodic functions f belonging to the classes Lipα and W(L^r, ξ(t)) (β ≥ 0) by (C1.T)-means of their conjugate trigonometric Fourier series. On the other hand, we review the above-mentioned work in the light of Lenski.
Keywords: signals, trigonometric Fourier approximation, class W(L^r, ξ(t)), conjugate Fourier series
Procedia PDF Downloads 397
10604 The Study and the Use of the Bifunctional Catalyst Pt/Re for Obtaining High Octane Number of the Gasoline
Authors: Menouar Hanafi
Abstract:
The original function of the platforming process is to upgrade heavy naphtha (HSRN) coming from the atmospheric distillation unit, which has a low octane number (NO = 44), in order to obtain a fuel mixture with a raised octane number by catalytically promoting specific groups of chemical reactions. The installation is divided into two sections: the hydrobon section and the platforming section. The raffinate coming from the bottom of column 12C2 feeds the platforming section; it is divided into two streams whose flows are controlled and mixed with hydrogen-rich gas. At the bottom of the column, we obtain stabilized reformate, part of which is recirculated by the pump to ensure the heating of the column, whereas the rest is sent to storage after being cooled by the air cooler and the condenser. In the catalytic reforming catalyst, a hydrogenating-dehydrogenating function, provided by deposited platinum, is deliberately combined with an acid function provided by the alumina support (Al2O3). The mechanism of action of this bifunctional catalyst depends on the severity of the operation, the quality of the feed, and the type of catalyst. The catalyst used in the catalytic reforming process is a highly elaborate bifunctional catalyst whose performance is constantly improved thanks to experimental research supported by an increasingly thorough understanding of the phenomena. The American company Universal Oil Products (UOP) has marketed several series of bimetallic catalysts, such as R16, R20, R30, and R62, consisting of platinum/rhenium on an acidic support of alumina doped with a halogen compound (chlorine).
Keywords: platforming, amelioration, octane number, catalyst
Procedia PDF Downloads 386
10603 Robust Variable Selection Based on Schwarz Information Criterion for Linear Regression Models
Authors: Shokrya Saleh A. Alshqaq, Abdullah Ali H. Ahmadini
Abstract:
The Schwarz information criterion (SIC) is a popular tool for selecting the best variables in regression datasets. However, SIC is defined using an unbounded estimator, namely the least-squares (LS) estimator, which is highly sensitive to outlying observations, especially bad leverage points. A method for robust variable selection based on SIC for linear regression models is thus needed. This study investigates the robustness properties of SIC by deriving its influence function and proposes a robust SIC based on the MM-estimation scale. The aim of this study is to produce a criterion that can effectively select accurate models in the presence of vertical outliers and high leverage points. The advantages of the proposed robust SIC are demonstrated through a simulation study and an analysis of a real dataset.
Keywords: influence function, robust variable selection, robust regression, Schwarz information criterion
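A minimal sketch of the idea, assuming a generic robust fit and a robust residual scale: the usual SIC scale estimate from least squares is replaced by a robust scale so that outliers do not drive model selection. The MAD-based scale and the HuberRegressor fit below are stand-ins for the MM-estimation scale used in the paper.

```python
import numpy as np
from sklearn.linear_model import HuberRegressor

def robust_sic(X, y):
    """SIC with a robust residual scale: n*log(sigma_robust^2) + k*log(n)."""
    n, k = X.shape
    model = HuberRegressor().fit(X, y)           # robust fit (stand-in for MM-estimation)
    resid = y - model.predict(X)
    sigma = 1.4826 * np.median(np.abs(resid - np.median(resid)))  # MAD scale
    return n * np.log(sigma ** 2) + (k + 1) * np.log(n)

# Compare candidate variable subsets on data contaminated with bad leverage points
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.5, size=100)
X[:5] += 10
y[:5] -= 30                                       # a few outlying, high-leverage points
subsets = {"x1": [0], "x1,x2": [0, 1], "x1,x2,x3": [0, 1, 2]}
for name, cols in subsets.items():
    print(name, round(robust_sic(X[:, cols], y), 2))   # smallest value wins
```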
Procedia PDF Downloads 140
10602 The Incidence of Metabolic Syndrome in Women with Impaired Reproductive Function According to Astana, Kazakhstan
Authors: A. T. Nakysh, A. S. Idrisov, S. A. Baidurin
Abstract:
This work presents the results of a study of the incidence of metabolic syndrome (MetS) in women with impaired reproductive function (IRF) according to data from Astana, Kazakhstan. Anthropometric, biochemical, and instrumental studies were conducted among 515 women, of whom 53 patients with MetS according to the IDF criteria (2006) were selected. The frequency of IRF attributable to MetS was 10.3% of cases according to the Astana data. In women of childbearing age with IRF and MetS, blood pressure (BP) and indicators of carbohydrate and lipid metabolism were significantly higher, and the level of high-density lipoprotein (HDL) significantly lower, compared to women with IRF without MetS. Hyperandrogenism, hyperestrogenemia, hyperprolactinemia, and hypoprogesteronemia were found in the patients with MetS and IRF, indicating the impact of MetS on the development of polycystic ovary syndrome in 28% of cases and hyperplastic processes of the myometrium in 20% of cases.
Keywords: dyslipidemia, insulin resistance, metabolic syndrome, reproductive disorders, obesity
Procedia PDF Downloads 323
10601 Does Mirror Therapy Improve Motor Recovery After Stroke? A Meta-Analysis of Randomized Controlled Trials
Authors: Hassan Abo Salem, Guo Feng, Xiaolin Huang
Abstract:
The objective of this study is to determine the effectiveness of mirror therapy on motor recovery and functional abilities after stroke. The following databases were searched from inception to May 2014: Cochrane Stroke, Cochrane Central Register of Controlled Trials, MEDLINE, EMBASE, CINAHL, AMED, PsycINFO, and PEDro. Two reviewers independently screened and selected all randomized controlled trials that evaluated the effect of mirror therapy in stroke rehabilitation. Twelve randomized controlled trials met the inclusion criteria; 10 studies examined the effect of mirror therapy on the upper limb and 2 studies on the lower limb. Mirror therapy had a positive effect on motor recovery and function; however, we found no consistent influence on activities of daily living, spasticity, and balance. This meta-analysis suggests that mirror therapy has an additional effect on motor recovery but only a small positive effect on functional abilities after stroke. Further high-quality studies with greater statistical power are required in order to accurately determine the effectiveness of mirror therapy following stroke.
Keywords: mirror therapy, motor recovery, stroke, balance
Procedia PDF Downloads 552
10600 Understanding the Interplay between Consumer Knowledge, Trust and Relationship Satisfaction in Financial Services
Authors: Torben Hansen, Lars Gronholdt, Alexander Josiassen, Anne Martensen
Abstract:
Consumers often exhibit a bias in their knowledge; they often think that they know more or less than they do. The concept of 'knowledge over/underconfidence' (O/U) has in previous studies been used to investigate such knowledge bias. O/U appears as a combination of subjective and objective knowledge. Subjective knowledge relates to consumers' perception of their knowledge, while objective knowledge relates to consumers' absolute knowledge measured by objective standards. This separation leads to three scenarios: the consumer can either be knowledge calibrated (subjective and objective knowledge are similar), overconfident (subjective knowledge exceeds objective knowledge) or underconfident (objective knowledge exceeds subjective knowledge). Knowledge O/U is a highly useful concept in understanding consumer choice behavior. For example, knowledge overconfident individuals are likely to exaggerate their ability to make right choices, are more likely to opt out of necessary information search, spend less time carrying out a specific task than less knowledge confident consumers, and are more likely to show high financial trading volumes. Through the use of financial services as a case study, this study contributes to previous research by examining how consumer knowledge O/U affects two types of trust (broad-scope trust and narrow-scope trust) and consumer relationship satisfaction. Trust does not only concern consumer trust in individual companies (i.e., narrow-scope trust, NST), but also concerns consumer confidence in the broader business context in which consumers plan and implement their behavior (i.e., broad-scope trust, BST). NST is defined as 'the expectation that the service provider can be relied on to deliver on its promises', while BST is defined as 'the expectation that companies within a particular business type can generally be relied on to deliver on their promises.' This study expands our understanding of the interplay between consumer knowledge bias, consumer trust, and relationship marketing in two main ways. First, it is demonstrated that the more knowledge over/underconfident a consumer becomes, the higher/lower NST and relationship satisfaction will be. Second, it is demonstrated that BST has a negative moderating effect on the relationship between knowledge O/U and satisfaction, such that knowledge O/U has a stronger positive/negative effect on relationship satisfaction when BST is low vs. high. The data for this study comprise 756 mutual fund investors. Trust is particularly important in consumers' mutual fund behavior because mutual funds have important responsibilities in providing financial advice and in managing consumers' funds.
Keywords: knowledge, cognitive bias, trust, customer-seller relationships, financial services
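The moderation effect described above is typically tested with an interaction term; the sketch below is a generic illustration in Python using statsmodels on simulated data, where the variable names (ou, bst, satisfaction) are placeholders rather than the study's actual measures or results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data: knowledge over/underconfidence (ou), broad-scope trust (bst)
rng = np.random.default_rng(42)
n = 756
df = pd.DataFrame({"ou": rng.normal(size=n), "bst": rng.normal(size=n)})
# Built-in negative interaction: the effect of ou weakens as bst rises
df["satisfaction"] = (0.4 * df["ou"] + 0.3 * df["bst"]
                      - 0.2 * df["ou"] * df["bst"]
                      + rng.normal(scale=0.5, size=n))

model = smf.ols("satisfaction ~ ou * bst", data=df).fit()
print(model.summary().tables[1])   # the ou:bst coefficient captures the moderation
```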
Procedia PDF Downloads 301
10599 Scalable Learning of Tree-Based Models on Sparsely Representable Data
Authors: Fares Hedayatit, Arnauld Joly, Panagiotis Papadimitriou
Abstract:
Many machine learning tasks, such as text annotation, usually require training over very big datasets, e.g., millions of web documents, that can be represented in a sparse input space. State-of-the-art tree-based ensemble algorithms cannot scale to such datasets, since they include operations whose running time is a function of the input space size rather than a function of the non-zero input elements. In this paper, we propose an efficient splitting algorithm to leverage input sparsity within decision tree methods. Our algorithm improves training time over sparse datasets by more than two orders of magnitude, and it has been incorporated in the current version of scikit-learn.org, the most popular open source Python machine learning library.
Keywords: big data, sparsely representable data, tree-based models, scalable learning
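As a usage-level illustration (not the splitting algorithm itself), scikit-learn's tree ensembles accept sparse CSR input directly, so a high-dimensional text-style dataset never has to be densified. The snippet below is a minimal sketch with toy data.

```python
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.ensemble import RandomForestClassifier

# Toy sparse dataset: 10,000 samples, 50,000 features, ~0.1% non-zeros (CSR format)
rng = np.random.default_rng(0)
X = sparse_random(10_000, 50_000, density=0.001, format="csr", random_state=0)
y = rng.integers(0, 2, size=10_000)

# The forest trains on the sparse matrix directly; no dense conversion is needed
clf = RandomForestClassifier(n_estimators=50, n_jobs=-1, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```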
Procedia PDF Downloads 263
10598 A Column Generation Based Algorithm for Airline Cabin Crew Rostering Problem
Authors: Nan Xu
Abstract:
In airlines, the crew scheduling problem is usually decomposed into two stages: crew pairing and crew rostering. In the crew pairing stage, pairings are generated such that each flight is covered by exactly one pairing and the overall cost is minimized. In the crew rostering stage, the pairings generated in the crew pairing stage are combined with off days, training, and other breaks to create individual work schedules. The paper focuses on the cabin crew rostering problem, which is challenging due to its extremely large size and the complex working rules involved. In our approach, the objective of rostering consists of two major components. The first is to minimize the number of unassigned pairings and the second is to ensure fairness to crew members. There are two measures of fairness to crew members: the number of overnight duties and the total fly-hours over a given period. Pairings should be assigned to each crew member so that their actual overnight duties and fly hours are as close to the expected average as possible. Deviations from the expected average are penalized in the objective function. Since several small deviations are preferred to one large deviation, the penalization is quadratic. Our model of the airline crew rostering problem is based on column generation. The problem is decomposed into a master problem and subproblems. The master problem is modeled as a set partitioning problem, and exactly one roster is picked for each crew member such that the pairings are covered. The restricted linear master problem (RLMP) is considered. The current subproblem tries to find columns with negative reduced costs and add them to the RLMP for the next iteration. When no column with negative reduced cost can be found or a stopping criterion is met, the procedure ends. The subproblem is to generate feasible crew rosters for each crew member. A separate acyclic weighted graph is constructed for each crew member, and the subproblem is modeled as a resource constrained shortest path problem in the graph. A labeling algorithm is used to solve it. Since the penalization is quadratic, a method to deal with the non-additive shortest path problem using a labeling algorithm is proposed, and the corresponding dominance condition is defined. The major contributions of our model are: 1) we propose a method to deal with the non-additive shortest path problem; 2) relaxation of some soft rules is allowed in our algorithm, which can improve the coverage rate; 3) multi-thread techniques are used to improve the efficiency of the algorithm when generating Lines-of-Work for crew members. In summary, a column generation based algorithm for the airline cabin crew rostering problem is proposed; the objective is to assign a personalized roster to each crew member that minimizes the number of unassigned pairings and ensures fairness to crew members. The algorithm we propose in this paper has been put into production in a major airline in China, and numerical experiments show that it performs well.
Keywords: aircrew rostering, aircrew scheduling, column generation, SPPRC
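To make the pricing step concrete, the sketch below computes the quadratic fairness penalty for a candidate roster and its reduced cost against the duals of the RLMP coverage constraints; the data structures, weights, and expected-average values are illustrative assumptions, not the paper's implementation.

```python
def fairness_penalty(overnights, fly_hours, avg_overnights, avg_fly_hours,
                     w_overnight=1.0, w_fly=0.1):
    """Quadratic penalty on deviations from the expected averages."""
    return (w_overnight * (overnights - avg_overnights) ** 2
            + w_fly * (fly_hours - avg_fly_hours) ** 2)

def reduced_cost(roster_pairings, roster_cost, pairing_duals, crew_dual):
    """Reduced cost of a candidate roster column in the column generation scheme.

    roster_cost already includes the quadratic fairness penalty, which is why the
    pricing subproblem becomes a *non-additive* resource constrained shortest path.
    """
    return roster_cost - sum(pairing_duals[p] for p in roster_pairings) - crew_dual

# Toy example: duals taken from the restricted linear master problem (assumed values)
pairing_duals = {"P1": 120.0, "P2": 80.0, "P3": 150.0}
crew_dual = 40.0
cost = 10.0 + fairness_penalty(overnights=4, fly_hours=85,
                               avg_overnights=3, avg_fly_hours=80)
rc = reduced_cost(["P1", "P3"], cost, pairing_duals, crew_dual)
print("reduced cost:", rc)   # negative => the column is worth adding to the RLMP
```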
Procedia PDF Downloads 146
10597 Evidence on Scale Economies in National Bank of Pakistan
Authors: Sohail Zafar, Sardar Javaid Iqbal Khan
Abstract:
We use a parametric approach within a translog cost function framework to estimate the economies of scale in National Bank of Pakistan from 1997 to 2013. The results indicate significant economies of scale throughout the sample, at both aggregate and disaggregate levels, taking into account size and ownership. Factor markets often produce scale inefficiencies in the banking sector of developing countries like Pakistan; such inefficiencies are common due to distortions in factor markets leading to the use of inappropriate factor proportions. The findings suggest that National Bank of Pakistan should diversify its asset portfolio where it has a cost advantage; therefore, expansion in size should be encouraged under the current technology because it appears to be cost effective. In addition, our findings support the implementation of a universal banking model in Pakistan.
Keywords: scale economies, cost function, disaggregates, aggregates
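For readers unfamiliar with the measure, overall scale economies in a translog cost framework are usually read off the output cost elasticity (values below one indicating increasing returns to scale). The sketch below fits a simplified single-output, quadratic-in-logs cost function by least squares on simulated data and computes that elasticity; the variable names and simulated data are illustrative, not the study's dataset or full translog specification.

```python
import numpy as np

# Simulated bank observations: log output, log input price, log total cost (assumed DGP)
rng = np.random.default_rng(7)
n = 200
ln_y = rng.normal(4.0, 0.5, n)
ln_w = rng.normal(1.0, 0.2, n)
ln_C = 0.5 + 0.8 * ln_y + 0.02 * ln_y ** 2 + 0.6 * ln_w + rng.normal(0, 0.05, n)

# Regressors of a reduced translog-style cost function: ln y, (ln y)^2 / 2, ln w
X = np.column_stack([np.ones(n), ln_y, 0.5 * ln_y ** 2, ln_w])
beta, *_ = np.linalg.lstsq(X, ln_C, rcond=None)

# Cost elasticity w.r.t. output at the sample mean: d lnC / d lnY = b1 + b2 * mean(ln y)
elasticity = beta[1] + beta[2] * ln_y.mean()
scale_economies = 1.0 / elasticity        # > 1 indicates economies of scale
print(f"output cost elasticity = {elasticity:.3f}, scale economies = {scale_economies:.3f}")
```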
Procedia PDF Downloads 326
10596 Empirical Mode Decomposition Based Denoising by Customized Thresholding
Authors: Wahiba Mohguen, Raïs El’hadi Bekka
Abstract:
This paper presents a denoising method called EMD-Custom that is based on Empirical Mode Decomposition (EMD) and a modified customized thresholding function (Custom). EMD is applied to adaptively decompose a noisy signal into intrinsic mode functions (IMFs). Then, all the noisy IMFs are thresholded by applying the presented thresholding function to suppress noise and to improve the signal-to-noise ratio (SNR). The method was tested on simulated data and a real ECG signal, and the results were compared to EMD-based signal denoising methods using soft and hard thresholding. The results showed the superior performance of the proposed EMD-Custom denoising over the traditional approaches. The performances were evaluated in terms of SNR in dB and Mean Square Error (MSE).
Keywords: customized thresholding, ECG signal, EMD, hard thresholding, soft thresholding
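The sketch below shows the general shape of such a pipeline: given the IMFs of a noisy signal (obtained, for instance, with a library such as PyEMD), each IMF is thresholded and the signal is rebuilt from the processed IMFs. The soft-thresholding rule and the noise-level estimate used here are generic stand-ins, not the paper's customized thresholding function.

```python
import numpy as np

def soft_threshold(x, t):
    """Generic soft-thresholding rule (stand-in for the customized function)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def denoise_from_imfs(imfs, keep_last=1):
    """Threshold each IMF and reconstruct; the last IMFs (trend) are left untouched."""
    out = []
    for i, imf in enumerate(imfs):
        if i >= len(imfs) - keep_last:
            out.append(imf)                          # low-frequency trend, no thresholding
            continue
        sigma = np.median(np.abs(imf)) / 0.6745      # robust noise-level estimate
        t = sigma * np.sqrt(2 * np.log(len(imf)))    # universal threshold
        out.append(soft_threshold(imf, t))
    return np.sum(out, axis=0)

# Toy usage with synthetic "IMFs"; in practice they would come from an EMD routine
n = 1000
t_axis = np.linspace(0, 1, n)
imfs = np.stack([0.2 * np.random.randn(n),           # noise-dominated IMF
                 np.sin(2 * np.pi * 25 * t_axis),    # oscillatory component
                 0.5 * t_axis])                      # slow trend
print("denoised signal shape:", denoise_from_imfs(imfs).shape)
```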
Procedia PDF Downloads 302
10595 Assessing Walkability in New Cities around Cairo
Authors: Lobna Ahmed Galal
Abstract:
Modal integration has been given minimal consideration in cities of developing countries, alongside the declining dominance of public transport and the predominance of informal transport; the modal share of informal taxis in Greater Cairo increased from 6% in 1987 to 37% in 2001 and has since risen even higher, with informal and non-motorized modes of transport acting as gap fillers by feeding other modes of transport, not by design or choice, but often for lack of accessible and affordable public transport. Yet non-motorized transport remains peripheral, with minimal priority in urban planning and investments and a lack of strong policies to support it; for the authorities, development is associated with technology and motorized transport, so promoting non-motorized transport may not be seen as corresponding to development, and there is also a social stigma against non-motorized transport, as it is seen as a travel mode for the poor. Cairo, as a city of a developing country, has poor-quality infrastructure for non-motorized transport: dedicated corridors are absent, and when they exist they are often encroached upon for commercial purposes; traffic lanes are widened at the expense of sidewalks; footpaths are absent or overcrowded; and lighting is poor, making walking unsafe. There is also a lack of financial provision for such facilities, as they are often considered beyond the city's capabilities. This paper deals with the objective measurement of the built environment in relation to walking in some neighborhoods of new cities around Cairo, in addition to comparing the results of the objective measures of the built environment with the results of a self-reported survey. The paper's first objective is to show how the index 'walkability of community neighborhoods' works in the context of neighborhoods of new cities around Cairo. The procedure of objective measurement has a high potential to be carried out using GIS.
Keywords: assessing, built environment, Cairo, walkability
Procedia PDF Downloads 383
10594 Association of Airborne Emissions with Pulmonary Dysfunction, XRCC1 Gene Polymorphism, and Some Inflammatory Markers in Aluminum Workers
Authors: Gehan Moubarz, Atef M. F. Mohammed, Inas A. Saleh, Heba Mahdy-Abdallah, Amal Saad-Hussein
Abstract:
This study estimates the association between respiratory outcomes among employees of a secondary aluminum plant and airborne pollutants. Additionally, it looks into the relationship between pulmonary dysfunction in workers and XRCC1 gene polymorphisms. 110 exposed workers and 58 non-exposed workers participated in the study. Measurements were conducted on SO₂, NO₂, and particulate matter. Pulmonary function was tested. Eosinophil cationic protein (ECP), C-reactive protein (CRP), matrix metalloproteinase-1 (MMP-1), interleukin 6 (IL6), GM-CSF, X-Ray Repair Cross Complementing 1 (XRCC1) protein, and genotyping of XRCC1 gene polymorphisms were examined. Results: The annual average concentrations of PM₂.₅, PM₁₀, TSP, SO₂, and NO₂ were lower than the permissible limits. The areas around the ovens, evaporators, and cold rolling mills exhibited the highest concentrations. The majority of employees in these departments had impaired lung function. With longer exposure times, the exposed group's FEV1% and FVC% decreased considerably. The exposed workers had considerably higher XRCC1 levels. The evaluated inflammatory biomarkers showed no statistically significant difference. Conclusion: Aluminum workers are at risk of developing respiratory disorders. The level of serum XRCC1 may act as a biomarker that could be very useful for detecting susceptible workers.
Keywords: aluminum industry, particulate matter, SO₂, NO₂, lung function, XRCC1 gene polymorphism, XRCC1 protein, inflammatory biomarkers
Procedia PDF Downloads 11
10593 Finding Optimal Operation Condition in a Biological Nutrient Removal Process with Balancing Effluent Quality, Economic Cost and GHG Emissions
Authors: Seungchul Lee, Minjeong Kim, Iman Janghorban Esfahani, Jeong Tai Kim, ChangKyoo Yoo
Abstract:
It is hard to maintain the effluent quality of wastewater treatment plants (WWTPs) under fixed types of operational control because of continuously changing influent flow rates and pollutant loads. The aim of this study is the development of a multi-loop multi-objective control (ML-MOC) strategy at plant-wide scope targeting four objectives: 1) maximization of nutrient removal efficiency, 2) minimization of operational cost, 3) maximization of CH4 production in anaerobic digestion (AD) for CH4 reuse as a heat and energy source, and 4) minimization of N2O gas emission to cope with global warming. First, the benchmark simulation model is modified to describe N2O dynamics in the biological process, namely the benchmark simulation model for greenhouse gases (BSM2G). Then, three single-loop proportional-integral (PI) controllers, for DO, NO3, and CH4 control, are implemented. The optimal set-points of the controllers are found by using a multi-objective genetic algorithm (MOGA). Finally, the ML-MOC is implemented and evaluated in BSM2G. Compared with the reference case, the ML-MOC with the optimal set-points showed the best control performance, with improvements of 34%, 5%, and 79% in effluent quality, CH4 productivity, and N2O emission, respectively, together with a 65% decrease in operational cost.
Keywords: benchmark simulation model for greenhouse gas, multi-loop multi-objective controller, multi-objective genetic algorithm, wastewater treatment plant
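A single loop of such a strategy boils down to a discrete PI controller tracking a set-point chosen by the optimizer; the minimal sketch below shows that building block in Python. The gains, sampling time, DO set-point, and the crude first-order oxygen dynamics are chosen arbitrarily for illustration rather than taken from the BSM2G study.

```python
class PIController:
    """Discrete PI controller: u(t) = Kp*e(t) + Ki * integral of e."""
    def __init__(self, kp, ki, dt, u_min=0.0, u_max=10.0):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        u = self.kp * error + self.ki * self.integral
        return min(max(u, self.u_min), self.u_max)   # clamp to actuator limits

# Toy usage: drive dissolved oxygen (DO) toward a set-point of 2.0 g/m^3 using the
# aeration coefficient as the manipulated variable (illustrative dynamics only).
do_controller = PIController(kp=25.0, ki=5.0, dt=1.0 / 96.0)   # 15-minute sampling
do = 0.5
for _ in range(200):
    kla = do_controller.step(setpoint=2.0, measurement=do)
    do += (kla * (8.0 - do) - 30.0 * do / (do + 0.2)) * do_controller.dt
print(f"DO after simulation: {do:.2f} g/m^3")
```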
Procedia PDF Downloads 503
10592 Comparison of Multivariate Adaptive Regression Splines and Random Forest Regression in Predicting Forced Expiratory Volume in One Second
Authors: P. V. Pramila , V. Mahesh
Abstract:
Pulmonary function tests are important non-invasive diagnostic tests to assess respiratory impairments and provide quantifiable measures of lung function. Spirometry is the most frequently used measure of lung function and plays an essential role in the diagnosis and management of pulmonary diseases. However, the test requires considerable patient effort and cooperation, markedly related to the age of patients, resulting in incomplete data sets. This paper presents a nonlinear model built using multivariate adaptive regression splines (MARS) and a random forest regression model to predict the missing spirometric features. Random forest based feature selection is used to enhance both the generalization capability and the model interpretability. In the present study, flow-volume data are recorded for N = 198 subjects. The ranked order of feature importance calculated by the random forest model shows that the spirometric features FVC, FEF25, PEF, FEF25-75, FEF50, and the demographic parameter height are the important descriptors. A comparison of the performance assessment of both models proves that the prediction ability of MARS with the top two ranked features, namely FVC and FEF25, is higher, yielding a model fit of R2 = 0.96 and R2 = 0.99 for normal and abnormal subjects. The root mean square error analysis of the RF model and the MARS model also shows that the latter is capable of predicting the missing values of FEV1 with a notably lower error value of 0.0191 (normal subjects) and 0.0106 (abnormal subjects). It is concluded that combining feature selection with a prediction model provides a minimum subset of predominant features to train the model, yielding better prediction performance. This analysis can assist clinicians through an intelligent decision support system in medical diagnosis and the improvement of clinical care.
Keywords: FEV, multivariate adaptive regression splines, pulmonary function test, random forest
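The random-forest side of this workflow (feature ranking followed by prediction of FEV1 from the top-ranked spirometric features) can be sketched as below. The data are simulated and the feature names are only placeholders for the measures mentioned above; a MARS model (e.g., from the py-earth package) could be swapped in for the final regressor.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Simulated spirometric data (stand-in for the 198-subject dataset)
rng = np.random.default_rng(0)
n = 198
features = ["FVC", "FEF25", "PEF", "FEF25_75", "FEF50", "height"]
X = rng.normal(size=(n, len(features)))
fev1 = 0.8 * X[:, 0] + 0.5 * X[:, 1] + 0.1 * X[:, 5] + rng.normal(scale=0.1, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, fev1, test_size=0.3, random_state=0)

# Step 1: rank features by random forest importance
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
ranking = sorted(zip(features, rf.feature_importances_), key=lambda t: -t[1])
print("feature ranking:", ranking)

# Step 2: refit a predictor of FEV1 on the top two ranked features only
top2 = [features.index(name) for name, _ in ranking[:2]]
rf_top2 = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr[:, top2], y_tr)
rmse = np.sqrt(np.mean((rf_top2.predict(X_te[:, top2]) - y_te) ** 2))
print("RMSE with top-2 features:", round(rmse, 4))
```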
Procedia PDF Downloads 310
10591 Audit Is a Production Performance Tool
Authors: Lattari Samir
Abstract:
The performance of a production process is the result of proper operation, where management tools appear as the key to success through process management, which consists of managing and implementing a quality policy, organizing and planning the manufacturing, and thus defining an efficient logic for the main areas covered by production management. To carry out this delicate mission, which requires reconciling often contradictory objectives, the auditor is called upon and must be able to express an opinion on the effectiveness of the operation of the 'production' function. To do this, the auditor must structure his mission in three phases: the preparation phase, to assimilate the particularities of this function; the implementation phase; and the conclusion phase. The audit is a systematic and independent examination of all the stages of a manufacturing process intended to determine whether the pre-established arrangements for the combination of production factors are respected, whether their implementation is effective, and whether they are relevant in relation to the goals.
Keywords: audit, performance of process, independent examination, management tools, audit of accounts
Procedia PDF Downloads 75
10590 The Impacts of Cost Stickiness on the Profitability of Indonesian Firms
Authors: Dezie L. Warganegara, Dewi Tamara
Abstract:
The objectives of this study are to investigate the existence of sticky cost behaviour among firms listed on the Indonesia Stock Exchange (IDX) and to find evidence on the effects of sticky operating expenses (SG&A expenses) on the profitability of firms. For the first objective, this study finds that sticky cost behaviour does exist. For the second objective, this study finds that the stickier the operating expenses, the lower the future profitability of the firms. This study concludes that sticky costs negatively affect performance and, therefore, firms should include flexibility in designing their cost structures.
Keywords: sticky costs, Indonesia Stock Exchange (IDX), profitability, operating expenses, SG&A
Procedia PDF Downloads 317
10589 Strategic Asset Allocation Optimization: Enhancing Portfolio Performance Through PCA-Driven Multi-Objective Modeling
Authors: Ghita Benayad
Abstract:
Asset allocation, which affects the long-term profitability of portfolios by distributing assets to fulfill a range of investment objectives, is the cornerstone of investment management in the dynamic and complicated world of financial markets. This paper offers a technique for optimizing strategic asset allocation with the goal of improving portfolio performance, addressing the inherent complexity and uncertainty of the market through the use of Principal Component Analysis (PCA) in a multi-objective modeling framework. The study's first section starts with a critical evaluation of conventional asset allocation techniques, highlighting how poorly they capture the intricate relationships between assets and the volatile nature of the market. In order to overcome these challenges, the project proposes a PCA-driven methodology that isolates important characteristics influencing asset returns by reducing the dimensionality of the investment universe. This reduction provides a stronger basis for asset allocation decisions by facilitating a clearer understanding of market structures and behaviors. Using a multi-objective optimization model, the project builds on this foundation by taking into account several performance metrics at once, including risk minimization, return maximization, and the accomplishment of predetermined investment goals such as regulatory compliance or sustainability standards. This model provides a more comprehensive understanding of investor preferences and portfolio performance in comparison to conventional single-objective optimization techniques. The PCA-driven multi-objective optimization model is then applied to historical market data, aiming to construct portfolios that perform better under different market situations. Compared to portfolios produced by conventional asset allocation methodologies, the results show that portfolios optimized using the proposed method display improved risk-adjusted returns, more resilience to market downturns, and better alignment with specified investment objectives. The study also looks at the implications of this PCA technique for portfolio management, including the prospect that it might give investors a more advanced framework for navigating financial markets. The findings suggest that by combining PCA with multi-objective optimization, investors may obtain a more strategic and informed asset allocation that is responsive to both market conditions and individual investment preferences. In conclusion, this capstone project contributes to the field of financial engineering by creating a sophisticated asset allocation optimization model that integrates PCA with multi-objective optimization. In addition to raising questions about the current state of asset allocation, the proposed method of portfolio management opens up new avenues for research and application in the area of investment techniques.
Keywords: asset allocation, portfolio optimization, principal component analysis, multi-objective modelling, financial market
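A stripped-down version of the pipeline described above might look like the sketch below: the return covariance is denoised by keeping its leading principal components, and a family of portfolios is generated by sweeping the weight given to return versus risk, tracing an approximate frontier. The simulated returns and the simple long-only weighting rule are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_days, n_assets = 1000, 20
returns = rng.normal(0.0005, 0.01, size=(n_days, n_assets))   # simulated daily returns
mu = returns.mean(axis=0)

# Step 1: PCA on the return covariance; keep the 3 leading components
cov = np.cov(returns, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)                  # ascending eigenvalues
top_vecs, top_vals = eigvecs[:, -3:], eigvals[-3:]
cov_pca = top_vecs @ np.diag(top_vals) @ top_vecs.T     # low-rank (denoised) covariance
cov_pca += np.diag(np.diag(cov) - np.diag(cov_pca))     # keep asset-specific variances

# Step 2: sweep the return/risk trade-off to trace a simple long-only frontier
frontier = []
for lam in np.linspace(0.0, 1.0, 11):                   # lam = weight on expected return
    score = lam * mu - (1.0 - lam) * np.diag(cov_pca)
    w = np.maximum(score - score.min(), 1e-12)
    w /= w.sum()                                         # long-only, fully invested
    frontier.append((lam, float(w @ mu), float(np.sqrt(w @ cov_pca @ w))))

for lam, ret, vol in frontier[::5]:
    print(f"lambda={lam:.1f}  expected return={ret:.5f}  volatility={vol:.5f}")
```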
Procedia PDF Downloads 47
10588 Automated Ultrasound Carotid Artery Image Segmentation Using Curvelet Threshold Decomposition
Authors: Latha Subbiah, Dhanalakshmi Samiappan
Abstract:
In this paper, we propose denoising common carotid artery (CCA) B-mode ultrasound images by a decomposition approach to curvelet thresholding, followed by automatic segmentation of the intima-media thickness and the adventitia boundary. Through the decomposition, the local geometry of the image and the direction of its gradients are well preserved. The components are combined into a single vector-valued function, which removes noise patches. A double threshold is applied to inherently remove speckle noise in the image. The denoised image is segmented by an active contour without specifying seed points. Combined with level set theory, the contours provide sub-regions with continuous boundaries. The deformable contours match the shapes and motion of objects in the images: a curve or a surface under constraints evolves over the image so that it is pulled toward the required features of the image. Region-based and boundary-based information are integrated to obtain the contour. The method handles the multiplicative speckle noise, as reflected in objective and subjective quality measurements, and thus leads to better segmented results. The proposed denoising method gives better performance metrics compared with other state-of-the-art denoising algorithms.
Keywords: curvelet, decomposition, levelset, ultrasound
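As a rough stand-in for this pipeline (curvelet transforms are not available in the mainstream Python stack), the sketch below thresholds wavelet coefficients with PyWavelets to suppress speckle-like noise and then runs scikit-image's generic active contour on the result. Note that, unlike the method described above, this generic snake needs an initial contour, and all parameters and the synthetic image are illustrative.

```python
import numpy as np
import pywt
from skimage.filters import gaussian
from skimage.segmentation import active_contour

# Synthetic noisy "ultrasound" image with a bright ring as a stand-in for a vessel wall
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:200, 0:200]
image = np.exp(-((np.hypot(yy - 100, xx - 100) - 50) ** 2) / 40.0)
noisy = image * rng.gamma(4.0, 0.25, image.shape)     # multiplicative speckle-like noise

# Wavelet thresholding as a stand-in for curvelet threshold decomposition
coeffs = pywt.wavedec2(noisy, "db4", level=3)
thr = 0.1
den_coeffs = [coeffs[0]] + [tuple(pywt.threshold(c, thr, "soft") for c in lvl)
                            for lvl in coeffs[1:]]
denoised = pywt.waverec2(den_coeffs, "db4")

# Generic active contour (snake) around the ring; initialised with a circle
s = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([100 + 70 * np.sin(s), 100 + 70 * np.cos(s)])
snake = active_contour(gaussian(denoised, 2), init, alpha=0.015, beta=10, gamma=0.001)
print("contour points:", snake.shape)
```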
Procedia PDF Downloads 340
10587 The Market Structure Simulation of Heterogenous Firms
Authors: Arunas Burinskas, Manuela Tvaronavičienė
Abstract:
Although the new trade theories, unlike the theories of industrial organisation, view the structure of the market and competition between enterprises through the firms' heterogeneity along various parameters, they do not pay particular attention to the analysis of market structure and its development. In this article, although we rely mainly on models developed by scholars of new trade theory, we propose a different approach. In our simulation model, we model market demand according to a normal distribution function, while on the supply side (as in the new trade theory models) productivity is modeled with a Pareto distribution function. The results of the simulation show that companies with higher productivity (lower marginal costs) do not pass on all the benefits of such economies to buyers. However, even with higher marginal costs, firms can choose to offer higher value-added goods to stay in the market. In general, the structure of the market forms quickly enough and depends on the skills available to firms.
Keywords: market, structure, simulation, heterogenous firms
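The basic ingredients of such a simulation can be sketched in a few lines: firm productivities are drawn from a Pareto distribution, consumer willingness to pay from a normal distribution, and firms whose marginal cost exceeds the market-clearing price exit. The crude matching rule and all parameter values below are simplifying assumptions for illustration, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(11)
n_firms, n_consumers = 500, 50_000

# Supply side: productivities are Pareto-distributed, marginal cost = 1 / productivity
productivity = rng.pareto(a=3.0, size=n_firms) + 1.0
marginal_cost = 1.0 / productivity

# Demand side: willingness to pay is normally distributed
wtp = rng.normal(loc=1.0, scale=0.3, size=n_consumers)

# Crude market clearing: raise the price until active-firm capacity covers demand
capacity_per_firm = n_consumers / n_firms
for price in np.linspace(0.01, 2.0, 400):
    active = marginal_cost < price                 # firms that can profitably produce
    demand = np.sum(wtp >= price)                  # consumers willing to buy at this price
    if active.sum() * capacity_per_firm >= demand:
        break

markup = price - marginal_cost[active]
print(f"price={price:.3f}, surviving firms={active.sum()}/{n_firms}, "
      f"mean markup={markup.mean():.3f}")
```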
Procedia PDF Downloads 149
10586 Global Analysis in a Growth Economic Model with Perfect-Substitution Technologies
Authors: Paolo Russu
Abstract:
The purpose of the present paper is to highlight some features of an economic growth model with environmental negative externalities, giving rise to a three-dimensional dynamic system. In particular, we show that the economy, which is based on a perfect-substitution technologies production function, exhibits neither indeterminacy nor a poverty trap. This implies that the equilibrium selected by the economy depends on its history (the initial values of the state variables) rather than on the expectations of economic agents. Moreover, by contrast, we prove that the basins of attraction of the locally stable equilibrium points may be very large, as they can extend up to the boundary of the system's phase space. The infinite-horizon optimal control problem has the purpose of maximizing the representative agent's instantaneous utility function, which depends on leisure and consumption.
Keywords: Hopf bifurcation, open-access natural resources, optimal control, perfect-substitution technologies, Poincaré compactification
Procedia PDF Downloads 172
10585 Chronic Hypertension, Aquaporin and Hydraulic Conductivity: A Perspective on Pathological Connections
Authors: Chirag Raval, Jimmy Toussaint, Tieuvi Nguyen, Hadi Fadaifard, George Wolberg, Steven Quarfordt, Kung-ming Jan, David S. Rumschitzki
Abstract:
Numerous studies examine aquaporins' role in osmotic water transport in various systems, but virtually none focus on aquaporins' role in hydrostatically driven water transport involving mammalian cells, save for our laboratory's recent study of aortic endothelial cells. Here we investigate aquaporin-1 expression and function in the aortic endothelium in two high-renin rat models of hypertension: the spontaneously hypertensive genomically altered Wistar-Kyoto rat variant and Sprague-Dawley rats made hypertensive by two-kidney, one-clip Goldblatt surgery. We measured aquaporin-1 expression in aortic endothelial cells from whole rat aortas by quantitative immunohistochemistry, and function by measuring the pressure-driven hydraulic conductivities of excised rat aortas with both intact and denuded endothelia on the same vessel. We use them to calculate the effective intimal hydraulic conductivity, which is a combination of endothelial and subendothelial components. We observed well-correlated enhancements in aquaporin-1 expression and function in both hypertensive rat models, as well as in aortas from normotensive rats whose expression was upregulated by 2 h forskolin treatment. Upregulated aquaporin-1 expression and function may be a response to hypertension that critically determines conduit artery vessel wall viability and long-term susceptibility to atherosclerosis.
Keywords: acute hypertension, aquaporin-1, hydraulic conductivity, hydrostatic pressure, aortic endothelial cells, transcellular flow
Procedia PDF Downloads 232
10584 A Calibration Method of Portable Coordinate Measuring Arm Using Bar Gauge with Cone Holes
Authors: Rim Chang Hyon, Song Hak Jin, Song Kwang Hyok, Jong Ki Hun
Abstract:
The calibration of the articulated arm coordinate measuring machine (AACMM) is key to improving measurement accuracy and saving calibration time. To reduce the time consumed for calibration, we should choose the proper calibration gauges and develop a reasonable calibration method. In addition, we should obtain the exact optimal solution by accurately removing the gross errors within the experimental data. In this paper, we present a calibration method for the portable coordinate measuring arm (PCMA) using a 1.2 m long bar gauge with cone holes. First, we determine the locations of the bar gauge and establish an optimal objective function for identifying the structural parameter errors. Next, we build a mathematical model of the calibration algorithm and present a new mathematical method to remove the gross errors within the calibration data. Finally, we find the optimal solution to identify the kinematic parameter errors by using the Levenberg-Marquardt algorithm. The experimental results show that our calibration method is very effective in saving calibration time and improving calibration accuracy.
Keywords: AACMM, kinematic model, parameter identification, measurement accuracy, calibration
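The final identification step described above is a nonlinear least-squares fit; the sketch below shows the generic pattern with SciPy's Levenberg-Marquardt solver on a deliberately simplified two-link planar arm, where the "kinematic parameters" are the two link lengths and the residuals are the gaps between measured and predicted probe positions. The arm model, the data, and the parameter set are all illustrative simplifications of the real AACMM problem.

```python
import numpy as np
from scipy.optimize import least_squares

def forward_kinematics(params, joint_angles):
    """Toy 2-link planar arm: probe position from link lengths and joint angles."""
    l1, l2 = params
    t1, t2 = joint_angles[:, 0], joint_angles[:, 1]
    x = l1 * np.cos(t1) + l2 * np.cos(t1 + t2)
    y = l1 * np.sin(t1) + l2 * np.sin(t1 + t2)
    return np.column_stack([x, y])

def residuals(params, joint_angles, measured_xy):
    return (forward_kinematics(params, joint_angles) - measured_xy).ravel()

# Simulated calibration data: true link lengths 0.60 m and 0.55 m, noisy measurements
rng = np.random.default_rng(5)
true_params = np.array([0.60, 0.55])
angles = rng.uniform(-np.pi / 2, np.pi / 2, size=(40, 2))
measured = forward_kinematics(true_params, angles) + rng.normal(0, 1e-4, (40, 2))

# Levenberg-Marquardt fit starting from the nominal (erroneous) parameters
fit = least_squares(residuals, x0=[0.58, 0.57], args=(angles, measured), method="lm")
print("identified link lengths:", np.round(fit.x, 4))
```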
Procedia PDF Downloads 83