Search results for: facility performance evaluation
2677 Urban Transport Demand Management Multi-Criteria Decision Using AHP and SERVQUAL Models: Case Study of Nigerian Cities
Authors: Suleiman Hassan Otuoze, Dexter Vernon Lloyd Hunt, Ian Jefferson
Abstract:
Urbanization has continued to widen the gap between demand and the resources available to provide resilient and sustainable transport services in the cities of many fast-growing developing countries. Transport demand management (TDM) is a decision-based optimization concept for benchmarking and ensuring the efficient use of transport resources. This study assesses the service quality of infrastructure and mobility services in the Nigerian cities of Kano and Lagos across the five SERVQUAL dimensions of quality (tangibility, reliability, responsiveness, assurance and empathy). The methodology applies a hybrid AHP-SERVQUAL model to questionnaire surveys in order to gauge user satisfaction and the views of experts in the field. The AHP results prioritize tangibility, which captures the state of transportation infrastructure and services in terms of satisfaction qualities and intervention decision weights in the two cities. The quality-of-performance and satisfaction ratings were 'unsatisfactory', with values of 48% and 49% for Kano and Lagos, respectively. These satisfaction indices indicate poor performance of TDM measures and the need to re-order priorities and take proactive steps towards infrastructure. The findings pilot a framework for comparative assessment against recognizable standards in transport services, best management practice and the quality infrastructure needed to guarantee resilient and sustainable urban mobility.
Keywords: transportation demand management, multi-criteria decision support, transport infrastructure, service quality, sustainable transport
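The AHP step described above derives criterion priority weights from expert pairwise comparisons. A minimal sketch using the row geometric mean approximation; the 3x3 comparison matrix below is purely illustrative (the study's actual judgments and full five-dimension matrix are not reproduced here):

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the row geometric mean method."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical pairwise comparison of three SERVQUAL dimensions
# (tangibility vs reliability vs responsiveness); values are illustrative.
matrix = [
    [1.0, 3.0, 5.0],   # tangibility judged 3x reliability, 5x responsiveness
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(matrix)
```

With these judgments the tangibility weight dominates, mirroring the prioritization reported in the abstract.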
Procedia PDF Downloads 224
2676 'Antibody Exception' under Dispute and Waning Usage: Potential Influence on Patenting Antibodies
Authors: Xiangjun Kong, Dongning Yao, Yuanjia Hu
Abstract:
Therapeutic antibodies have become the most valuable and successful class of biopharmaceutical drugs, with huge market potential and therapeutic advantages. Antibody patents are, accordingly, extremely important. Owing to the technological limitations of this field's early stage, the U.S. Patent and Trademark Office (USPTO) issued guidelines that allowed an exception for patents claiming a genus of antibodies that bind to a novel antigen, even in the absence of any experimental antibody production. This 'antibody exception' permitted a broad scope in antibody claims and led to a global trend of patenting antibodies without actually producing them. Disputes around the pertinent patentability and written-description issues remain particularly intense. Yet the validity of such patents was not overtly challenged until Centocor v. Abbott, which restricted the broad scope of antibody patents and put the brakes on the 'antibody exception'. The courts tend to uphold the requirement for an adequate description of antibodies in the patent specification, to avoid overreaching antibody claims. Patents following the 'antibody exception' are therefore at risk of being found invalid for inadequately describing what they claim. However, the relation between the courts and the USPTO guidelines remains obscure, and the waning of the 'antibody exception' has led to further disputes around antibody patents. This uncertainty clearly affects patent applications, antibody innovation, and even related business performance. This study gives an overview of the emergence, debate, and waning usage of the 'antibody exception' through a number of enlightening cases, attempting to understand the specific concerns and the potential influence on antibody patents. We then provide some possible strategies for antibody patenting under the current thinking on the 'antibody exception'.
Keywords: antibody exception, antibody patent, USPTO (U.S. Patent and Trademark Office) guidelines, written description requirement
Procedia PDF Downloads 159
2675 Revitalization of Sign Language through Deaf Theatre: A Linguistic Analysis of an Art Form Which Combines Physical Theatre, Poetry, and Sign Language
Authors: Gal Belsitzman, Rose Stamp, Atay Citron, Wendy Sandler
Abstract:
Sign languages are considered endangered. Their vitality is compromised by a unique sociolinguistic situation, in which hearing parents of deaf children usually opt for cochlear implantation; as a result, these children do not acquire their natural language, sign language. Despite this, many sign languages, such as Israeli Sign Language (ISL), are thriving. The continued survival of similar languages under threat has been associated with the remarkable resilience of the language community. In particular, deaf literary traditions are central in reminding the community of the importance of the language. One example of a deaf literary tradition that has gained popularity in recent years is deaf theatre. The Ebisu Sign Language Theatre Laboratory, developed as part of the multidisciplinary Grammar of the Body Research Project, is the first deaf theatre company in Israel. Ebisu Theatre combines physical theatre and sign language research to provide a natural laboratory for analyzing the creative use of the body. In this presentation, we focus on the recent theatre production called 'Their Language', which tells of the struggle faced by the deaf community to use their own natural language in the education system. A thorough analysis unravels how linguistic properties are integrated with poetic devices and physical theatre techniques in this performance, enabling access by both deaf and hearing audiences without interpretation. Interviews with the audience illustrate the significance of this art form, which serves a dual purpose: empowering the deaf community and educating hearing and deaf audiences by raising awareness of community-related issues.
Keywords: deaf theatre, empowerment, language revitalization, sign language
Procedia PDF Downloads 169
2674 Relationship between Strategic Management and Organizational Culture in Sport Organization (Case Study: Selected Sport Federations of Islamic Republic of Iran)
Authors: Mohammad Ali Ghareh, Habib Honari, Alireza Ahmadi
Abstract:
The aim of this study was to investigate the relationship between strategic management and organizational culture in sport federations of the Islamic Republic of Iran. Strategic management is a set of decisions and actions that define the long-term performance of an organization. Organizational culture can be considered an identity for every organization and gives a sense of identification to its members. Organizational culture creates commitments in organization members that are more valuable than individual profits and interests. The research method was descriptive and correlational, conducted as a field study. The statistical population consisted of the employees of 10 sports federations, and 170 persons were selected as the sample. For data gathering, Barringer and Bluedorn's strategic management questionnaire (1999) and Sakyn's organizational culture questionnaire (2001) were used. The reliability of the questionnaires was 0.82 and 0.80, respectively, and their validity was approved by 8 experienced professors in sport management. To analyze the data, the Kolmogorov-Smirnov test and Pearson's correlation coefficient were used. The results show a significant relationship between strategic management and organizational culture (p < 0.05, r = 0.62). In addition, there is a positive relationship between organizational culture and the strategic management variables of scanning intensity, planning flexibility, locus of planning, planning horizon, and strategic controls (p < 0.05). These results suggest that strategic planning and implementation are more effective when grounded in an appropriate organizational culture.
By agreeing on their values and beliefs, adapting to change, caring about individuals, coordinating tasks, and aligning individual and organizational goals, the federations will be able to achieve their strategic goals.
Keywords: strategic management, organizational culture, sports federations, Islamic Republic of Iran
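The reported r = 0.62 is a Pearson product-moment correlation. A minimal pure-Python sketch of the computation, on made-up questionnaire scores rather than the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Illustrative ratings: strategic-management vs organizational-culture scores
strategic = [3.1, 4.2, 2.8, 3.9, 4.5, 3.3]
culture = [2.9, 4.0, 3.0, 3.7, 4.4, 3.1]
r = pearson_r(strategic, culture)
```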
Procedia PDF Downloads 374
2673 Setting Uncertainty Conditions Using Singular Values for Repetitive Control in State Feedback
Authors: Muhammad A. Alsubaie, Mubarak K. H. Alhajri, Tarek S. Altowaim
Abstract:
A repetitive controller designed to accommodate periodic disturbances via state feedback is discussed. Periodic disturbances can be represented by a time-delay model in a positive feedback loop acting on the system output. A direct use of the small gain theorem solves the periodic disturbance problem by 1) isolating the delay model, 2) finding the overall system representation around the delay model, and 3) designing a feedback controller that assures overall system stability and tracking-error convergence. This paper establishes uncertainty conditions, expressed through singular values, for repetitive controllers designed in state feedback with either past-error feedforward or current-error feedback. The uncertainty investigation is based on the overall system representation and its associated stability condition, which, depending on the scheme used, sets an upper or lower limit on the weighting parameter. This defines a region that must not be exceeded when selecting the weighting parameter, which in turn assures performance improvement against system uncertainty. The repetitive control problem can be described in lifted form, which allows singular values to be used in setting the admissible range for the weighting parameter. Simulation results show tracking-error convergence under dynamic system perturbation when the weighting parameter is chosen within the obtained range. They also show the advantage of using the weighting parameter compared to the case where it is omitted.
Keywords: model mismatch, repetitive control, singular values, state feedback
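The small gain theorem invoked above requires the operator around the delay model to have norm less than one, which in the lifted setting reduces to checking the largest singular value of a matrix. A sketch using power iteration on AᵀA, with an arbitrary illustrative 2x2 matrix standing in for the paper's lifted system:

```python
import math

def largest_singular_value(A, iters=100):
    """Estimate the largest singular value of A (list-of-lists matrix)
    by power iteration on A^T A."""
    m, n = len(A), len(A[0])
    v = [1.0] * n
    for _ in range(iters):
        Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(m)]
        w = [sum(A[i][j] * Av[i] for i in range(m)) for j in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]  # converges to the top right singular vector
    Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(m)]
    return math.sqrt(sum(x * x for x in Av))

# Illustrative loop matrix; the small gain condition needs sigma_max < 1
A = [[0.6, 0.2],
     [0.1, 0.5]]
sigma = largest_singular_value(A)
```

Here sigma is about 0.71, so a loop with this gain matrix would satisfy the small gain condition.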
Procedia PDF Downloads 155
2672 Cryptocurrency as a Payment Method in the Tourism Industry: A Comparison of Volatility, Correlation and Portfolio Performance
Authors: Shu-Han Hsu, Jiho Yoon, Chwen Sheu
Abstract:
With the rapid growth of blockchain technology and cryptocurrency, various industries, including tourism, have added cryptocurrency as a payment method for their transactions. More and more tourism companies accept payment in digital currency for flights, hotel reservations, transportation, and more. For travellers and tourists, paying in cryptocurrency has become a way to reduce costs and mitigate risks. Understanding volatility dynamics and the interdependencies between standard currencies and cryptocurrencies is important for appropriate financial risk management and can assist policy-makers and investors in making more informed decisions. The purpose of this paper is to understand and explain the risk spillover effects between six major cryptocurrencies and the ten most traded standard currencies, using daily closing prices of cryptocurrencies and currency exchange rates from 7 August 2015 to 10 December 2019 (1,133 observations). The diagonal BEKK model was used to analyze the co-volatility spillover effects between cryptocurrency returns and exchange rate returns, which measure how shocks to returns in different assets affect each other's subsequent volatility. The empirical results show co-volatility spillover effects between cryptocurrency returns and the GBP/USD, CNY/USD and MXN/USD exchange rate returns. Therefore, these currencies (British Pound, Chinese Yuan and Mexican Peso) and cryptocurrencies (Bitcoin, Ethereum, Ripple, Tether, Litecoin and Stellar) are suitable for constructing a financial portfolio from an optimal risk-management perspective and also for dynamic hedging purposes.
Keywords: blockchain, co-volatility effects, cryptocurrencies, diagonal BEKK model, exchange rates, risk spillovers
Procedia PDF Downloads 143
2671 Investor Psychology, Housing Prices, and Stock Market Response to Policy Decisions During the Covid-19 Recession in the United States
Authors: Ly Nguyen, Vidit Munshi
Abstract:
During the Covid-19 recession, the United States government implemented several instruments to mitigate the impacts and revitalize the economy. This paper explores the effects of various government policy decisions on stock returns, housing prices, and investor psychology during the pandemic in the United States. A large body of previous literature studies this subject, yet very few works focus on a context similar to the one we are currently experiencing. Our monthly data, covering January 2019 through July 2021, were collected from Datastream. Utilizing a VAR model, we document a dynamic relationship between the market and policy actions throughout the period. In particular, movements in unemployment, stock returns, and housing prices are strongly sensitive to changes in government policies. Our results also indicate that changes in production levels, stock returns, and interest-rate decisions influence how investors perceive future market risk and form expectations. We do not find any significant nexus between monetary and fiscal policy. Our findings imply that information on government policy and stock market performance provides useful feedback in both directions for making better decisions in the current and future pandemics. Understanding how the market responds to a shift in government practices has important implications for authorities implementing policy to avoid asset bubbles and market overreactions. The paper also provides useful guidance for investors in evaluating the effectiveness of different policies and diversifying portfolios to minimize systematic risk and maximize returns.
Keywords: Covid-19 recession, United States, government policies, investor psychology, housing prices, stock market returns
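A VAR model captures dynamics like those described above by letting each variable depend on lagged values of all variables; the effect of a policy shock is then traced through impulse responses. A toy sketch for a stable two-variable VAR(1), with a made-up coefficient matrix standing in for the estimated one:

```python
def impulse_response(A, shock, horizons):
    """Responses of a VAR(1) system y_t = A y_{t-1} + e_t to a one-time shock."""
    responses = [shock]
    for _ in range(horizons):
        prev = responses[-1]
        responses.append([sum(A[i][j] * prev[j] for j in range(len(prev)))
                          for i in range(len(A))])
    return responses

# Hypothetical coefficients linking stock returns and housing-price growth;
# eigenvalues (0.7 and 0.4) are inside the unit circle, so shocks die out.
A = [[0.5, 0.1],
     [0.2, 0.6]]
irf = impulse_response(A, shock=[1.0, 0.0], horizons=12)
```

A unit shock to the first variable spills over into the second at horizon 1 and both responses decay toward zero, the qualitative pattern a VAR-based spillover analysis looks for.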
Procedia PDF Downloads 172
2670 Particle Observation in Secondary School Using a Student-Built Instrument: Design-Based Research on a STEM Sequence about Particle Physics
Authors: J. Pozuelo-Muñoz, E. Cascarosa-Salillas, C. Rodríguez-Casals, A. de Echave, E. Terrado-Sieso
Abstract:
This study focuses on the development, implementation, and evaluation of an instructional sequence aimed at 16–17-year-old students, involving the design and use of a cloud chamber—a device that allows observation of subatomic particles. The research addresses the limited presence of particle physics in Spanish secondary and high school curricula, a gap that restricts students' learning of advanced physics concepts and diminishes engagement with complex scientific topics. The primary goal of this project is to introduce particle physics in the classroom through a practical, interdisciplinary methodology that promotes autonomous learning and critical thinking. The methodology is framed within Design-Based Research (DBR), an approach that enables iterative and pragmatic development of educational resources. The research proceeded in several phases, beginning with the design of an experimental teaching sequence, followed by its implementation in high school classrooms. This sequence was evaluated, redesigned, and reimplemented with the aim of enhancing students’ understanding and skills related to designing and using particle detection instruments. The instructional sequence was divided into four stages: introduction to the activity, research and design of cloud chamber prototypes, observation of particle tracks, and analysis of collected data. In the initial stage, students were introduced to the fundamentals of the activity and provided with bibliographic resources to conduct autonomous research on cloud chamber functioning principles. During the design stage, students sourced materials and constructed their own prototypes, stimulating creativity and understanding of physics concepts like thermodynamics and material properties. The third stage focused on observing subatomic particles, where students recorded and analyzed the tracks generated in their chambers. 
Finally, critical reflection was encouraged regarding the instrument's operation and the nature of the particles observed. The results show that designing the cloud chamber motivates students and actively engages them in the learning process. Additionally, the use of this device introduces advanced scientific topics beyond particle physics, promoting a broader understanding of science. The study's conclusions emphasize the need to give students ample time and space to thoroughly understand the role of materials and physical conditions in the functioning of their prototypes, and to encourage critical analysis of the data obtained. This project not only highlights the importance of interdisciplinarity in science education but also provides a practical framework for teachers to adapt complex concepts for educational contexts where these topics are often absent.
Keywords: cloud chamber, particle physics, secondary education, instructional design, design-based research, STEM
Procedia PDF Downloads 13
2669 The Effect of Female Access to Healthcare and Educational Attainment on Nigerian Agricultural Productivity Level
Authors: Esther M. Folarin, Evans Osabuohien, Ademola Onabote
Abstract:
Agriculture constitutes an important part of development and poverty mitigation in lower-middle-income countries like Nigeria. The level of agricultural productivity in the Nigerian economy, relative to the demand needed to meet the expectations of the Nigerian populace, threatens the attainment of the United Nations (UN) Sustainable Development Goals (SDGs), including SDG-2 (achieving food security through agricultural productivity). The overall objective of the study is to reveal the performance of the interaction variable in the model, among other factors that contribute to greater Nigerian agricultural productivity. The study makes use of Wave 4 (2018/2019) of the Living Standards Measurement Study, Integrated Surveys on Agriculture (LSMS-ISA). Qualitative analysis was also used to provide complementary answers to the quantitative analysis. The study employs human capital theory and Grossman's theory of health demand in explaining the relationships among the variables in the model. It engages the instrumental-variable regression technique for the broad objective, among other techniques for the specific objectives. The estimation results show a positive relationship between female healthcare and the level of female agricultural productivity in Nigeria. In conclusion, the study emphasises the need for greater provision of, and empowerment through, female access to healthcare and educational attainment, which supports higher female agricultural productivity and, consequently, an improvement in the total agricultural productivity of the Nigerian economy.
Keywords: agricultural productivity, education, female, healthcare, investment
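Instrumental-variable regression, used above, replaces the OLS slope cov(x, y)/cov(x, x) with cov(z, y)/cov(z, x), where the instrument z shifts the regressor but is unrelated to the disturbance. A toy sketch with constructed data (not the LSMS-ISA data): the regressor is contaminated by its own error term, so the naive OLS slope is biased, while the IV slope recovers the true coefficient of 2:

```python
def cov(a, b):
    """Population covariance of two equal-length samples."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n

def iv_slope(y, x, z):
    """Simple instrumental-variable (Wald / just-identified 2SLS) slope."""
    return cov(z, y) / cov(z, x)

# Toy data: z shifts x; u is an endogenous disturbance that enters both
# x and y, which biases OLS but (by construction) is uncorrelated with z.
z = [0, 0, 0, 1, 1, 1]
u = [1.0, -0.5, -0.5, -1.0, 0.5, 0.5]
x = [2.0 * zi + ui for zi, ui in zip(z, u)]   # x correlated with u
y = [2.0 * xi + ui for xi, ui in zip(x, u)]   # structural slope = 2

beta_ols = cov(x, y) / cov(x, x)   # biased upward by the endogeneity
beta_iv = iv_slope(y, x, z)        # recovers the structural slope
```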
Procedia PDF Downloads 81
2668 Copper Price Prediction Model for Various Economic Situations
Authors: Haidy S. Ghali, Engy Serag, A. Samer Ezeldin
Abstract:
Copper is an essential raw material in the construction industry. During 2021 and the first half of 2022, the global market suffered a significant fluctuation in copper raw material prices in the aftermath of both the COVID-19 pandemic and the Russia-Ukraine war, which exposed consumers to unexpected financial risk. This paper therefore develops two ANN-LSTM price prediction models, using Python, that forecast the average monthly copper prices traded on the London Metal Exchange: the first is a multivariate model that forecasts the copper price one month ahead, and the second is a univariate model that predicts copper prices for the upcoming three months. Historical average monthly London Metal Exchange copper prices are collected from January 2009 to July 2022, and potential external factors are identified and employed in the multivariate model. These factors lie under three main categories, including energy prices and economic indicators of the three major copper-exporting countries, depending on data availability. Before developing the LSTM models, the collected external parameters are analyzed against the copper prices using correlation and multicollinearity tests in R; the parameters are then screened to select those that influence copper prices. The two LSTM models are then developed, and the dataset is divided into training, validation, and testing sets. The results show that the 3-month prediction model performs better than the 1-month prediction model, but both can serve as prediction tools for diverse economic situations.
Keywords: copper prices, prediction model, neural network, time series forecasting
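An LSTM maintains a gated cell state that lets it remember price dynamics over long horizons. A pure-Python sketch of a single scalar LSTM cell step with arbitrary fixed weights, to illustrate the gating mechanics only; the paper's actual models learn vector-valued weights from the price series:

```python
import math

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

def lstm_step(x, h_prev, c_prev, W):
    """One forward step of a single-unit LSTM cell (scalar input and state).
    W holds (wx, wh, b) for the input (i), forget (f), candidate (g)
    and output (o) gates; the values used below are illustrative only."""
    i = sigmoid(W['i'][0] * x + W['i'][1] * h_prev + W['i'][2])
    f = sigmoid(W['f'][0] * x + W['f'][1] * h_prev + W['f'][2])
    g = math.tanh(W['g'][0] * x + W['g'][1] * h_prev + W['g'][2])
    o = sigmoid(W['o'][0] * x + W['o'][1] * h_prev + W['o'][2])
    c = f * c_prev + i * g   # updated cell state (long-term memory)
    h = o * math.tanh(c)     # hidden state / output
    return h, c

# Run the cell over a toy normalized price series with fixed weights.
W = {k: (0.5, 0.4, 0.1) for k in ('i', 'f', 'g', 'o')}
h, c = 0.0, 0.0
for price in [0.2, 0.4, 0.35, 0.5]:
    h, c = lstm_step(price, h, c, W)
```

In practice the same recurrence runs inside a framework layer (e.g. Keras `LSTM`), with training handled by backpropagation through time.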
Procedia PDF Downloads 113
2667 Molecular Simulation of NO, NH3 Adsorption in MFI and H-ZSM5
Authors: Z. Jamalzadeh, A. Niaei, H. Erfannia, S. G. Hosseini, A. S. Razmgir
Abstract:
With industrial development, emissions of pollutants such as NOx, SOx, and CO2 have increased rapidly. NOx generally refers to the mono-nitrogen oxides NO and NO2, which are among the most important atmospheric contaminants. Hence, controlling the emission of nitrogen oxides is environmentally urgent. Selective Catalytic Reduction (SCR) of NOx is one of the most common techniques for NOx removal, in which zeolites have wide application due to their high performance. In zeolitic processes, the catalytic reaction occurs mostly in the pores. Investigating the adsorption of the molecules is therefore important to gain insight into and understand the catalytic cycle. Hence, in the current study, molecular simulation is applied to the adsorption phenomena in nanocatalysts used for the SCR of NOx. The effect of cation addition to the support on the catalysts' behavior during the adsorption step was explored by Monte Carlo (MC) simulation. A simulation time of 1 ns with a 1 fs time step, the COMPASS27 force field, and a cut-off radius of 12.5 Å were applied for the runs. It was observed that the adsorption capacity increases in the presence of cations. The sorption isotherms showed type I behavior, and sorption capacity diminished with increasing temperature, whereas an increase was observed at high pressures. Besides, NO showed a higher sorption capacity than NH3 in H-ZSM5. The energy distributions signified that the molecules adsorb at just one sorption site on the catalyst, and the sorption energy of NO was stronger than that of NH3 in H-ZSM5. Furthermore, the isosteric heats of sorption were nearly the same for the two molecules; however, they indicated stronger interactions of NO with the H-ZSM5 zeolite compared to the low isosteric heat of NH3.
Keywords: Monte Carlo simulation, adsorption, NOx, ZSM5
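The type I isotherm shape and its temperature dependence reported above can be illustrated with a simple Langmuir model whose affinity constant follows a van 't Hoff relation; all parameter values below are illustrative, not fitted to the simulation data:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def langmuir_loading(p, T, q_max=2.0, k0=1e-6, dH=-40e3):
    """Type I (Langmuir) isotherm q = q_max*K*p/(1 + K*p), with a van 't Hoff
    temperature dependence K = k0*exp(-dH/(R*T)). dH < 0 (exothermic
    sorption) makes K, and hence loading, fall as temperature rises."""
    K = k0 * math.exp(-dH / (R * T))
    return q_max * K * p / (1.0 + K * p)

# Loading rises with pressure and falls with temperature, as observed
q_low_T = langmuir_loading(p=100.0, T=300.0)
q_high_T = langmuir_loading(p=100.0, T=400.0)
q_high_p = langmuir_loading(p=1000.0, T=300.0)
```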
Procedia PDF Downloads 378
2666 Design Study on a Contactless Material Feeding Device for Electro Conductive Workpieces
Authors: Oliver Commichau, Richard Krimm, Bernd-Arno Behrens
Abstract:
Growing demand on the production rate of modern presses leads to higher stroke rates. Commonly used material feeding devices for presses, such as grippers and roll-feeding systems, can only achieve high stroke rates together with high gripping forces to avoid stick-slip. These forces are limited by the sensitivity of the workpiece surfaces: stick-slip leads to scratches on the surface and false positioning of the workpiece. In this paper, a new contactless feeding device is presented, which develops higher feeding forces without damaging the surface of the workpiece through gripping forces. It is based on the principle of the linear induction motor. A primary part creates a magnetic field and induces eddy currents in the electrically conductive material. A Lorentz force acts on the workpiece in the feeding direction as the mutual reaction between the eddy currents and the magnetic induction. In this study, an FEA model of this approach is presented. The calculations with this model were used to identify the influence of various design parameters on the performance of the feeder, showing the promising capabilities and the limits of this technology. To validate the study, a prototype of the feeding device was built. An experimental setup was used to measure pulling forces and placement accuracy of the experimental feeder in order to give an outlook on a potential industrial application of this approach.
Keywords: conductive material, contactless feeding, linear induction, Lorentz-Force
Procedia PDF Downloads 179
2665 An Improved Total Variation Regularization Method for Denoising Magnetocardiography
Authors: Yanping Liao, Congcong He, Ruigang Zhao
Abstract:
The application of magnetocardiography (MCG) signals to assess cardiac electrical function is a technology developed in recent years. The MCG signal is detected with Superconducting Quantum Interference Devices (SQUIDs) and has considerable advantages over electrocardiography (ECG). Extracting the MCG signal, which is buried in noise, is a critical issue to be resolved in cardiac monitoring systems and MCG applications. In order to remove the severe background noise, a Total Variation (TV) regularization method is proposed to denoise the MCG signal. The approach transforms the denoising problem into a minimization problem, and the majorization-minimization algorithm is applied to solve it iteratively. However, the traditional TV regularization method tends to cause a step effect and lacks constraint adaptability. In this paper, an improved TV regularization method for denoising MCG signals is proposed to improve the denoising precision. The improvement consists of three parts. First, high-order TV is applied to reduce the step effect, with the corresponding second-derivative matrix substituted for the first-order one. Then, the positions of the non-zero elements in the second-order derivative matrix are determined from the peak positions detected by a detection window. Finally, adaptive constraint parameters are defined to eliminate noise while preserving the signal's peak characteristics. Theoretical analysis and experimental results show that this algorithm can effectively improve the output signal-to-noise ratio and has superior performance.
Keywords: constraint parameters, derivative matrix, magnetocardiography, regular term, total variation
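The baseline method referenced above, first-order TV denoising solved by majorization-minimization, can be sketched compactly: each MM step solves a weighted tridiagonal linear system, here via the Thomas algorithm. This is a toy 1-D version, not the paper's improved high-order, adaptively constrained variant:

```python
def tv_denoise_mm(y, lam=1.0, iters=30, eps=1e-8):
    """1-D total-variation denoising, argmin_x 0.5*||y - x||^2 + lam*TV(x),
    by majorization-minimization: each step solves (I + D^T W D) x = y,
    a symmetric tridiagonal system, with the Thomas algorithm."""
    n = len(y)
    x = list(y)
    for _ in range(iters):
        # Reweighting: w_j = lam / |x[j+1] - x[j]| (eps-guarded)
        w = [lam / max(abs(x[j + 1] - x[j]), eps) for j in range(n - 1)]
        main = [1.0 + (w[j - 1] if j > 0 else 0.0)
                + (w[j] if j < n - 1 else 0.0) for j in range(n)]
        off = [-wj for wj in w]
        # Thomas forward sweep
        cp, dp = [0.0] * (n - 1), [0.0] * n
        cp[0] = off[0] / main[0]
        dp[0] = y[0] / main[0]
        for i in range(1, n):
            m = main[i] - off[i - 1] * cp[i - 1]
            if i < n - 1:
                cp[i] = off[i] / m
            dp[i] = (y[i] - off[i - 1] * dp[i - 1]) / m
        # Back substitution
        x[n - 1] = dp[n - 1]
        for i in range(n - 2, -1, -1):
            x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Noisy step signal standing in for a small MCG segment
noisy = [0.1, -0.1, 0.2, 0.0, 1.1, 0.9, 1.2, 1.0, 0.1, -0.2]
clean = tv_denoise_mm(noisy, lam=0.5)
```

The output flattens the small fluctuations while keeping the step, illustrating both the denoising and the "step effect" the paper's high-order variant aims to reduce.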
Procedia PDF Downloads 153
2664 Survey of Prevalence of Noise Induced Hearing Loss in Hawkers and Shopkeepers in Noisy Areas of Mumbai City
Authors: Hitesh Kshayap, Shantanu Arya, Ajay Basod, Sachin Sakhuja
Abstract:
This study was undertaken to measure overall noise levels in different locations/zones and to estimate the prevalence of noise-induced hearing loss in hawkers and shopkeepers in Mumbai, India. The Hearing Test developed by the American Academy of Otolaryngology, translated from English into Hindi and validated, was used as a screening tool for hearing sensitivity. The tool has 14 items, each scored on a scale of 0, 1, 2 and 3. A score of 6 or above indicated some or definite difficulty in hearing in daily activities, while a lower score indicated lesser difficulty or normal hearing. Subjects who scored 6 or above, or who had tinnitus, underwent hearing evaluation by pure-tone audiometry. Further, environmental noise levels were measured from morning to evening at the roadside in different locations/hawking zones of Mumbai using a digital sound level meter (Agronic 8928 B&K type), in dB(A). The maximum noise level of 100.0 dB(A) was recorded during evening hours from Chattrapati Shivaji Terminal to Colaba, with an overall noise level of 79.0 dB(A); the minimum noise level in this area was 72.6 dB(A) at any given time. A minimum noise level of 54.6 dB(A) was recorded during 8-9 am at Sion Circle. The commissioning of flyovers with two-tier traffic, skywalks, increasing vehicular traffic, high-rise buildings, and other commercial and urbanization activities in Mumbai have most probably raised the overall environmental noise levels, while trees that acted as noise absorbers have been cut owing to rapid construction. The study involved 100 participants aged 18 to 40 years, with a mean age of 29 years (S.D. = 6.49). The 46 participants who had tinnitus or scored 6 or above underwent pure-tone audiometry (PTA), and the prevalence of hearing loss in hawkers and shopkeepers was found to be 19% (10% hawkers and 9% shopkeepers).
The results indicate that the 29 hawkers (42.6%) and 17 shopkeepers (47.2%) who underwent PTA showed no significant difference in the percentage of noise-induced hearing loss. The results also reveal that 19 of the 46 participants who exhibited tinnitus (41.3%) had mild to moderate sensorineural hearing loss between 3000 Hz and 6000 Hz. The pure-tone audiogram pattern revealed hearing loss at 4000 Hz and 6000 Hz, while hearing at adjacent frequencies was nearly normal. 7 hawkers and 8 shopkeepers had a mild notch, while 3 hawkers and 1 shopkeeper had a moderate notch. It is thus inferred that tinnitus is a strong indicator of hearing loss, and a 4/6 kHz notch is a strong marker for road/traffic/environmental noise as an occupational hazard for hawkers and shopkeepers. Mass awareness of these occupational hazards, regular hearing check-ups, and early intervention, together with social and urban forestry for sustainable development, can help in this regard.
Keywords: NIHL, noise, sound level meter, tinnitus
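Overall noise figures such as the 79.0 dB(A) reported above are energy averages rather than arithmetic means, because decibels are logarithmic. A small sketch of the equivalent continuous level computation, on illustrative readings (not the survey's raw data):

```python
import math

def leq(levels_db):
    """Equivalent continuous (energy-average) sound level of dB(A) readings:
    Leq = 10 * log10(mean(10^(L/10)))."""
    mean_energy = sum(10 ** (l / 10.0) for l in levels_db) / len(levels_db)
    return 10.0 * math.log10(mean_energy)

# Illustrative readings along a road through the day, dB(A)
readings = [72.6, 75.0, 79.0, 83.5, 100.0, 88.0]
overall = leq(readings)
```

Because the average is taken over energies, a single 100 dB(A) peak dominates: the Leq here is well above the arithmetic mean of the readings.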
Procedia PDF Downloads 202
2663 Positive Effect of Cu2+ and Ca2+ on the Thermostability of Bambara Groundnut Peroxidase A6, and Its Catalytic Efficiency toward the Oxidation of 3,3',5,5'-Tetramethylbenzidine
Authors: Yves Mann Elate Lea Mbassi, Marie Solange Evehe Bebandoue, Wilfred Fon Mbacham
Abstract:
Improving the catalytic performance of enzymes has been a long-standing theme of analytical biochemistry research. Induction of peroxidase activity by metals is a common reaction in higher plants. We hypothesized that this increase in peroxidase activity may be due not only to stimulation of the gene expression of these enzymes but also to a modification of their chemical reactivity following the binding of certain metal ions at their active site. We tested the effect of several metal salts (MgCl₂, MnCl₂, ZnCl₂, CaCl₂ and CuSO₄) on the activity and thermostability of peroxidase A6, a thermostable peroxidase that we discovered and purified in a previous study. The chromogenic substrate used was 3,3′,5,5′-tetramethylbenzidine. Of all the metals tested on A6, only magnesium and copper had a significant effect on the activity of the enzyme at room temperature. The Mann-Whitney test shows a slight inhibition of activity by the magnesium salt (P = 0.043), while the activity of the enzyme is 5 times higher in the presence of the copper salt (P = 0.002). Moreover, the thermostability of peroxidase A6 is increased when calcium and copper salts are present: after incubation at 80°C for 10 min, the activity in the presence of CaCl₂ is 8 times higher than the residual activity of the enzyme alone, and 35 times higher in the presence of CuSO₄ under the same conditions. In addition, manganese and zinc salts slightly reduce the thermostability of the enzyme. The activity and structural stability of peroxidase A6 can clearly be enhanced by Cu²⁺, which therefore promotes the oxidation of 3,3′,5,5′-tetramethylbenzidine, used in this study as a chromogenic substrate. Ca²⁺ likely has a more stabilizing function for the catalytic site.
Keywords: peroxidase activity, copper ions, calcium ions, thermostability
Procedia PDF Downloads 76
2662 The Effect of CPU Location in Total Immersion of Microelectronics
Authors: A. Almaneea, N. Kapur, J. L. Summers, H. M. Thompson
Abstract:
Meeting the growth in demand for digital services such as social media, telecommunications, and business and cloud services requires large-scale data centres, which has led to an increase in their end-use energy demand. Generally, over 30% of data centre power is consumed by the necessary cooling overhead; thus, energy use can be reduced by improving the cooling efficiency. Air and liquid can both be used as cooling media for the data centre. Traditional data centre cooling systems use air; however, liquid is recognised as a promising method that can handle the more densely packed data centres. Liquid cooling can be classified into three methods: rack heat exchanger, on-chip heat exchanger and full immersion of the microelectronics. This study quantifies the improvements in heat transfer specifically for the case of immersed microelectronics by varying the CPU and heat sink location. Immersion of the server is achieved by filling the gap between the microelectronics and a water jacket with a dielectric liquid, which convects the heat from the CPU to the water jacket on the opposite side. Heat transfer is governed by two physical mechanisms: natural convection in the fixed enclosure filled with dielectric liquid, and forced convection in the water that is pumped through the water jacket. The model in this study is validated against published numerical and experimental work and shows good agreement. The results show that the heat transfer performance and Nusselt number (Nu) are improved by 89% by placing the CPU and heat sink on the bottom of the microelectronics enclosure.
Keywords: CPU location, data centre cooling, heat sink in enclosures, immersed microelectronics, turbulent natural convection in enclosures
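The natural-convection side of this problem is conventionally characterized by the Rayleigh number, and the quoted 89% gain is a simple relative improvement in Nu. A minimal sketch follows; the fluid property values are rough illustrative numbers for a dielectric liquid, not figures from the paper.

```python
def rayleigh(g, beta, delta_t, length, kin_visc, therm_diff):
    """Rayleigh number Ra = g * beta * dT * L^3 / (nu * alpha),
    governing buoyancy-driven natural convection in the enclosure."""
    return g * beta * delta_t * length ** 3 / (kin_visc * therm_diff)

def percent_improvement(nu_new, nu_baseline):
    """Relative improvement in Nusselt number, in percent."""
    return (nu_new - nu_baseline) / nu_baseline * 100.0

# Order-of-magnitude dielectric-liquid properties (illustrative only):
ra = rayleigh(g=9.81, beta=1.1e-3, delta_t=40.0, length=0.05,
              kin_visc=4.0e-7, therm_diff=3.0e-8)
gain = percent_improvement(nu_new=18.9, nu_baseline=10.0)  # 89.0 %
```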
Procedia PDF Downloads 272
2661 Sustainable Tourism, a Challenge to Competitivity: OBSERVE Project
Authors: Rui Lança, Elisa Silva, Fátima Farinha, Miguel José Oliveira, Manuel Duarte Pinheiro, Cátia Miguel
Abstract:
Tourism has great potential to drive progress across the Sustainable Development Goals (SDGs). If well managed and monitored, the tourism sector can create quality jobs, reduce poverty and offer incentives for environmental preservation, helping the transition towards more inclusive and resilient economies. However, without proper safeguards and investments, expansion of the tourism market will increase pressure on biodiversity and the ecosystems on which the livelihoods of local communities depend. Competitivity is a key dimension in tourism; sustainable tourism adds new dimensions to it, namely environmental, social, institutional and economic, that must be addressed to achieve medium- and long-term competitivity. The importance of regional sustainability to what current tourist destinations offer is beyond doubt, and the prosperity of a tourism region will depend on it. The OBSERVE project intends to be an instrument for monitoring and evaluating the sustainability levels of the Algarve region. Its main objective is to provide environmental, economic, socio-cultural and institutional indicators to support the decision-making process for sustainable growth of the region. The project's main deliverable is a digital portal with the most relevant indicators, allowing the performance of the region to be evaluated and communicated from a sustainable growth perspective. This paper presents the OBSERVE project and highlights its potential contribution to a broad perspective of competitivity and its value for different stakeholders and the touristic value chain. Limitations and opportunities are also discussed.
Keywords: sustainable tourism, competitivity, OBSERVE project, Algarve region
Procedia PDF Downloads 149
2660 Development of a One Health and Comparative Medicine Curriculum for Medical Students
Authors: Aliya Moreira, Blake Duffy, Sam Kosinski, Kate Heckman, Erika Steensma
Abstract:
Introduction: The One Health initiative promotes recognition of the interrelatedness between people, animals, plants, and their shared environment. The field of comparative medicine studies the similarities and differences between humans and animals for the purpose of advancing medical science. Currently, medical school education is narrowly focused on human anatomy and physiology, but as the COVID-19 pandemic has demonstrated, a holistic understanding of health requires comprehension of the interconnection between health and the lived environment. To prepare future physicians for challenges ranging from emerging zoonoses to climate change, medical students can benefit from exposure to and experience with One Health and Comparative Medicine content. Methods: In January 2020, an elective course on One Health and Comparative Medicine was created to provide medical students with the background knowledge necessary to understand the applicability of animal and environmental health in medical research and practice. The 2-week course was continued in January 2021, with didactic and experiential activities taking place virtually due to the COVID-19 pandemic. In response to student feedback, lectures were added to expand instructional content on zoonotic and wildlife diseases for the second iteration of the course. Other didactic sessions included interprofessional lectures from 20 physicians, veterinarians, public health professionals, and basic science researchers. The first two cohorts of students were surveyed on One Health and Comparative Medicine concepts at the beginning and conclusion of the course. Results: 16 medical students have completed the comparative medicine course thus far, with 87.5% (n=14) completing pre- and post-course evaluations. 100% of student respondents indicated little to no exposure to comparative medicine or One Health concepts during medical school.
Following the course, 100% of students felt familiar or very familiar with comparative medicine and One Health concepts. To assess course efficacy, questions were evaluated on a five-point Likert scale. 100% agreed or strongly agreed that learning Comparative Medicine and One Health topics augmented their medical education, and 100% agreed or strongly agreed that a course covering this content should be regularly offered to medical students. Conclusions: Data from the student evaluation surveys demonstrate that the Comparative Medicine course was successful in increasing medical student knowledge of Comparative Medicine and One Health. Results also suggest that interprofessional training in One Health and Comparative Medicine is applicable and useful for medical trainees. Future iterations of this course could capitalize on the inherently interdisciplinary nature of these topics by enrolling students from veterinary and public health schools into a longitudinal course. Such recruitment may increase the course's value by offering multidisciplinary student teams the opportunity to conduct research projects, thereby strengthening the individual learning experience and sparking future interprofessional research ventures. Overall, these efforts to educate medical students in One Health topics should be reproducible at other institutions, preparing more future physicians for the diverse challenges they will encounter in practice.
Keywords: medical education, interprofessional instruction, one health, comparative medicine
Procedia PDF Downloads 108
2659 Forecasting Container Throughput: Using Aggregate or Terminal-Specific Data?
Authors: Gu Pang, Bartosz Gebka
Abstract:
We forecast the demand for total container throughput at Indonesia's largest seaport, Tanjung Priok Port. We apply four forecasting models: SARIMA, the additive Seasonal Holt-Winters, the multiplicative Seasonal Holt-Winters and the Vector Error Correction Model. Our aim is to provide insight into whether forecasting the total container throughput from the historical aggregated port throughput time series is superior to forecasts of the total obtained by summing up the best individual terminal forecasts. We test the monthly port and individual-terminal container throughput time series between 2003 and 2013. The performance of the forecasting models is evaluated on Mean Absolute Error and Root Mean Squared Error. Our results show that the multiplicative Seasonal Holt-Winters model produces the most accurate forecasts of total container throughput, whereas SARIMA generates the worst in-sample model fit. The Vector Error Correction Model provides the best model fits and forecasts for individual terminals. We find that total container throughput forecasts based on modelling the total throughput time series are consistently better than those obtained by combining the forecasts generated by terminal-specific models. The forecasts of total throughput until the end of 2018 provide essential insight for strategic decision-making on the expansion of the port's capacity and the construction of new container terminals at Tanjung Priok Port.
Keywords: SARIMA, Seasonal Holt-Winters, Vector Error Correction Model, container throughput
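The two evaluation criteria named in the abstract are standard and easy to state in code. A minimal sketch, with made-up monthly throughput figures rather than Tanjung Priok data:

```python
import math

def mae(actual, forecast):
    """Mean Absolute Error."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    """Root Mean Squared Error (penalizes large errors more than MAE)."""
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

# Hypothetical monthly TEU counts, NOT the port's data:
actual   = [420_000, 450_000, 465_000, 440_000]
forecast = [430_000, 445_000, 470_000, 455_000]
print(mae(actual, forecast))   # 8750.0
print(rmse(actual, forecast))  # ~9682.46
```

Comparing aggregate versus terminal-specific strategies then amounts to computing these scores once for the forecast of the summed series and once for the sum of the terminal forecasts.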
Procedia PDF Downloads 504
2658 Application of the State of the Art of Hydraulic Models to Manage Coastal Problems, Case Study: The Egyptian Mediterranean Coast Model
Authors: Al. I. Diwedar, Moheb Iskander, Mohamed Yossef, Ahmed ElKut, Noha Fouad, Radwa Fathy, Mustafa M. Almaghraby, Amira Samir, Ahmed Romya, Nourhan Hassan, Asmaa Abo Zed, Bas Reijmerink, Julien Groenenboom
Abstract:
Coastal problems stress the coastal environment owing to its complexity. The dynamic interaction between the sea and the land, together with human interventions and activities, results in serious problems that threaten coastal areas worldwide. This makes the coastal environment highly vulnerable to natural processes such as flooding and erosion and to the impacts of human activities such as pollution. Protecting and preserving this vulnerable coastal zone, with its valuable ecosystems, calls for addressing these coastal problems; this, in the end, will support the sustainability of coastal communities for current and future generations. Consequently, applying suitable management strategies and sustainable development that consider the unique characteristics of the coastal system is a must. The coastal management philosophy aims to resolve the conflicts of interest between human development activities and this dynamic nature. Modeling emerges as a successful tool that supports decision-makers, engineers, and researchers in adopting better management practices, and modeling tools have proven accurate and reliable in prediction. With their capability to integrate data from various sources such as bathymetric surveys, satellite images, and meteorological data, they allow engineers and scientists to understand this complex dynamic system and gain in-depth insight into the interaction between natural and human-induced factors. This enables decision-makers to make informed choices and develop effective strategies for sustainable development and risk mitigation in the coastal zone. Modeling tools also support the evaluation of various scenarios by making it possible to simulate and forecast different coastal processes, from hydrodynamic and wave actions to the resulting flooding and erosion.
The state-of-the-art application of modeling tools in coastal management allows for better understanding and prediction of coastal processes, optimizing infrastructure planning and design, supporting ecosystem-based approaches, assessing climate change impacts, managing hazards, and, finally, facilitating stakeholder engagement. This paper emphasizes the role of hydraulic models in enhancing the management of coastal problems by discussing the diverse applications of modeling in coastal management. It highlights the role of modeling in understanding complex coastal processes, predicting outcomes, and informing decision-makers with results that provide the technical and scientific support needed to achieve sustainable coastal development and protection.
Keywords: coastal problems, coastal management, hydraulic model, numerical model, physical model
Procedia PDF Downloads 29
2657 Synthesis, Computational Studies, Antioxidant and Anti-Inflammatory Bio-Evaluation of 2,5-Disubstituted-1,3,4-Oxadiazole Derivatives
Authors: Sibghat Mansoor Rana, Muhammad Islam, Hamid Saeed, Hummera Rafique, Muhammad Majid, Muhammad Tahir Aqeel, Fariha Imtiaz, Zaman Ashraf
Abstract:
The 1,3,4-oxadiazole derivatives Ox-6a-f have been synthesized by incorporating the flurbiprofen moiety, with the aim of exploring the potential of the target molecules to decrease oxidative stress. The title compounds Ox-6a-f were prepared by simple reactions: the flurbiprofen –COOH group was esterified with methanol in an acid-catalyzed medium, and the ester was then reacted with hydrazine to afford the corresponding hydrazide. The acid hydrazide was then cyclized into the 1,3,4-oxadiazole-2-thiol by reaction with CS₂ in the presence of KOH. The title compounds Ox-6a-f were obtained by reaction of the –SH group with various alkyl/aryl chlorides via S-alkylation. The structures of the synthesized Ox-6a-f derivatives were ascertained by spectroscopic data. In silico molecular docking was performed against the target proteins cyclooxygenase-2, COX-2 (PDB ID 5KIR), and cyclooxygenase-1, COX-1 (PDB ID 6Y3C), to determine the binding affinity of the synthesized compounds for these structures. It was inferred that most of the synthesized compounds bind better to the active binding site of 5KIR than to that of 6Y3C; in particular, compound Ox-6f showed the best binding affinity (7.70 kcal/mol) among all synthesized compounds Ox-6a-f. Molecular dynamics (MD) simulation was also performed to check the stability of the docking complexes of the ligands with COX-2 by determining their root mean square deviation and root mean square fluctuation. Little fluctuation was observed in the case of Ox-6f, which forms the most stable complex with COX-2. The antioxidant potential of the synthesized compounds was comprehensively evaluated by determining their free radical scavenging activity, including DPPH, OH and nitric oxide (NO) scavenging and an iron chelation assay. The derivative Ox-6f showed promising results, with 80.23% radical scavenging potential at a dose of 100 μg/mL, while ascorbic acid exhibited 87.72% inhibition at the same dose.
The anti-inflammatory activity of the final products was also evaluated, and inflammatory markers such as thiobarbituric acid-reactive substances, nitric oxide, interleukin-6 (IL-6), and COX-2 were assayed. The derivatives Ox-6d and Ox-6f displayed higher anti-inflammatory activity, exhibiting 70.56% and 74.16% activity, respectively. The results were compared with standard ibuprofen, which showed 84.31% activity at the same dose, 200 μg/mL. The anti-inflammatory potential was also assessed with the carrageenan-induced hind paw edema model, and the results showed that derivative Ox-6f exhibited a 79.83% reduction in edema volume, compared to standard ibuprofen, which reduced edema volume by 84.31%. As the dry lab and wet lab results confirm each other, it was deduced that derivative Ox-6f may serve as the lead structure for designing potent compounds to address oxidative stress.
Keywords: synthetic chemistry, pharmaceutical chemistry, oxadiazole derivatives, anti-inflammatory, anti-cancer compounds
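The percent radical-scavenging figures quoted above (e.g. 80.23% for Ox-6f) follow the usual absorbance-based formula for assays such as DPPH. A sketch with hypothetical absorbance readings, not the study's data:

```python
def percent_scavenging(abs_control, abs_sample):
    """Radical scavenging (%) from assay absorbances:
    (A_control - A_sample) / A_control * 100."""
    return (abs_control - abs_sample) / abs_control * 100.0

# Hypothetical absorbances at 517 nm (illustrative values only):
print(percent_scavenging(0.800, 0.158))  # ~80.25 %
```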
Procedia PDF Downloads 16
2656 Improving Fault Tolerance and Load Balancing in Heterogeneous Grid Computing Using Fractal Transform
Authors: Saad M. Darwish, Adel A. El-Zoghabi, Moustafa F. Ashry
Abstract:
The popularity of the Internet and the availability of powerful computers and high-speed networks as low-cost commodity components are changing the way we use computers today. These technical opportunities have led to the possibility of using geographically distributed and multi-owner resources to solve large-scale problems in science, engineering, and commerce. Recent research on these topics has led to the emergence of a new paradigm known as Grid computing. To achieve the promising potential of these tremendous distributed resources, effective and efficient load balancing algorithms are fundamentally important. Unfortunately, load balancing algorithms designed for traditional parallel and distributed systems, which usually run on homogeneous and dedicated resources, do not work well in the new circumstances. In this paper, the concept of a fast fractal transform in heterogeneous grid computing, based on an R-tree and the domain-range entropy, is proposed to improve the fault tolerance and load balancing algorithm by taking into account connectivity, communication delay, network bandwidth, resource availability, and resource unpredictability. A novel two-dimensional figure of merit is suggested to describe the network effects on load balance and fault tolerance estimation. Fault tolerance is enhanced by adaptively decreasing replication time and message cost, while load balance is enhanced by adaptively decreasing mean job response time. Experimental results show that the proposed method yields superior performance over other methods.
Keywords: grid computing, load balancing, fault tolerance, R-tree, heterogeneous systems
Procedia PDF Downloads 491
2655 Efficiency of Robust Heuristic Gradient-Based Enumerative and Tunneling Algorithms for Constrained Integer Programming Problems
Authors: Vijaya K. Srivastava, Davide Spinello
Abstract:
This paper presents the performance of two robust gradient-based heuristic optimization procedures, based on 3ⁿ enumeration and on a tunneling approach, for seeking the global optimum of constrained integer problems. Both procedures consist of two distinct phases for locating the global optimum of integer problems with a linear or non-linear objective function subject to linear or non-linear constraints. In both procedures, the first phase finds a local minimum of the function using the gradient approach, coupled with hemstitching moves when a constraint is violated in order to return the search to the feasible region. In the second phase, the first procedure examines the 3ⁿ integer combinations on the boundary of and within the hypercube volume encompassing the result from the first phase, while the second procedure constructs a tunneling function at the local minimum of the first phase so as to find another point on the other side of the barrier where the function value is approximately the same. In the next cycle, the search for the global optimum recommences in both procedures using this newly found point as the starting vector. The search continues, repeated for various step sizes along the function gradient as well as along the vector normal to the violated constraints, until no improvement in the optimum value is found. The results from both proposed optimization methods are presented and compared with those provided by the popular MS Excel Solver included in the MS Office suite and with other published results.
Keywords: constrained integer problems, enumerative search algorithm, heuristic algorithm, tunneling algorithm
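The second phase of the tunneling procedure can be illustrated in one dimension: from a local minimum x*, search outward for a point on the far side of the barrier where the tunneling function T(x) = (f(x) − f(x*)) / |x − x*|^(2λ) becomes non-positive, i.e. where f regains the value f(x*). The objective below is a toy double well, not one of the paper's constrained integer problems, and the grid scan stands in for the paper's gradient-based search.

```python
def tunnel(f, x_star, lam=1.0, step=0.01, max_steps=100000):
    """Phase-two tunneling in 1D: scan outward from the local minimum
    x_star and return the first point where the tunneling function
    T(x) = (f(x) - f(x_star)) / |x - x_star|**(2*lam) is <= 0."""
    f_star = f(x_star)
    for i in range(1, max_steps):
        for x in (x_star + i * step, x_star - i * step):
            t = (f(x) - f_star) / abs(x - x_star) ** (2 * lam)
            if t <= 0.0:
                return x
    return None

# Toy asymmetric double well: local minimum near x = -0.93, global near x = 1.
f = lambda x: (x * x - 1.0) ** 2 - 0.5 * x
x_new = tunnel(f, x_star=-0.93)
# x_new lands on the other side of the barrier with f(x_new) <= f(-0.93),
# and would serve as the starting vector for the next local-search cycle.
```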
Procedia PDF Downloads 325
2654 Invasion of Pectinatella magnifica in Freshwater Resources of the Czech Republic
Authors: J. Pazourek, K. Šmejkal, P. Kollár, J. Rajchard, J. Šinko, Z. Balounová, E. Vlková, H. Salmonová
Abstract:
Pectinatella magnifica (Leidy, 1851) is an invasive freshwater animal that lives in colonies. A colony of Pectinatella magnifica (a gelatinous blob) can be up to several feet in diameter, and under favorable conditions it exhibits an extreme growth rate. Recently, European countries along the Elbe, Oder, Danube, Rhine and Vltava rivers have confirmed the invasion of Pectinatella magnifica, including freshwater reservoirs in South Bohemia (Czech Republic). Our project (Czech Science Foundation, GAČR P503/12/0337) focuses on the biology and chemistry of Pectinatella magnifica. We have monitored the organism's occurrence in selected South Bohemian ponds and sandpits over recent years, collecting information on the physical properties of the surrounding water and sampling the colonies for various analyses (classification, maps of secondary metabolites, toxicity tests). Because the gelatinous matrix also hosts algae, bacteria and cyanobacteria (co-habitants) during the colony's lifetime, in this contribution we also applied a high performance liquid chromatography (HPLC) method for the determination of potentially present cyanobacterial toxins (microcystin-LR, microcystin-RR, nodularin). Results from the last three years of monitoring show that these toxins are below the limit of detection (LOD), so they do not yet represent a danger. The final goal of our study is to assess the toxicity risks related to freshwater resources invaded by Pectinatella magnifica and to understand the process of invasion, which may enable its control.
Keywords: cyanobacteria, fresh water resources, Pectinatella magnifica invasion, toxicity monitoring
Procedia PDF Downloads 239
2653 Interfacial Reactions between Aromatic Polyamide Fibers and Epoxy Matrix
Authors: Khodzhaberdi Allaberdiev
Abstract:
In order to understand the interactions at the interface between polyamide fibers and the epoxy matrix in fiber-reinforced composites, the industrial aramid fibers Armos, SVM and Terlon were investigated using individual epoxy matrix components: the epoxies diglycidyl ether of bisphenol A (DGEBA) and tri- and diglycidyl derivatives of m- and p-amino-, m- and p-oxy-, and o-, m- and p-carboxybenzoic acids; the model curing agent aniline; and N-di(oxyethylphenoxy)aniline, a compound that models the structure of the primary addition product of the amine to the epoxy resin. The chemical structure of the surface of untreated and treated polyamide fibers was analyzed using Fourier transform infrared spectroscopy (FTIR). The impregnation of the fibers with epoxy matrix components and N-di(oxyethylphenoxy)aniline was carried out by heating at 150˚C for 6 h. The optimum fiber loading is 65%. The result of the thermal treatment is the formation of covalent bonds, derived from a combination of homopolymerization and crosslinking mechanisms, in the interfacial region between the epoxy resin and the fiber surface. The reactivity of the epoxy resins at the interface in microcomposites (MC) also depends on the processing aids applied to the fiber surface and on absorbed moisture. The influence of these factors is evidenced by the epoxy group conversion values in Terlon impregnated with DGEBA for industrial, dried (in vacuum) and purified samples: 5.20%, 4.65% and 14.10%, respectively. The same tendency is observed for the SVM and Armos fibers. The changes in the surface composition of these MC were monitored by X-ray photoelectron spectroscopy (XPS). In the case of the purified fibers, the functional groups of the fibers act as both a catalyst and a curing agent for the epoxy resin. It is found that the epoxy group conversion for the reinforced formulations depends on the nature of the aromatic polyamide and decreases in the order Armos > SVM > Terlon. This difference is due to the structural characteristics of the fibers.
The interfacial interactions between polyglycidyl esters of substituted benzoic acids and polyamide fibers in the MC were also examined. It is found that the interfacial interactions in these systems are influenced by the structure and isomerism of the epoxides. The IR spectra of fibers impregnated with aniline showed that the polyamide fibers do not react appreciably with aniline. FTIR results for fibers treated with N-di(oxyethylphenoxy)aniline revealed dramatic changes in the IR characteristics of the OH groups of the amino alcohol. These observations indicate hydrogen bonding and covalent interactions between the amino alcohol and the functional groups of the fibers. This result is also confirmed by the appearance of an exothermic peak on the Differential Scanning Calorimetry (DSC) curve of the MC. Finally, a theoretical evaluation of the non-covalent interactions between individual epoxy matrix components and fibers has been performed using benzanilide and its derivative containing the benzimidazole moiety as models of Terlon and of SVM and Armos, respectively. Quantum-topological analysis also demonstrated the existence of hydrogen bonds between the amide group of the models and the epoxy matrix components. All the results indicate that both covalent and non-covalent interactions exist at the interface between the polyamide fibers and the epoxy matrix during the preparation of the MC.
Keywords: epoxies, interface, modeling, polyamide fibers
Procedia PDF Downloads 266
2652 Effect of the Polymer Modification on the Cytocompatibility of Human and Rat Cells
Authors: N. Slepickova Kasalkova, P. Slepicka, L. Bacakova, V. Svorcik
Abstract:
Tissue engineering combines materials and techniques used for the improvement, repair or replacement of tissue. Scaffolds, permanent or temporary materials, are used as support for the creation of "new cell structures". A variety of materials can be used for this important component, and the advantage of some polymeric materials is their cytocompatibility and possibility of biodegradation. Poly(L-lactic acid) (PLLA) is a biodegradable, semi-crystalline thermoplastic polymer that can be fully degraded into H₂O and CO₂. In this experiment, the effect of the surface modification of a biodegradable polymer (performed by plasma treatment) on various cell types was studied. The surface parameters and changes in the physicochemical properties of the modified PLLA substrates were studied by different methods: surface wettability was determined by goniometry, surface morphology and roughness were studied with atomic force microscopy, and chemical composition was determined using photoelectron spectroscopy. The physicochemical properties were studied in relation to the cytocompatibility of human osteoblasts (MG 63 cells), rat vascular smooth muscle cells (VSMCs), and human adipose tissue-derived stem cells (ASCs) in vitro. Fluorescence microscopy was chosen to study and compare the cell-material interaction, and important parameters of cytocompatibility such as adhesion, proliferation, viability, shape and spreading of the cells were evaluated. It was found that the modification changes the surface wettability depending on the treatment time: short exposure times (10-120 s) can reduce the wettability of the aged samples, while exposure longer than 150 s increases the contact angle of the aged PLLA. The surface morphology is also significantly influenced by the duration of the modification; the plasma treatment leads to the formation of crystallites, whose number increases with increasing treatment time.
On the basis of the physicochemical property evaluation, the cells were cultivated on selected samples. Cell-material interactions are strongly affected by the material's chemical structure and surface morphology. It was shown that the plasma treatment of PLLA has a positive effect on the adhesion, spreading, homogeneity of distribution and viability of all cultivated cells. This effect was even more apparent for the VSMCs and ASCs, which homogeneously covered almost the whole surface of the substrate after 7 days of cultivation. The viability of these cells was high (more than 98% for VSMCs, 89-96% for ASCs). This experiment is one part of basic research aiming to easily create scaffolds for tissue engineering, with the subsequent use of stem cells and their "reorientation" towards bone cells or smooth muscle cells.
Keywords: poly(L-lactic acid), plasma treatment, surface characterization, cytocompatibility, human osteoblast, rat vascular smooth muscle cells, human stem cells
Procedia PDF Downloads 228
2651 Eco-Friendly Polymeric Corrosion Inhibitor for Sour Oilfield Environment
Authors: Alireza Rahimi, Abdolreza Farhadian, Arash Tajik, Elaheh Sadeh, Avni Berisha, Esmaeil Akbari Nezhad
Abstract:
Although natural polymers have been shown to have some inhibitory effect on sour corrosion, they are not considered very effective green corrosion inhibitors. Accordingly, effective corrosion inhibitors should be developed from natural resources to mitigate sour corrosion in the oil and gas industry. Here, Arabic gum was employed as an eco-friendly precursor for the synthesis of innovative polyurethanes designed as highly efficient corrosion inhibitors for sour oilfield solutions. A comprehensive assessment, combining experimental and computational analyses, was conducted to evaluate the inhibitory performance of the inhibitor. Electrochemical measurements demonstrated that a concentration of 200 mM of the inhibitor offered substantial protection to mild steel against sour corrosion, yielding inhibition efficiencies of 98% and 95% at 25 ºC and 60 ºC, respectively. Additionally, the presence of the inhibitor led to a smoother steel surface, indicating the adsorption of polyurethane molecules onto the metal surface. X-ray photoelectron spectroscopy results further validated the chemical adsorption of the inhibitor on mild steel surfaces. Scanning Kelvin probe microscopy revealed a shift in the potential distribution of the steel surface towards negative values, indicating inhibitor adsorption and inhibition of the corrosion process. Molecular dynamics simulation indicated high adsorption energy values for the inhibitor, suggesting its spontaneous adsorption onto the Fe(110) surface. These findings underscore the potential of Arabic gum as a viable resource for the development, under mild conditions, of polyurethanes serving as effective corrosion inhibitors for sour solutions.
Keywords: environmental effect, Arabic gum, corrosion inhibitor, sour corrosion, molecular dynamics simulation
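The inhibition efficiencies quoted here (98% and 95%) follow the standard electrochemical definition based on corrosion current densities. A sketch with hypothetical current values, not the study's measurements:

```python
def inhibition_efficiency(i_corr_blank, i_corr_inhibited):
    """IE (%) = (i0 - i) / i0 * 100, where i0 and i are the corrosion
    current densities without and with the inhibitor, respectively."""
    return (i_corr_blank - i_corr_inhibited) / i_corr_blank * 100.0

# Hypothetical current densities in uA/cm^2 (illustrative only):
print(inhibition_efficiency(250.0, 5.0))   # 98.0, cf. the 25 C result
print(inhibition_efficiency(250.0, 12.5))  # 95.0, cf. the 60 C result
```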
Procedia PDF Downloads 62
2650 Hybrid Approach for Face Recognition Combining Gabor Wavelet and Linear Discriminant Analysis
Authors: A. Annis Fathima, V. Vaidehi, S. Ajitha
Abstract:
Face recognition systems find many applications in surveillance and human-computer interaction. As these applications are of much importance and demand high accuracy, greater robustness is expected of the face recognition system along with lower computation time. In this paper, a hybrid approach for face recognition combining Gabor Wavelet and Linear Discriminant Analysis (HGWLDA) is proposed. The normalized input grayscale image is approximated and reduced in dimension to lower the processing overhead for the Gabor filters. This image is convolved with a bank of Gabor filters of varying scales and orientations. LDA, a subspace analysis technique, is used to reduce the intra-class scatter and maximize the inter-class scatter. The techniques used are 2-dimensional Linear Discriminant Analysis (2D-LDA), 2-dimensional bidirectional LDA ((2D)²LDA), and weighted 2-dimensional bidirectional Linear Discriminant Analysis (Wt(2D)²LDA). LDA reduces the feature dimension by extracting the features with greater variance. A k-Nearest Neighbour (k-NN) classifier is used to classify and recognize the test image by comparing its features with each of the training set features. The HGWLDA approach is robust against illumination conditions, as the Gabor features are illumination invariant. This approach also aims at a better recognition rate using fewer features for varying expressions. The performance of the proposed HGWLDA approach is evaluated using the AT&T database, the MIT-India face database and the faces94 database. It is found that the proposed HGWLDA approach provides better results than the existing Gabor approach.
Keywords: face recognition, Gabor wavelet, LDA, k-NN classifier
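The final classification step described above, k-NN over the reduced feature vectors, can be sketched in a few lines; the toy 2-D feature vectors below are stand-ins for the Gabor-LDA features of the paper.

```python
import math
from collections import Counter

def knn_classify(train, query, k=1):
    """Classify `query` by majority vote among the k training samples
    (feature_vector, label) nearest in Euclidean distance."""
    nearest = sorted(train, key=lambda sample: math.dist(sample[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy gallery: two subjects with illustrative 2-D feature vectors.
gallery = [((0.0, 0.0), "subject_A"), ((0.1, 0.2), "subject_A"),
           ((5.0, 5.0), "subject_B"), ((5.2, 4.9), "subject_B")]
print(knn_classify(gallery, (0.2, 0.1), k=3))  # subject_A
```

In the paper's pipeline the gallery entries would be the LDA-projected Gabor features of the training faces, and `query` the projected features of the test face.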
Procedia PDF Downloads 467
2649 Bimetallic Cu/Au Nanostructures and Bio-Application
Authors: Si Yin Tee
Abstract:
Bimetallic nanostructures have received tremendous interest as a class of nanomaterials whose properties are distinct from those of individual atoms, molecules or bulk matter, and which may therefore offer greater technological usefulness. They excel over their monometallic counterparts because of improved electronic, optical and catalytic performance. The properties and applicability of these bimetallic nanostructures depend not only on their size and shape, but also on their composition and fine structure. They are potential candidates for bio-applications such as biosensing, bioimaging, biodiagnostics, drug delivery, targeted therapeutics, and tissue engineering. Herein, gold-incorporated copper (Cu/Au) nanostructures were synthesized through the controlled disproportionation of a Cu⁺-oleylamine complex at 220 °C to form copper nanowires, followed by reaction with Au³⁺ at 140, 220 and 300 °C. The aim is to achieve a synergistic effect by combining the merits of a low-cost transition metal with those of a high-stability noble metal. Of these Cu/Au nanostructures, Cu/Au nanotubes display the best performance in electrochemical non-enzymatic glucose sensing, owing to the high conductivity of gold and the high-aspect-ratio copper nanotubes with high surface area, which maximize the electroactive sites and facilitate mass transport. In addition to high sensitivity and fast response, the Cu/Au nanotubes possess high selectivity against interference from other potentially interfering species and excellent reproducibility with long-term stability. When gold is introduced into the copper nanostructures at 3, 1 and 0.1 mol% relative to the initial copper precursor, a significant electrocatalytic enhancement of the resulting bimetallic Cu/Au nanostructures appears at 1 mol%.
Overall, the present fabrication of stable Cu/Au nanostructures offers a promising low-cost platform for sensitive, selective, reproducible and reusable electrochemical sensing of glucose.
Keywords: bimetallic, electrochemical sensing, glucose oxidation, gold-incorporated copper nanostructures
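Sensitivity for sensors of this kind is typically extracted from an amperometric calibration curve as the slope of steady-state current versus glucose concentration, normalized by electrode area. A hedged sketch with invented calibration numbers (none of the values below are from the study):

```python
import numpy as np

# Hypothetical steady-state calibration data (illustrative only)
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # glucose concentration, mM
current = np.array([1.1, 2.0, 4.1, 8.0, 16.2])   # response current, µA

slope, intercept = np.polyfit(conc, current, 1)  # linear fit: I = slope*c + b
area_cm2 = 0.07                                  # assumed electrode area, cm²
sensitivity = slope / area_cm2                   # µA · mM⁻¹ · cm⁻²
print(round(slope, 2), round(sensitivity, 1))
```

In practice the linear range, the limit of detection (often 3σ of the blank divided by the slope), and interference currents are read off the same calibration data.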
Procedia PDF Downloads 521
2648 An Estimating Equation for Survival Data with a Possibly Time-Varying Covariates under a Semiparametric Transformation Models
Authors: Yemane Hailu Fissuh, Zhongzhan Zhang
Abstract:
An estimating equation technique is an alternative to the widely used maximum likelihood methods, and it eases some of the complexity arising from time-varying covariates. When both time-varying covariates and left truncation are present in the model, maximum likelihood estimation becomes much more burdensome and complex. To ease this complexity, this study proposes modified estimating equations under a semiparametric transformation model, an approach that has received considerable attention from researchers. The purpose of this article was to develop modified estimating equations under a flexible and general class of semiparametric transformation models for left-truncated and right-censored survival data with time-varying covariates. Besides the commonly applied Cox proportional hazards model, such problems can also be analyzed with a general class of semiparametric transformation models to estimate the effect of treatment, given possibly time-varying covariates, on the survival time. The consistency and asymptotic properties of the estimators were derived via the expectation-maximization (EM) algorithm. The finite-sample performance of the estimators under the proposed model was illustrated via simulation studies and the Stanford heart transplant data. To sum up, the bias due to truncation was adjusted by estimating the density function of the truncation time, and the effect of possibly time-varying covariates was then evaluated in some special semiparametric transformation models.
Keywords: EM algorithm, estimating equation, semiparametric transformation models, time-to-event outcomes, time-varying covariates
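For orientation, one common specification of this model class and the counting-process form of an estimating equation can be written as follows. The notation here is illustrative and standard for the literature; the paper's exact specification may differ.

```latex
% Semiparametric transformation model with possibly time-varying
% covariates Z(t): unknown increasing H, known transformation G.
\Lambda(t \mid Z) \;=\; G\!\left(\int_0^{t} e^{\beta^{\top} Z(s)}\,\mathrm{d}H(s)\right)
% Special cases:
%   G(x) = x           : Cox proportional hazards
%   G(x) = \log(1 + x) : proportional odds
%
% A counting-process estimating equation for \beta, with N_i(t) the
% observed-event process and Y_i(t) the at-risk indicator (which
% encodes both left truncation and right censoring):
U(\beta, H) \;=\; \sum_{i=1}^{n} \int_0^{\tau} Z_i(t)
  \left[\mathrm{d}N_i(t) - Y_i(t)\,\mathrm{d}\Lambda(t \mid Z_i)\right] \;=\; 0
```

Solving this jointly with a nonparametric update for H, as in an EM-type algorithm, is what replaces the full likelihood maximization that the abstract describes as burdensome.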
Procedia PDF Downloads 152