Search results for: optimized asset allocation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2601

2031 The Effectiveness of the Repositioning Campaign of PKO BP Brand on the Basis of Questionnaire Research

Authors: Danuta Szwajca

Abstract:

Image is a very important intangible asset of a contemporary enterprise, especially a bank, as a public trust institution. A positive, desired image can effectively distinguish a bank from its competitors and build customer confidence and loyalty. PKO BP is the largest bank operating on the Polish financial market. Over the years, an unfavourable image of the bank became embedded in customers’ minds: an old-fashioned, stagnant institution resistant to change, which resulted in customer loss and an ageing customer base. For this reason, in 2010, the bank launched a campaign of radical image change along with a strategy of branch modernization and improvement of the product offer. The objective of the article is to assess the effectiveness of the brand repositioning campaign, which lasted three years. The assessment is based on the results of questionnaire research into how the bank was perceived before and after the campaign.

Keywords: advertising campaign, brand repositioning, image of the bank, repositioning

Procedia PDF Downloads 417
2030 Tax Evasion in Brazil: The Case of Specialists

Authors: Felippe Clemente, Viviani S. Lírio

Abstract:

Brazilian tax evasion is very high. It causes many economic problems, such as shortfalls in budget realization, distorted income distribution, and misallocation of productive resources. The purpose of this article is therefore to use game theory as an instrument for understanding the interaction between tax-evading agents and the tax authorities in Brazil (Federal Revenue and Federal Police). Considering cases both with and without specialists, the main results show that, in a situation of widespread evasion, penalizing taxpayers with either high fines or deprivation of liberty may not be very effective. The analysis also shows that audit and inspection costs play an important role in driving the system’s equilibrium. This suggests that investing in tax inspectors would be a more effective tool for combating non-compliance with tax obligations than penalties or fines.
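The equilibrium logic the abstract appeals to can be illustrated with a textbook tax-inspection game; this is a simplified stand-in for the paper's model, and the payoff structure and numbers below are hypothetical:

```python
def inspection_game_equilibrium(tax, fine, audit_cost):
    """Mixed-strategy equilibrium of a textbook tax-inspection game.

    Taxpayer: evade (risk `fine` if audited) or comply (pay `tax`).
    Authority: audit (cost `audit_cost`, collects `fine` from evaders) or not.
    Simplification: the authority's only audit revenue is the fine.
    """
    if fine <= audit_cost or fine < tax:
        raise ValueError("need fine > audit_cost and fine >= tax")
    p_audit = tax / fine         # makes the taxpayer indifferent
    q_evade = audit_cost / fine  # makes the authority indifferent
    return p_audit, q_evade

# Illustrative numbers only: equilibrium evasion rises with the audit cost,
# which is the abstract's point about audit costs driving the equilibrium.
p, q = inspection_game_equilibrium(tax=100, fine=400, audit_cost=40)
print(p, q)
```

Note how `q_evade` depends directly on the audit cost: cheaper inspections (e.g. more inspectors) lower the equilibrium evasion rate, consistent with the abstract's conclusion.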

Keywords: tax evasion, Brazil, game theory, specialists

Procedia PDF Downloads 321
2029 On Multiobjective Optimization to Improve the Scalability of Fog Application Deployments Using Fogtorch

Authors: Suleiman Aliyu

Abstract:

Integrating IoT applications with Fog systems presents challenges in optimization due to diverse environments and conflicting objectives. This study explores achieving Pareto optimal deployments for Fog-based IoT systems to address growing QoS demands. We introduce Pareto optimality to balance competing performance metrics. Using the FogTorch optimization framework, we propose a hybrid approach (Backtracking search with branch and bound) for scalable IoT deployments. Our research highlights the advantages of Pareto optimality over single-objective methods and emphasizes the role of FogTorch in this context. Initial results show improvements in IoT deployment cost in Fog systems, promoting resource-efficient strategies.
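The Pareto-optimality criterion the abstract relies on can be sketched with a simple dominance filter over candidate deployments; this is illustrative only (FogTorch's actual backtracking/branch-and-bound search is more involved), and the objective values are hypothetical:

```python
def pareto_front(points):
    """Return the Pareto-optimal subset of `points`.

    Each point is a tuple of objectives to be minimised (e.g. cost, latency).
    A point is dominated if some other point is <= in every objective and
    strictly < in at least one.
    """
    front = []
    for p in points:
        dominated = any(
            all(o <= c for o, c in zip(q, p)) and any(o < c for o, c in zip(q, p))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (cost, latency) pairs for candidate Fog deployments
deployments = [(10, 5), (8, 7), (12, 4), (9, 9), (8, 6)]
print(pareto_front(deployments))
```

No single-objective ranking could recover this set: the surviving points trade cost against latency, which is exactly what the abstract argues single-objective methods miss.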

Keywords: pareto optimality, fog application deployment, resource allocation, internet of things

Procedia PDF Downloads 74
2028 Review and Comparison of Associative Classification Data Mining Approaches

Authors: Suzan Wedyan

Abstract:

Data mining is one of the main phases of Knowledge Discovery in Databases (KDD), responsible for finding hidden and useful knowledge in databases. Data mining comprises many different tasks, including regression, pattern recognition, clustering, classification, and association rule mining. In recent years, a promising data mining approach called associative classification (AC) has been proposed; AC integrates classification and association rule discovery to build classification models (classifiers). This paper surveys and critically compares several AC algorithms with reference to the different procedures used in each algorithm, such as rule learning, rule sorting, rule pruning, classifier building, and class allocation for test cases.

Keywords: associative classification, classification, data mining, learning, rule ranking, rule pruning, prediction

Procedia PDF Downloads 532
2027 Decision Tree Modeling in Emergency Logistics Planning

Authors: Yousef Abu Nahleh, Arun Kumar, Fugen Daver, Reham Al-Hindawi

Abstract:

Despite the availability of natural-disaster-related time series data for the last 110 years, humanitarian relief organizations have no forecasting tool for emergency logistics planning. This study develops such a tool by identifying the probability of disaster for each country in the world using decision tree modeling. Further, the determination of aggregate forecasts leads to efficient pre-disaster planning. Based on the research findings, relief agencies can optimize resource allocation in emergency logistics planning.

Keywords: decision tree modeling, forecasting, humanitarian relief, emergency supply chain

Procedia PDF Downloads 477
2026 Adaptive Energy-Aware Routing (AEAR) for Optimized Performance in Resource-Constrained Wireless Sensor Networks

Authors: Innocent Uzougbo Onwuegbuzie

Abstract:

Wireless Sensor Networks (WSNs) are crucial for numerous applications, yet they face significant challenges due to resource constraints such as limited power and memory. Traditional routing algorithms like Dijkstra, Ad hoc On-Demand Distance Vector (AODV), and Bellman-Ford, while effective in path establishment and discovery, are not optimized for the unique demands of WSNs due to their large memory footprint and power consumption. This paper introduces the Adaptive Energy-Aware Routing (AEAR) model, a solution designed to address these limitations. AEAR integrates reactive route discovery, localized decision-making using geographic information, energy-aware metrics, and dynamic adaptation to provide a robust and efficient routing strategy. We present a detailed comparative analysis using a dataset of 50 sensor nodes, evaluating power consumption, memory footprint, and path cost across AEAR, Dijkstra, AODV, and Bellman-Ford algorithms. Our results demonstrate that AEAR significantly reduces power consumption and memory usage while optimizing path weight. This improvement is achieved through adaptive mechanisms that balance energy efficiency and link quality, ensuring prolonged network lifespan and reliable communication. The AEAR model's superior performance underlines its potential as a viable routing solution for energy-constrained WSN environments, paving the way for more sustainable and resilient sensor network deployments.
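As an illustration of the general idea (not the AEAR algorithm itself, whose metric the abstract does not fully specify), a Dijkstra search can be run over a blended energy-aware edge weight; the weighting `alpha`, the graph, and the residual-energy values below are all hypothetical:

```python
import heapq

def energy_aware_path(graph, residual, src, dst, alpha=0.5):
    """Dijkstra over a hypothetical energy-aware metric.

    graph: {node: {neighbour: transmit_energy}}
    residual: {node: residual battery energy}
    Each edge weight blends transmit energy with the inverse residual
    energy of the receiver, so routes avoid nearly-depleted nodes.
    """
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, tx in graph.get(u, {}).items():
            nd = d + alpha * tx + (1 - alpha) / residual[v]
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    return list(reversed(path)), dist[dst]

graph = {"A": {"B": 1.0, "C": 1.0}, "B": {"D": 1.0}, "C": {"D": 1.0}}
residual = {"A": 10.0, "B": 10.0, "C": 1.0, "D": 10.0}
path, cost = energy_aware_path(graph, residual, "A", "D")
print(path, cost)
```

With equal transmit energies, the route through the nearly-depleted node C is avoided, which is the kind of energy/link-quality balancing the abstract describes.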

Keywords: wireless sensor networks (WSNs), adaptive energy-aware routing (AEAR), routing algorithms, energy efficiency, network lifespan

Procedia PDF Downloads 28
2025 Development and Characterization of Self-Nanoemulsifying Drug Delivery Systems of the Poorly Soluble Drug Dutasteride

Authors: Rajinikanth Siddalingam, Poonguzhali Subramanian

Abstract:

The present study aims to prepare and evaluate a self-nanoemulsifying drug delivery system (SNEDDS) to enhance the dissolution rate of the poorly soluble drug dutasteride. The formulation was prepared using Capryol PGMC, Cremophor EL, and polyethylene glycol (PEG) 400 as oil, surfactant, and co-surfactant, respectively. Pseudo-ternary phase diagrams with and without the drug were plotted to establish the nanoemulsification range and to evaluate the effect of dutasteride on the emulsification behavior of the phases. The prepared SNEDDS formulations were evaluated for particle size distribution, nanoemulsifying properties, robustness to dilution, self-emulsification time, turbidity, drug content, and in vitro dissolution. The optimized formulations were further subjected to heating-cooling cycles, centrifugation, and freeze-thaw cycling, and particle size distribution and zeta potential were measured to confirm the stability of the formed SNEDDS formulations. The particle size, zeta potential, and polydispersity index of the optimized formulation were found to be 35.45 nm, -15.45 mV, and 0.19, respectively. The in vitro results revealed that the prepared formulation significantly enhanced the dissolution rate of dutasteride compared with the pure drug. In vivo studies conducted in rats revealed that the SNEDDS formulation significantly enhanced the bioavailability of dutasteride compared with the raw drug. Based on these results, it was concluded that dutasteride-loaded SNEDDS shows potential to enhance the dissolution of dutasteride, thus improving its bioavailability and therapeutic effects.

Keywords: self-emulsifying drug delivery system, dutasteride, enhancement of bioavailability, dissolution enhancement

Procedia PDF Downloads 264
2024 Optimization of Geometric Parameters of Microfluidic Channels for Flow-Based Studies

Authors: Parth Gupta, Ujjawal Singh, Shashank Kumar, Mansi Chandra, Arnab Sarkar

Abstract:

Microfluidic devices have emerged as indispensable tools across various scientific disciplines, offering precise control and manipulation of fluids at the microscale. Their efficacy in flow-based research, spanning engineering, chemistry, and biology, relies heavily on the geometric design of microfluidic channels. This work introduces a novel approach to optimising these channels through Response Surface Methodology (RSM), departing from the conventional practice of addressing one parameter at a time. Traditionally, optimising microfluidic channels involved isolated adjustments to individual parameters, limiting the comprehensive understanding of their combined effects. In contrast, our approach considers the simultaneous impact of multiple parameters, employing RSM to efficiently explore the complex design space. The outcome is an innovative microfluidic channel that consumes an optimal sample volume and minimises flow time, enhancing overall efficiency. The relevance of geometric parameter optimisation in microfluidic channels extends significantly into biomedical engineering. The flow characteristics of porous materials within these channels depend on many factors, including fluid viscosity, environmental conditions (such as temperature and humidity), and specific design parameters like sample volume, channel width, channel length, and substrate porosity. This intricate interplay directly influences the performance and efficacy of microfluidic devices, which, if not optimised, can lead to increased costs and errors in disease testing and analysis. In the context of biomedical applications, the proposed approach addresses the critical need for precision in fluid flow. It mitigates the manufacturing costs associated with trial-and-error methodologies by optimising multiple geometric parameters concurrently. The resulting microfluidic channels offer enhanced performance and contribute to a streamlined, cost-effective process for testing and analysing diseases.
A key highlight of our methodology is its consideration of the interconnected nature of geometric parameters. For instance, the volume of the sample, when optimized alongside channel width, length, and substrate porosity, creates a synergistic effect that minimizes errors and maximizes efficiency. This holistic optimization approach ensures that microfluidic devices operate at their peak performance, delivering reliable results in disease testing.
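The RSM workflow the abstract describes, fitting a second-order surface over the design factors and then locating its stationary point, can be sketched with ordinary least squares; the design points and response values below are illustrative, not the paper's data:

```python
import numpy as np

# Hypothetical central-composite-style design over two coded factors
# (channel width, channel length) and a measured response (flow time).
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0],
              [1.4, 0], [-1.4, 0], [0, 1.4], [0, -1.4]])
y = np.array([12.0, 10.0, 11.0, 9.5, 8.0, 9.0, 10.5, 9.2, 9.8])

def quadratic_design_matrix(X):
    """Full second-order model: intercept, linear, interaction, quadratic."""
    w, l = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), w, l, w * l, w**2, l**2])

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)

# Stationary point of the fitted surface: solve grad(y) = 0
b = beta[1:3]
B = np.array([[2 * beta[4], beta[3]],
              [beta[3], 2 * beta[5]]])
optimum = np.linalg.solve(B, -b)
print(optimum)  # coded factor settings of the fitted optimum
```

Because the quadratic coefficients here come out positive, the stationary point is a minimum of the fitted flow time, i.e. the simultaneous multi-parameter optimum that one-factor-at-a-time tuning would miss.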

Keywords: microfluidic device, minitab, statistical optimization, response surface methodology

Procedia PDF Downloads 55
2023 High Sensitivity Crack Detection and Locating with Optimized Spatial Wavelet Analysis

Authors: A. Ghanbari Mardasi, N. Wu, C. Wu

Abstract:

In this study, a spatial wavelet-based crack localization technique for a thick beam is presented. Wavelet scale in spatial wavelet transformation is optimized to enhance crack detection sensitivity. A windowing function is also employed to erase the edge effect of the wavelet transformation, which enables the method to detect and localize cracks near the beam/measurement boundaries. Theoretical model and vibration analysis considering the crack effect are first proposed and performed in MATLAB based on the Timoshenko beam model. Gabor wavelet family is applied to the beam vibration mode shapes derived from the theoretical beam model to magnify the crack effect so as to locate the crack. Relative wavelet coefficient is obtained for sensitivity analysis by comparing the coefficient values at different positions of the beam with the lowest value in the intact area of the beam. Afterward, the optimal wavelet scale corresponding to the highest relative wavelet coefficient at the crack position is obtained for each vibration mode, through numerical simulations. The same procedure is performed for cracks with different sizes and positions in order to find the optimal scale range for the Gabor wavelet family. Finally, Hanning window is applied to different vibration mode shapes in order to overcome the edge effect problem of wavelet transformation and its effect on the localization of crack close to the measurement boundaries. Comparison of the wavelet coefficients distribution of windowed and initial mode shapes demonstrates that window function eases the identification of the cracks close to the boundaries.
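The relative-wavelet-coefficient idea can be illustrated on a toy one-dimensional mode shape, with a sine plus a small slope discontinuity standing in for the cracked beam; this is a simplified sketch, not the paper's Timoshenko-beam model, and all values are invented:

```python
import numpy as np

def gabor_wavelet(n=81, sigma=10.0, f0=0.08):
    """Real Gabor (Gaussian-windowed cosine) kernel, forced to zero mean.

    Zero mean suppresses smooth trends in the signal, so only local
    irregularities such as a slope discontinuity produce a response.
    """
    t = np.arange(n) - n // 2
    w = np.exp(-t**2 / (2 * sigma**2)) * np.cos(2 * np.pi * f0 * t)
    return w - w.mean()

# Hypothetical first mode shape on 501 points with a small kink ("crack")
xs = np.linspace(0.0, 1.0, 501)
crack_pos = 0.63
intact = np.sin(np.pi * xs)
cracked = intact + 0.01 * np.abs(xs - crack_pos)

g = gabor_wavelet()
# Relative coefficients: subtracting the intact response isolates the
# crack signature, in the spirit of the abstract's relative coefficient.
rel = np.convolve(cracked, g, mode="same") - np.convolve(intact, g, mode="same")

margin = len(g) // 2 + 1  # skip the edge-effect region near the boundaries
interior = np.abs(rel[margin:-margin] - rel[margin])
peak = margin + int(np.argmax(interior))
print(xs[peak])  # location of the strongest coefficient, near the crack
```

The `margin` crudely plays the role of the Hanning window in the abstract: without it, truncated convolution sums at the measurement boundaries would swamp the crack signature.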

Keywords: edge effect, scale optimization, small crack locating, spatial wavelet

Procedia PDF Downloads 356
2022 The Impact of Bitcoin on Stock Market Performance

Authors: Oliver Takawira, Thembi Hope

Abstract:

This study will analyse the relationship between Bitcoin price movements and the Johannesburg Stock Exchange (JSE). The aim is to determine whether Bitcoin price movements affect stock market performance. As cryptocurrencies continue to gain prominence as a safe asset during periods of economic distress, the question arises of whether Bitcoin’s prosperity could affect investment in the stock market. To identify the existence of a short-run and long-run linear relationship, the study will apply the Autoregressive Distributed Lag (ARDL) bounds test and a Vector Error Correction Model (VECM) after testing the data for unit roots and cointegration using the Augmented Dickey-Fuller (ADF) and Phillips-Perron (PP) tests. The Non-Linear Autoregressive Distributed Lag (NARDL) model will then be used to check whether there is a non-linear relationship between Bitcoin prices and stock market prices.
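The unit-root step can be sketched with a minimal Dickey-Fuller regression in plain numpy; this omits lag augmentation, so it is a simplification of the ADF test the study applies (statsmodels' `adfuller` would be the full version), and the series below are simulated:

```python
import numpy as np

def dickey_fuller_t(y):
    """t-statistic for rho in: diff(y)_t = const + rho * y_{t-1} + e_t.

    A statistic more negative than roughly -2.86 (the 5% Dickey-Fuller
    critical value for the constant-only case) rejects a unit root.
    No lag augmentation here -- a minimal sketch, not a full ADF test.
    """
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(500))  # simulated unit-root series
ar1 = np.zeros(500)                         # simulated stationary AR(1)
for t in range(1, 500):
    ar1[t] = 0.5 * ar1[t - 1] + rng.standard_normal()
print(dickey_fuller_t(walk), dickey_fuller_t(ar1))
```

The stationary AR(1) series produces a strongly negative statistic, while the random walk typically does not, which is the distinction the ADF/PP pre-tests in the abstract are there to draw.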

Keywords: bitcoin, stock market, interest rates, ARDL

Procedia PDF Downloads 97
2021 Employee Branding: An Exploratory Study Applied to Nurses in an Organization

Authors: Pawan Hinge, Priya Gupta

Abstract:

Due to intense competition between organizations and the war for talent, the workforce as an asset is gaining significance. Employees are considered the brand ambassadors of an organization, and their interactions with clients and customers may directly or indirectly affect the overall value of the organization. Especially for organizations in the healthcare industry, the value of an organization as perceived by its employees can serve as a revenue-generating and talent-retention strategy. In this context, it is essential to understand that brand awareness among employees can affect employer brand image and brand value, since brand ambassadors are the interface between the organization and its customers and clients. In this exploratory study, we adopted both quantitative and qualitative approaches to data analysis. Our study shows variation among nurses working in different business units of the same organization in terms of their customer interface or interactions and brand awareness.

Keywords: brand awareness, brand image, brand value, customer interface

Procedia PDF Downloads 281
2020 Evidence on Scale Economies in National Bank of Pakistan

Authors: Sohail Zafar, Sardar Javaid Iqbal Khan

Abstract:

We use a parametric approach within a translog cost function framework to estimate economies of scale in the National Bank of Pakistan from 1997 to 2013. The results indicate significant economies of scale throughout the sample, at both aggregate and disaggregate levels, taking into account size conditional on ownership. Factor markets often produce scale inefficiencies in the banking sectors of developing countries like Pakistan; such inefficiencies are common because distortions in factor markets lead to the use of inappropriate factor proportions. The findings suggest that the National Bank of Pakistan should diversify its asset portfolio where it has a cost advantage; expansion in size should be encouraged under current technology because it appears to be cost-effective. In addition, our findings support the implementation of the universal banking model in Pakistan.
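In the single-output case, the translog machinery reduces to regressing log cost on log output, its square, and input-price terms; scale economies are then the reciprocal of the fitted cost elasticity of output, with values above one indicating economies of scale. A sketch on synthetic data (not the bank's), with the paper's full multi-output specification simplified away:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
lnQ = rng.uniform(2, 6, n)   # log output (synthetic)
lnW = rng.uniform(0, 1, n)   # log input price (synthetic)
# Synthetic log costs with a built-in output elasticity below one
lnC = 1.0 + 0.6 * lnQ + 0.02 * lnQ**2 + 0.4 * lnW \
      + 0.05 * rng.standard_normal(n)

# Simplified translog: ln C = a0 + a1 lnQ + (a2/2) lnQ^2 + a3 lnW
X = np.column_stack([np.ones(n), lnQ, 0.5 * lnQ**2, lnW])
beta, *_ = np.linalg.lstsq(X, lnC, rcond=None)

# Cost elasticity of output at the sample mean; SE = 1/elasticity
elasticity = beta[1] + beta[2] * lnQ.mean()
scale_economies = 1.0 / elasticity
print(scale_economies)
```

Here the fitted elasticity is below one, so `scale_economies` exceeds one, mirroring the paper's finding that expansion appears cost-effective.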

Keywords: scale economies, cost function, disaggregates, aggregates

Procedia PDF Downloads 317
2019 The Invisible Asset Influence on Corporate Performance: A Case Study

Authors: Hassan Medaghri Alaoui

Abstract:

The accounting and financial reporting system in use today is over 500 years old and has failed to capture the new knowledge and innovation economy, in which intangible assets are becoming increasingly valuable. Yet there has been growing acknowledgment among the research community of the relevance of intellectual capital (IC) as a major enhancer of an organization’s well-being. Much of the research provides strong support for how IC is instrumental in determining financial and stock performance. As far as we know, this article is one of the earliest exploratory attempts to examine the impact of intellectual capital on corporate performance in the IT sector in Morocco. The purpose of this study is to verify empirically the influence of intellectual capital on firm performance. We have undertaken a fifteen-year longitudinal (2005–2019) case study of a prominent payment-solutions company based in a developing economy with global operations.

Keywords: intellectual capital, IT sector, measuring intellectual capital, modified value added intellectual capital coefficient, Morocco

Procedia PDF Downloads 114
2018 Seismic Retrofits – A Catalyst for Minimizing the Building Sector’s Carbon Footprint

Authors: Juliane Spaak

Abstract:

A life-cycle assessment was performed, looking at seven retrofit projects in New Zealand using LCAQuickV3.5. The study found that retrofits save up to 80% of embodied carbon emissions for the structural elements compared to a new building. In other words, it is only a 20% carbon investment to transform and extend a building’s life. In addition, the systems were evaluated by looking at environmental impacts over the design life of these buildings and resilience using FEMA P58 and PACT software. With the increasing interest in Zero Carbon targets, significant changes in the building and construction sector are required. Emissions for buildings arise from both embodied carbon and operations. Based on the significant advancements in building energy technology, the focus is moving more toward embodied carbon, a large portion of which is associated with the structure. Since older buildings make up most of the real estate stock of our cities around the world, their reuse through structural retrofit and wider refurbishment plays an important role in extending the life of a building’s embodied carbon. New Zealand’s building owners and engineers have learned a lot about seismic issues following a decade of significant earthquakes. Recent earthquakes have brought to light the necessity to move away from constructing code-minimum structures that are designed for life safety but are frequently ‘disposable’ after a moderate earthquake event, especially in relation to a structure’s ability to minimize damage. This means weaker buildings sit as ‘carbon liabilities’, with considerably more carbon likely to be expended remediating damage after a shake. Renovating and retrofitting older assets plays a big part in reducing the carbon profile of the buildings sector, as breathing new life into a building’s structure is vastly more sustainable than the highest quality ‘green’ new builds, which are inherently more carbon-intensive. 
The demolition of viable older buildings (often including heritage buildings) is increasingly at odds with society’s desire for a lower-carbon economy. Bringing seismic resilience and carbon best practice together in decision-making can open the door to commercially attractive outcomes, with retrofits that include structural and sustainability upgrades transforming the asset’s revenue generation. Across the global real estate market, tenants are increasingly demanding that the buildings they occupy be resilient and aligned with their own climate targets. The relationship between seismic performance and ‘sustainable design’ has yet to fully mature, yet in a wider context it is of profound consequence. A whole-of-life carbon perspective on a building means designing for the likely natural hazards within the asset’s expected lifespan, be they earthquakes, storms, bushfires, and so on, with financial mitigation (e.g., insurance) part, but not all, of the picture.

Keywords: retrofit, sustainability, earthquake, reuse, carbon, resilient

Procedia PDF Downloads 65
2017 Energy Efficiency Analysis of Crossover Technologies in Industrial Applications

Authors: W. Schellong

Abstract:

Industry accounts for one-third of global final energy demand. Crossover technologies (e.g. motors, pumps, process heat, and air conditioning) play an important role in improving energy efficiency. These technologies are used in many applications independent of the production branch; electrical power in particular is used by drives, pumps, compressors, and lighting. The paper demonstrates the algorithm of the energy analysis through selected case studies of typical industrial processes. The energy analysis represents an essential part of an energy management system (EMS). Generally, a process control system (PCS) can support the EMS: it provides information about the production process and organizes maintenance actions. Combining these tools into an integrated process allows the development of an energy-critical-equipment strategy. Thus, asset and energy management can use the same common data to improve energy efficiency.

Keywords: crossover technologies, data management, energy analysis, energy efficiency, process control

Procedia PDF Downloads 204
2016 Integrating Multiple Types of Value in Natural Capital Accounting Systems: Environmental Value Functions

Authors: Pirta Palola, Richard Bailey, Lisa Wedding

Abstract:

Societies and economies worldwide fundamentally depend on natural capital. Alarmingly, natural capital assets are quickly depreciating, posing an existential challenge for humanity. The development of robust natural capital accounting systems is essential for transitioning towards sustainable economic systems and ensuring sound management of capital assets. However, the accurate, equitable and comprehensive estimation of natural capital asset stocks and their accounting values still faces multiple challenges. In particular, the representation of socio-cultural values held by groups or communities has arguably been limited, as to date, the valuation of natural capital assets has primarily been based on monetary valuation methods and assumptions of individual rationality. People relate to and value the natural environment in multiple ways, and no single valuation method can provide a sufficiently comprehensive image of the range of values associated with the environment. Indeed, calls have been made to improve the representation of multiple types of value (instrumental, intrinsic, and relational) and diverse ontological and epistemological perspectives in environmental valuation. This study addresses this need by establishing a novel valuation framework, Environmental Value Functions (EVF), that allows for the integration of multiple types of value in natural capital accounting systems. The EVF framework is based on the estimation and application of value functions, each of which describes the relationship between the value and quantity (or quality) of an ecosystem component of interest. In this framework, values are estimated in terms of change relative to the current level instead of calculating absolute values. Furthermore, EVF was developed to also support non-marginalist conceptualizations of value: it is likely that some environmental values cannot be conceptualized in terms of marginal changes. 
For example, ecological resilience value may, in some cases, be best understood as a binary: it either exists (1) or is lost (0). In such cases, a logistic value function may be used as the discriminator. Uncertainty in the value function parameterization can be considered through, for example, Monte Carlo sampling analysis. The use of EVF is illustrated with two conceptual examples. For the first time, EVF offers a clear framework and concrete methodology for the representation of multiple types of value in natural capital accounting systems, simultaneously enabling 1) the complementary use and integration of multiple valuation methods (monetary and non-monetary); 2) the synthesis of information from diverse knowledge systems; 3) the recognition of value incommensurability; 4) marginalist and non-marginalist value analysis. Furthermore, with this advancement, the coupling of EVF and ecosystem modeling can offer novel insights to the study of spatial-temporal dynamics in natural capital asset values. For example, value time series can be produced, allowing for the prediction and analysis of volatility, long-term trends, and temporal trade-offs. This approach can provide essential information to help guide the transition to a sustainable economy.
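The near-binary resilience example can be made concrete with a logistic value function whose threshold is sampled by Monte Carlo, as the abstract suggests for handling parameter uncertainty; the ecosystem component (coral cover) and all parameter values below are hypothetical:

```python
import numpy as np

def logistic_value(q, midpoint, steepness):
    """Near-binary value function: value ~0 below a threshold, ~1 above.

    With a large `steepness`, this approximates the abstract's binary
    resilience value (exists = 1, lost = 0) while staying differentiable.
    """
    return 1.0 / (1.0 + np.exp(-steepness * (q - midpoint)))

# Hypothetical: the collapse threshold (midpoint) is uncertain, so sample
# it (Monte Carlo) and propagate the uncertainty into the value estimate.
rng = np.random.default_rng(42)
midpoints = rng.normal(loc=0.3, scale=0.05, size=10_000)

cover = 0.45  # current level of the ecosystem component (fraction)
values = logistic_value(cover, midpoints, steepness=40.0)
print(values.mean(), values.std())  # expected value and its uncertainty
```

Evaluating the same function along a time series of `cover` would give the value time series the abstract envisages for analysing volatility and long-term trends.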

Keywords: economics of biodiversity, environmental valuation, natural capital, value function

Procedia PDF Downloads 189
2015 Calibration of Hybrid Model and Arbitrage-Free Implied Volatility Surface

Authors: Kun Huang

Abstract:

This paper investigates whether the combination of local and stochastic volatility models can be calibrated exactly to any arbitrage-free implied volatility surface of European options. The risk-neutral Brownian Bridge density is applied to calibrate the leverage function of our hybrid model. Furthermore, the tails of the marginal risk-neutral density are generated by a Generalized Extreme Value distribution in order to capture the properties of asset returns. The local volatility is generated from the arbitrage-free implied volatility surface using the stochastic-volatility-inspired (SVI) parameterization.
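The GEV tail construction rests on the distribution's closed-form quantile function; a minimal sketch follows, with illustrative parameters rather than anything calibrated to a real surface:

```python
import numpy as np

def gev_quantile(p, mu=0.0, sigma=1.0, xi=0.2):
    """Inverse CDF of the Generalized Extreme Value distribution.

    CDF: F(x) = exp(-(1 + xi*(x - mu)/sigma)**(-1/xi)) for xi != 0.
    A shape parameter xi > 0 gives a heavy right tail, the case relevant
    to fat-tailed asset returns.
    """
    p = np.asarray(p, dtype=float)
    if xi == 0.0:  # Gumbel limit
        return mu - sigma * np.log(-np.log(p))
    return mu + sigma / xi * ((-np.log(p)) ** -xi - 1.0)

# The heavy tail: far quantiles grow much faster than the median
print(gev_quantile(0.5), gev_quantile(0.999))
```

Feeding uniform draws through `gev_quantile` (inverse-transform sampling) generates tail samples for the marginal risk-neutral density; `scipy.stats.genextreme` offers an equivalent, battle-tested implementation (note its sign convention for the shape parameter differs).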

Keywords: arbitrage free implied volatility, calibration, extreme value distribution, hybrid model, local volatility, risk-neutral density, stochastic volatility

Procedia PDF Downloads 263
2014 Biosafety Study of Genetically Modified CEMB Sugarcane on Animals for Glyphosate Tolerance

Authors: Aminah Salim, Idrees Ahmed Nasir, Abdul Qayyum Rao, Muhammad Ali, Muhammad Sohail Anjum, Ayesha Hameed, Bushra Tabassum, Anwar Khan, Arfan Ali, Mariyam Zameer, Tayyab Husnain

Abstract:

A risk assessment of transgenic herbicide-tolerant sugarcane carrying the CEMB codon-optimized cp4EPSPS gene was performed in the present study. Fifteen-day-old chicks obtained from the K&Ns Company were randomly assorted into four groups of eight: a control group fed a commercial diet, a non-transgenic group fed non-transgenic sugarcane, and two transgenic groups fed transgenic sugarcane at minimum and maximum levels. Body weight, biochemical analyses (urea, alkaline phosphatase, alanine transferase, aspartate transferase, creatinine, and bilirubin), and histological examinations of the chicks fed the four types of feed were recorded at fifteen-day intervals, and no significant difference was observed among the four groups in body weight or in the biochemical and histological studies. Protein isolated from serum samples was analyzed by dipstick and SDS-PAGE, showing the absence of the transgene protein in the serum of both control and experimental groups. Moreover, amplification of the cp4EPSPS gene with gene-specific primers was attempted on DNA isolated from chick blood and from the commercial diet to determine the presence and mobility of any nucleotide fragment of the transgene in or from the feed, and no amplification was obtained in the feed or in the blood-extracted DNA of any group. Likewise, no mRNA expression of the cp4EPSPS gene was detected in any tissue of the four groups of chicks. From these results it is clear that the CEMB codon-optimized transgenic cp4EPSPS sugarcane has no deleterious or harmful effect on chick health.

Keywords: chicks, cp4EPSPS, glyphosate, sugarcane

Procedia PDF Downloads 366
2013 Mitigating the Vulnerability of Subsistence Farmers through Ground Water Optimisation

Authors: Olayemi Bakre

Abstract:

The majoritant of the South African rural populace are directly or indirectly engaged in agricultural practices for a livelihood. However, impediments such as the climate change and inadequacy of governmental support has undermined the once thriving subsistence farming communities of South Africa. Furthermore, the poor leadership in hydrology, coupled with lack of depths in skills to facilitate the understanding and acceptance of groundwater from national level to local governance has made it near impossible for subsistence farmers to optimally benefit from the groundwater beneath their feet. The 2012 drought experienced in South Africa paralysed the farming activities across several subsistence farming communities across the KwaZulu-Natal Province. To revamp subsistence farming, a variety of interventions and strategies such as the Resource Poor Farmers (RPF) and Water Allocation Reforms (WAR) have been launched by the Department of Water and Sanitation (DWS) as an agendum to galvanising the defunct subsistence farming communities of KwaZulu-Natal as well as other subsistence farming communities across South Africa. Despite the enormous resources expended on the subsistence farming communities whom often fall under the Historically Disadvantaged Individuals (HDI); indicators such as the unsustainable farming practices, poor crop yield, pitiable living condition as well as the poor standard of living, are evidential to the claim that these afore cited interventions and a host of other similar strategies indicates that these initiatives have not yield the desired result. Thus, this paper seeks to suggest practicable interventions aimed at salvaging the vulnerability of subsistence farmers within the province understudy. The study pursued a qualitative approach as the view of experts on ground water and similarly related fields from the DWS were solicited as an agendum to obtaining in-depth perspective into the current study. 
Some of the core challenges undermining the sustainability and growth of subsistence farming in the study area were the inadequacy of groundwater experts (engineers, scientists, researchers); water shortages; a lack of political will; and a lack of coordination among stakeholders. To optimise groundwater usage for subsistence farming, this paper advocates the strengthening of geohydrological skills, the development of technical training capacity, interactive participation among stakeholders, and the initiation of Participatory Action Research, with the goal of making the best use of the available groundwater in KwaZulu-Natal and establishing sustainable and viable subsistence farming practices within the province.

Keywords: subsistence farming, ground water optimisation, resource poor farmers, water allocation reforms, hydrology

Procedia PDF Downloads 241
2012 Approaching a Tat-Rev Independent HIV-1 Clone towards a Model for Research

Authors: Walter Vera-Ortega, Idoia Busnadiego, Sam J. Wilson

Abstract:

Introduction: Human Immunodeficiency Virus type 1 (HIV-1) is responsible for acquired immunodeficiency syndrome (AIDS), a leading cause of death worldwide, infecting millions of people each year. Despite intensive research into vaccine development, therapies against HIV-1 infection are not curative, and the huge genetic variability of HIV-1 challenges drug development. Current animal models for HIV-1 research have important limitations that impair the progress of in vivo approaches. Macaques require CD8+ depletion to progress to AIDS, and their maintenance cost is high. Mice are a cheaper alternative but need to be 'humanized,' and breeding is not possible. The development of an HIV-1 clone able to replicate in mice is a challenging proposal. The lack of human co-factors in mice impedes the function of the HIV-1 accessory proteins Tat and Rev, hampering HIV-1 replication. However, Tat and Rev function can be replaced by constitutive/chimeric promoters, codon-optimized proteins and the constitutive transport element (CTE), generating a novel HIV-1 clone able to replicate in mice without disrupting the amino acid sequence of the virus. By minimally manipulating the genomic 'identity' of the virus, we propose the generation of an HIV-1 clone able to replicate in mice to assist in antiviral drug development. Methods: i) Plasmid construction: the chimeric promoters and CTE copies were cloned by PCR using lentiviral vectors as templates (pCGSW and pSIV-MPCG). Tat mutants were generated from replication-competent HIV-1 plasmids (NHG and NL4-3). ii) Infectivity assays: retroviral vectors were generated by transfection of human 293T cells and murine NIH 3T3 cells. Virus titre was determined by flow cytometry measuring GFP expression. Human B-cells (AA-2) and HeLa cells (TZMbl) were used for infectivity assays. iii) Protein analysis: Tat protein expression was determined by TZMbl assay and HIV-1 capsid by western blot.
Results: We have determined that NIH 3T3 cells are able to generate HIV-1 particles. However, these particles are not infectious, and further analysis needs to be performed. Codon-optimized HIV-1 constructs are efficiently made in 293T cells in a Tat- and Rev-independent manner and are capable of packaging a competent genome in trans. CSGW is capable of generating infectious particles in the absence of Tat and Rev in human cells when four copies of the CTE are placed preceding the 3’LTR. HIV-1 Tat mutant clones encoding different promoters are functional during the first cycle of replication when Tat is added in trans. Conclusion: Our findings suggest that the development of a Tat-Rev-independent HIV-1 clone is a challenging but achievable aim. However, further investigation is needed before presenting our HIV-1 clone as a candidate model for research.

Keywords: codon-optimized, constitutive transport element, HIV-1, long terminal repeats, research model

Procedia PDF Downloads 303
2011 Can Sustainability Help Achieve Social Justice?

Authors: Maryam Davodi-Far

Abstract:

Although sustainability offers a vision of preserving the earth’s resources while sustaining life on earth, there tends to be injustice and disparity in how resources are allocated across the globe. As such, the question that arises is: who will sustainability benefit? Will the rich grow richer and the poor become worse off? Is there a way to balance sustainability with the implementation and achievement of distributive justice? One facet of justice is distributive justice: the idea of balancing the benefits and costs associated with the way in which we disseminate and consume goods. Social justice depends on whether the costs and burdens of resource allocation can be spread reasonably and equitably across societies, and within each society across diverse groups and communities. In the end, the question is how to interact with the environment and with the diverse communities of today and of the future.

Keywords: consumerism, sustainability, sustainable development, social justice, social equity, distributive justice

Procedia PDF Downloads 399
2010 Contemporary Terrorism: Root Causes and Misconceptions

Authors: Thomas Slunecko Karat

Abstract:

The years since September 11, 2001 have given us a plethora of research papers with the word ‘terrorism’ in the title. Yet only a small subset of these papers has produced new data, which explains why more than 20 years of research since 9/11 have done little to increase our understanding of the mechanisms that lead to terrorism. Specifically, terrorism scholars are divided by political, temporal, geographical and financial demarcation lines that prevent a clear definition of terrorism. As a consequence, the true root causes of terrorism remain unexamined. Instead, the psychopathological conditions of the individual have been emphasized, despite ample empirical evidence pointing in a different direction. This paper examines the underlying reasons and motives that prevent open discourse about the root causes of terrorism, and proposes that terrorism is linked to the current international system of resource allocation and to systematic violations of human rights.

Keywords: terrorism, root causes of terrorism, prevention of terrorism, racism, human rights violations

Procedia PDF Downloads 88
2009 Design and Fabrication of Stiffness Reduced Metallic Locking Compression Plates through Topology Optimization and Additive Manufacturing

Authors: Abdulsalam A. Al-Tamimi, Chris Peach, Paulo Rui Fernandes, Paulo J. Bartolo

Abstract:

Bone fixation implants currently used to treat traumatic bone fractures and to promote fracture healing are built from biocompatible metallic materials such as stainless steel, cobalt chromium, and titanium and their alloys (e.g., CoCrMo and Ti6Al4V). The noticeable stiffness mismatch between current metallic implants and the host bone is associated with negative outcomes such as stress shielding, which causes bone loss and implant loosening, leading to deficient fracture treatment. This paper, part of a major research programme to design the next generation of bone fixation implants, describes the combined use of three-dimensional (3D) topology optimization (TO) and additive manufacturing powder bed technology (electron beam melting) to redesign and fabricate plates based on the current standard design, the locking compression plate. Topology optimization is applied with an objective function that maximizes stiffness, subject to volume reduction constraints (25-75%), in order to obtain optimized implant designs with a reduced stress shielding effect under different boundary conditions (tension, bending, torsion and combined loads). The stiffness of the original and optimized plates is assessed through a finite element study. The TO results showed an actual reduction in stiffness for most of the plates, owing to the imposed volume reductions. Additionally, the optimized plates fabricated using powder bed techniques demonstrated that integrating TO and additive manufacturing makes it possible to produce stiffness-reduced plates with acceptable tolerances.

Keywords: additive manufacturing, locking compression plate, finite element, topology optimization

Procedia PDF Downloads 196
2008 Calculation of Electronic Structures of Nickel in Interaction with Hydrogen by Density Functional Theoretical (DFT) Method

Authors: Choukri Lekbir, Mira Mokhtari

Abstract:

Hydrogen-material interactions and mechanisms can be modeled at the nanoscale by quantum methods. In this work, the effect of hydrogen on the electronic properties of a cluster model of nickel has been studied using the density functional theory (DFT) method. Two types of clusters were optimized: nickel alone and the hydrogen-nickel system. For nickel clusters (n = 1-6) without hydrogen, three types of electronic structures (neutral, cationic and anionic) were optimized using three basis set calculations (B3LYP/LANL2DZ, PW91PW91/DGDZVP2, PBE/DGDZVP2). Comparison of the binding energies and bond lengths of the three nickel cluster structures (neutral, cationic and anionic) obtained with these basis sets shows that the results for the neutral and anionic nickel clusters are in good agreement with experiment. For the neutral and anionic clusters, comparing the energies and bond lengths obtained with the three basis sets shows that PBE/DGDZVP2 agrees best with the experimental results. For anionic nickel clusters (n = 1-6) in the presence of hydrogen, optimization of the anionic hydrogen-nickel structures with the PBE/DGDZVP2 basis set shows that the binding energies and bond lengths increase compared to those obtained for anionic nickel clusters without hydrogen. This reveals the armoring effect exerted by hydrogen on the electronic structure of nickel, attributed to the storage of hydrogen energy within the nickel cluster structures. Comparison of the bond lengths of the two cluster types shows an expansion of the cluster geometry due to the presence of hydrogen.
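The binding energies compared in the abstract are typically obtained from total-energy differences. The following minimal sketch shows the standard per-atom binding energy formula; the function name and the numerical energies are purely illustrative assumptions, not values from the paper's calculations:

```python
def binding_energy_per_atom(e_cluster, e_atom, n):
    """E_b = (n * E_atom - E_cluster) / n.

    All energies must be in the same units (e.g., eV); a positive E_b
    means the cluster is bound relative to n isolated atoms."""
    return (n * e_atom - e_cluster) / n

# Hypothetical total energies for a Ni4 cluster and an isolated Ni atom (eV)
e_b = binding_energy_per_atom(e_cluster=-5413.6, e_atom=-1352.3, n=4)
```

A larger E_b for the hydrogen-nickel system than for the bare anionic cluster would correspond to the increase in binding energy reported above.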

Keywords: binding energies, bond lengths, density functional theoretical, geometry optimization, hydrogen energy, nickel cluster

Procedia PDF Downloads 417
2007 The Development and Provision of a Knowledge Management Ecosystem, Optimized for Genomics

Authors: Matthew I. Bellgard

Abstract:

The field of bioinformatics has made, and continues to make, substantial progress and contributions to life science research and development. This paper contends that a systems approach can integrate the bioinformatics activities of any project in a defined manner. Applying critical control points within such a bioinformatics systems approach may be useful for identifying and evaluating points in a pathway where the risk of a specified activity can be reduced and monitored, and its quality enhanced.

Keywords: bioinformatics, food security, personalized medicine, systems approach

Procedia PDF Downloads 418
2006 Equity Investment Restrictions and Pension Replacement Rates in Nigeria: A Ruin-Risk Analysis

Authors: Uche A. Ibekwe

Abstract:

Pension funds are pooled assets established to provide income for retirees. The funds are usually regulated to check excessive risk-taking by fund managers. In Nigeria, the current defined contribution (DC) pension scheme appears to contain some overly stringent restrictions which might be hampering its successful implementation. Notable among these is the 25 percent maximum limit on investment in the ordinary shares of quoted companies. This paper examines the extent to which these restrictions affect pension replacement rates at retirement. Using both simulated and historical asset return distributions, together with mean-variance, regression and ruin-risk analyses, the study found that the current equity investment restriction policy in Nigeria reduces replacement rates at retirement.
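The effect of an equity cap on replacement rates can be illustrated with a toy Monte Carlo sketch. Everything below is an assumption made for illustration only: the return and volatility figures, the 15% contribution rate, the 4% drawdown rule, and the `sim_replacement_rate` helper are hypothetical and are not the paper's calibrated model or Nigerian market data.

```python
import random

def sim_replacement_rate(equity_cap, years=30, n_sims=2000, seed=1):
    """Average replacement-rate proxy under a cap on the equity share.

    Accumulates 30 years of contributions (15% of a unit salary) at a
    simulated portfolio return, then converts the terminal fund to an
    income via a 4% drawdown. All parameters are illustrative."""
    rng = random.Random(seed)
    w_eq = min(equity_cap, 0.60)       # target 60% equity, capped by regulation
    w_bond = 1.0 - w_eq
    rates = []
    for _ in range(n_sims):
        fund = 0.0
        for _ in range(years):
            r_eq = rng.gauss(0.10, 0.20)    # assumed equity mean/volatility
            r_bond = rng.gauss(0.05, 0.05)  # assumed bond mean/volatility
            fund = (fund + 0.15) * (1 + w_eq * r_eq + w_bond * r_bond)
        rates.append(fund * 0.04)      # 4% drawdown as replacement-rate proxy
    return sum(rates) / len(rates)

capped = sim_replacement_rate(equity_cap=0.25)    # current 25% limit
uncapped = sim_replacement_rate(equity_cap=1.00)  # unconstrained
```

Under these assumed parameters the capped portfolio accumulates less, mirroring the paper's finding that the restriction reduces replacement rates; a ruin-risk analysis would additionally track the probability that the fund falls below a target level.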

Keywords: equity investment, replacement rates, restrictions, ruin-risk

Procedia PDF Downloads 337
2005 Comparison of Two Neural Networks to Model Margarine Age and Predict Shelf-Life Using MATLAB

Authors: Phakamani Xaba, Robert Huberts, Bilainu Oboirien

Abstract:

The present study aimed to develop and compare two neural-network-based predictive models to predict the shelf-life/product age of South African margarine, using free fatty acid (FFA), water droplet size (D3.3), water droplet distribution (e-sigma), moisture content, peroxide value (PV), anisidine value (AnV) and total oxidation (totox) value as input variables. Brick margarine products of varying ages, ranging from fresh (week 0) to week 47, were sourced; the products had been stored at 10 and 25 °C and were characterized. JMP and MATLAB models to predict shelf-life/margarine age were developed and their performances compared. The key performance indicators used to evaluate the models were the correlation coefficient (CC), root mean square error (RMSE), and mean absolute percentage error (MAPE) relative to the actual data. The MATLAB-developed model performed better on all three indicators: its correlation coefficient was 99.86% versus 99.74% for the JMP model, its RMSE was 0.720 compared to 1.005, and its MAPE was 7.4% compared to 8.571%. The MATLAB model was therefore selected as the most accurate, and the number of hidden neurons/nodes was then optimized to develop a single predictive model. The optimized MATLAB model with 10 neurons outperformed the models with 1 and 5 hidden neurons. The developed models can be used by margarine manufacturers, food research institutions and researchers to predict shelf-life/margarine product age, optimize the addition of antioxidants, extend the shelf-life of products, and proactively troubleshoot problems related to changes that affect the shelf-life of margarine, without conducting expensive trials.
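The three performance indicators used in the comparison are standard and straightforward to compute. The snippet below is a minimal sketch of their usual definitions; the example age values at the bottom are hypothetical, not the study's data:

```python
import math

def rmse(actual, predicted):
    """Root mean square error."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def mape(actual, predicted):
    """Mean absolute percentage error; assumes no zero actual values."""
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

def corr(actual, predicted):
    """Pearson correlation coefficient."""
    n = len(actual)
    ma, mp = sum(actual) / n, sum(predicted) / n
    cov = sum((a - ma) * (p - mp) for a, p in zip(actual, predicted))
    sa = math.sqrt(sum((a - ma) ** 2 for a in actual))
    sp = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    return cov / (sa * sp)

# Hypothetical actual vs. predicted product ages (weeks)
weeks_actual = [1.0, 5.0, 10.0, 20.0, 30.0, 47.0]
weeks_pred   = [1.2, 4.6, 10.8, 19.1, 31.5, 45.9]
fit = (corr(weeks_actual, weeks_pred), rmse(weeks_actual, weeks_pred), mape(weeks_actual, weeks_pred))
```

Note that MAPE weights errors on young (small-age) samples more heavily, which is worth keeping in mind when comparing models trained on ages from week 0 upward.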

Keywords: margarine shelf-life, predictive modelling, neural networks, oil oxidation

Procedia PDF Downloads 189
2004 Reliability and Availability Analysis of Satellite Data Reception System Using Reliability Modeling

Authors: Ch. Sridevi, S. P. Shailender Kumar, B. Gurudayal, A. Chalapathi Rao, K. Koteswara Rao, P. Srinivasulu

Abstract:

System reliability and availability evaluation plays a crucial role in ensuring the seamless operation of a complex satellite data reception system with consistent performance over long periods. This paper presents a novel approach to this evaluation, using a case study of one of the antenna systems at a satellite data reception ground station in India. The methodology involves analyzing the system's components, their failure rates and the system architecture, generating a logical reliability block diagram model, and estimating the system's reliability from the component-level mean time between failures, assuming an exponential distribution, to derive a baseline estimate. The model is then validated against system-level field failure data collected from the operational satellite data reception systems, comprising the failures that occurred, failure times, failure criticality and repair times, using statistical techniques such as median rank, regression and Weibull analysis to extract meaningful insights into failure patterns and the practical reliability of the system, and to assess the accuracy of the developed reliability model. The study focused mainly on identifying critical units within the system that are prone to failure and have a significant impact on overall performance, and produced a reliability model of the identified critical unit. This model takes into account the interdependencies among system components and their impact on overall system reliability, provides valuable insights into whether the system is improving or degrading over time, and is a vital input for arriving at optimized designs for future development. It also provides a plug-and-play framework for understanding the effect on system performance of any upgrades or new unit designs.
It helps in effective planning and in formulating contingency plans to address potential system failures, ensuring continuity of operations. Furthermore, to instill confidence in system users, the duration for which the system can operate continuously at the desired three-sigma reliability level was estimated, which turned out to be a vital input to the maintenance plan. System availability and station availability were also assessed, considering clash and non-clash scenarios, to determine overall system performance and potential bottlenecks. Overall, this paper establishes a comprehensive methodology for the reliability and availability analysis of complex satellite data reception systems. The results facilitate the effective planning of contingency measures, give users confidence in system performance, and enable decision-makers to make informed choices about system maintenance, upgrades and replacements. The approach also aids in identifying critical units, assessing system availability in various scenarios, minimizing downtime and optimizing resource allocation.
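Under the exponential assumption mentioned in the abstract, the baseline reliability and mission-time calculations reduce to a few lines. The sketch below uses purely hypothetical MTBF values and unit names (the antenna subsystems and numbers are illustrative assumptions, not the station's actual data):

```python
import math

def reliability(t_hours, mtbf_hours):
    """Exponential-model reliability: R(t) = exp(-t / MTBF)."""
    return math.exp(-t_hours / mtbf_hours)

def series_system(t_hours, mtbfs):
    """Series reliability block diagram: the system works only if every
    block works, so block reliabilities multiply (failure rates add)."""
    return math.prod(reliability(t_hours, m) for m in mtbfs)

def mission_time_for_target(target_r, mtbfs):
    """Longest continuous operation keeping system reliability >= target."""
    lam = sum(1.0 / m for m in mtbfs)   # combined failure rate of the series
    return -math.log(target_r) / lam

# Hypothetical MTBFs (hours) for antenna servo, feed, receiver, recorder
mtbfs = [20000, 50000, 30000, 15000]
r_1000h = series_system(1000, mtbfs)                 # reliability over 1000 h
t_target = mission_time_for_target(0.9973, mtbfs)    # "three-sigma"-style target
```

Inverting R(t) for a target value, as in `mission_time_for_target`, is the kind of calculation behind the continuous-operation duration the abstract feeds into the maintenance plan; the field-data validation via Weibull analysis would replace the constant failure rate with a shape-dependent one.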

Keywords: exponential distribution, reliability modeling, reliability block diagram, satellite data reception system, system availability, weibull analysis

Procedia PDF Downloads 79
2003 Corporate Governance and Financial Performance: Evidence From Indonesian Islamic Banks

Authors: Ummu Salma Al Azizah, Herri Mulyono, Anisa Mauliata Suryana

Abstract:

The significance of corporate governance for the agency problem is well established. This study examines the impact of corporate governance on the performance of Islamic banking in Indonesia. Using a fixed effects model with control variables, the study explores the link between bank performance and the theoretical frameworks on corporate governance, such as agency theory and risk management theory. Bank performance is measured by return on assets and return on equity, capturing operational and financial performance; corporate governance is captured by board size, CEO duality, the audit committee and the Shariah supervisory board. The study is limited to Islamic bank performance from 2015 to 2020. It fills a gap in the literature by addressing the issue of corporate governance and Islamic bank performance in Indonesia.

Keywords: corporate governance, financial performance, islamic banks, listed companies, Indonesia

Procedia PDF Downloads 117
2002 Optimized Electron Diffraction Detection and Data Acquisition in Diffraction Tomography: A Complete Solution by Gatan

Authors: Saleh Gorji, Sahil Gulati, Ana Pakzad

Abstract:

Continuous electron diffraction tomography, also known as microcrystal electron diffraction (MicroED) or three-dimensional electron diffraction (3DED), is a powerful technique which, in combination with cryo-electron microscopy (cryo-EM), can provide atomic-scale 3D information about the crystal structure and composition of different classes of crystalline materials such as proteins, peptides, and small molecules. Unlike the well-established X-ray crystallography method, 3DED does not require large single crystals and can collect accurate electron diffraction data from crystals as small as 50 – 100 nm. This is a critical advantage, as growing larger crystals, as required by X-ray crystallography, is often very difficult, time-consuming, and expensive. In most cases, specimens studied via the 3DED method are electron beam sensitive, which means there is a limit on the maximum electron dose one can use to collect the data required for high-resolution structure determination. Therefore, collecting data using a conventional scintillator-based, fiber-coupled camera brings additional challenges. This is because of the inherent noise introduced during the electron-to-photon conversion in the scintillator and the transfer of light via the fibers to the sensor, which results in a poor signal-to-noise ratio and requires relatively higher, and commonly specimen-damaging, electron dose rates, especially for protein crystals. As in other cryo-EM techniques, damage to the specimen can be mitigated if a direct detection camera is used, which provides a high signal-to-noise ratio at low electron doses. In this work, we have used two classes of such detectors from Gatan, namely the K3® camera (a monolithic active pixel sensor) and Stela™ (which utilizes DECTRIS hybrid-pixel technology), to address this problem.
The K3 is an electron counting detector optimized for low-dose applications (such as structural biology cryo-EM), and Stela is also an electron counting detector, optimized for diffraction applications with high speed and high dynamic range. Lastly, the data collection workflow, including crystal screening, microscope optics setup (for imaging and diffraction), stage height adjustment at each crystal position, and tomogram acquisition, can be another challenge of the 3DED technique. Traditionally, this has all been done manually, or in a partly automated fashion using open-source software and scripting, requiring long hours on the microscope (extra cost) and extensive user interaction with the system. We have recently introduced Latitude® D in DigitalMicrograph® software, which is compatible with all pre- and post-energy-filter Gatan cameras and enables 3DED data acquisition in an automated and optimized fashion. Higher-quality 3DED data enable structure determination with higher confidence, while automated workflows allow these experiments to be completed considerably faster than before. Using multiple examples, this work will demonstrate how direct detection electron counting cameras enhance 3DED results (from 3 Å to better than 1 Å) for protein and small molecule structure determination. We will also show how the Latitude D software facilitates collecting such data in an integrated and fully automated user interface.

Keywords: continuous electron diffraction tomography, direct detection, diffraction, Latitude D, Digitalmicrograph, proteins, small molecules

Procedia PDF Downloads 100