Search results for: NMR techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6532

2362 Asymptotic Analysis of the Viscous Flow through a Pipe and the Derivation of the Darcy-Weisbach Law

Authors: Eduard Marusic-Paloka

Abstract:

The Darcy-Weisbach formula is used to compute the pressure drop of a fluid in a pipe due to friction against the wall. Because of its simplicity, the Darcy-Weisbach formula became widely accepted by engineers and is used for laminar as well as turbulent flows through pipes, once methods to compute the somewhat mysterious friction coefficient were derived, particularly in the second half of the 20th century. The formula is empirical, and our goal is to derive it from the basic conservation laws via rigorous asymptotic analysis. We consider the case of laminar flow, but at a significant Reynolds number. In the case of a perfectly smooth pipe, the situation is trivial, as the Navier-Stokes system can be solved explicitly via the Poiseuille formula, leading to a friction coefficient of the form 64/Re. For a rough pipe, the situation is more complicated and some effects of the roughness appear in the friction coefficient. We start from the Navier-Stokes system in a pipe with a periodically corrugated wall and derive an asymptotic expansion for the pressure and the velocity, using homogenization techniques and boundary layer analysis. The approximation derived by formal analysis is then justified by a rigorous error estimate in the norm of the appropriate Sobolev space, using the energy formulation and classical a priori estimates for the Navier-Stokes system. Our method leads to a formula for the friction coefficient. The formula involves the resolution of appropriate boundary layer problems, namely boundary value problems for the Stokes system in an infinite band, which need to be solved numerically. However, a theoretical analysis characterising their nature can be done without solving them.
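
For context, the classical relation that the paper sets out to justify can be stated in its standard textbook form (added here for reference; the notation is generic and not quoted from the paper):

```latex
\[
  \Delta p \;=\; f_D\,\frac{L}{D}\,\frac{\rho\,\bar{u}^{\,2}}{2},
  \qquad
  f_D^{\mathrm{laminar,\ smooth}} \;=\; \frac{64}{Re},
  \qquad
  Re \;=\; \frac{\rho\,\bar{u}\,D}{\mu},
\]
```

where Δp is the pressure drop over a pipe of length L and diameter D, ū is the mean velocity, ρ the density and μ the dynamic viscosity; the paper derives a rigorously justified analogue of f_D for a periodically rough wall.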

Keywords: Darcy-Weisbach law, pipe flow, rough boundary, Navier law

Procedia PDF Downloads 342
2361 Analysis of Long-term Results After External Dacryocystorhinostomy Surgery in Patients Suffering from Diabetes Mellitus

Authors: N. Musayeva, N. Rustamova, N. Bagirov, S. Ibadov

Abstract:

Purpose: to analyze the long-term results of external dacryocystorhinostomy (DCR), which remains the preferred primary procedure in the surgical treatment of lacrimal duct obstruction in chronic dacryocystitis. Methodology: the long-term results (after 3 years) of external DCR performed on 90 patients (90 eyes) with chronic dacryocystitis from 2018 to 2020 at the Azerbaijan National Center of Ophthalmology named after Acad. Zarifa Aliyeva were evaluated. Fifteen of the patients were men and 75 were women; the average age was 45±3.2 years. Surgical operations were performed under local anesthesia. All patients had suffered from diabetes mellitus for more than 3 years. All patients underwent external DCR, and silicone drainage (a tube) was implanted. In the postoperative period (after 3 years), lacrimation, purulent discharge, and the condition of the scar at the operation site were assessed. Results: all patients were under observation for more than 18 months. Overall, the effectiveness of the surgical operation was 93.34%. Recurrence of the disease was observed in 6 patients; in 3 patients (3.33%) the scar at the site of the operation was rough (non-cosmetic), and in 3 patients (3.33%) the surgically formed anastomosis between the lacrimal sac and the nasal bone was obstructed by scar tissue. These patients were reoperated by transcanalicular laser DCR. Conclusion: despite its long-term (more than a hundred years) use, external DCR remains one of the primary techniques in the surgery of chronic dacryocystitis. Due to the high success rate and good long-term results of DCR in the treatment of chronic dacryocystitis in patients suffering from diabetes mellitus, we recommend external DCR for this group of patients.

Keywords: chronic dacryocystitis, diabetes mellitus, external dacryocystorhinostomy, long-term results

Procedia PDF Downloads 54
2360 Risk Analysis of Leaks from a Subsea Oil Facility Based on Fuzzy Logic Techniques

Authors: Belén Vinaixa Kinnear, Arturo Hidalgo López, Bernardo Elembo Wilasi, Pablo Fernández Pérez, Cecilia Hernández Fuentealba

Abstract:

The expanded use of risk assessment in legislative and corporate decision-making has increased the role of expert judgement in providing data for safety-related decisions. Expert judgements are required in most steps of risk assessment: hazard identification, risk estimation, risk evaluation, and the examination of options. This paper presents a fault tree analysis (FTA), a probabilistic failure analysis, applied to leakage of oil in a subsea production system. In standard FTA, the failure probabilities of the components of a system are treated as exact values when evaluating the failure probability of the top event. However, there is a persistent lack of data for estimating the failure rates of components in the drilling industry. Fuzzy set theory can therefore be used to address this issue. The aim of this paper is to examine the leaks from the Zafiro West subsea oil facility by using fuzzy fault tree analysis (FFTA). As a result, the research makes theoretical and practical contributions to maritime safety and environmental protection. FTA has also traditionally been an effective strategy for identifying hazards in nuclear installations and the power industry.
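
As an illustration of the fuzzy fault tree idea described above, the minimal Python sketch below propagates triangular fuzzy failure probabilities through AND/OR gates to a top event. The gate structure and all numeric values are invented for illustration and are not taken from the Zafiro West study; vertex-wise arithmetic is a common approximation for monotone gate functions.

```python
import numpy as np

# Triangular fuzzy number represented as (low, mode, high)
def fuzzy_and(events):
    """AND gate: probabilities multiply (independence assumed)."""
    return tuple(np.prod([e[i] for e in events]) for i in range(3))

def fuzzy_or(events):
    """OR gate: P = 1 - prod(1 - p_i), applied vertex-wise."""
    return tuple(1.0 - np.prod([1.0 - e[i] for e in events]) for i in range(3))

# Illustrative basic-event fuzzy probabilities (per demand), NOT field data
seal_failure   = (1e-4, 5e-4, 1e-3)
valve_failure  = (5e-5, 2e-4, 8e-4)
sensor_failure = (1e-3, 3e-3, 6e-3)

# Hypothetical tree: leak occurs if (seal AND valve fail) OR sensor fails
top_event = fuzzy_or([fuzzy_and([seal_failure, valve_failure]), sensor_failure])
print("Fuzzy top-event probability (low, mode, high):", top_event)

# A crisp estimate can be recovered by defuzzification, e.g. the centroid
print("Defuzzified (centroid) estimate:", sum(top_event) / 3.0)
```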

Keywords: expert judgment, probability assessment, fault tree analysis, risk analysis, oil pipelines, subsea production system, drilling, quantitative risk analysis, leakage failure, top event, off-shore industry

Procedia PDF Downloads 178
2359 Cognitive Model of Analogy Based on Operation of the Brain Cells: Glial, Axons and Neurons

Authors: Ozgu Hafizoglu

Abstract:

Analogy is an essential tool of human cognition that enables connecting diffuse and diverse systems through attributional, deep structural, and causal relations that are essential to learning, to innovation in artificial worlds, and to discovery in science. The Cognitive Model of Analogy (CMA) guides and creates the transfer of information patterns within and between domains and disciplines in science. This paper demonstrates the Cognitive Model of Analogy (CMA) as an evolutionary approach to scientific research. The model puts forward the challenges of deep uncertainty about the future, emphasizing the need for flexibility of the system in order to enable the reasoning methodology to adapt to changing conditions. In this paper, the model of analogical reasoning is created based on brain cells and their fractal and operational forms within the system itself. Visualization techniques are used to show correspondences. The problem-solving process is divided into distinct phases: encoding, mapping, inference, and response. The system is shown to be relevant to brain activation in each of these phases, with an emphasis on achieving a better visualization of the brain cells (glial cells, axons, axon terminals, and neurons) relative to the matching conditions of analogical reasoning and relational information. It is found that the encoding, mapping, inference, and response processes in four-term analogical reasoning correspond with the fractal and operational forms of brain cells: glia, axons, and neurons.

Keywords: analogy, analogical reasoning, cognitive model, brain and glia

Procedia PDF Downloads 170
2358 Relay-Augmented Bottleneck Throughput Maximization for Correlated Data Routing: A Game Theoretic Perspective

Authors: Isra Elfatih Salih Edrees, Mehmet Serdar Ufuk Türeli

Abstract:

In this paper, an energy-aware method is presented, integrating energy-efficient relay-augmented techniques for correlated data routing with the goal of optimizing bottleneck throughput in wireless sensor networks. The system tackles the dual challenge of throughput optimization while considering sensor network energy consumption. A unique routing metric has been developed to enable throughput maximization while minimizing energy consumption by exploiting data correlation patterns. The paper introduces a game-theoretic framework to address the NP-complete optimization problem inherent in throughput-maximizing, correlation-aware routing under energy limitations. By creating an algorithm that blends energy-aware route selection strategies with best-response dynamics, this framework provides a local solution. The suggested technique considerably raises the bottleneck throughput for each source in the network while reducing energy consumption by choosing routes that strike a balance between throughput enhancement and energy efficiency. Extensive numerical analyses verify the efficiency of the method. The outcomes demonstrate the significant decrease in energy consumption attained by the energy-efficient relay-augmented bottleneck throughput maximization technique, in addition to confirming the anticipated throughput benefits.
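
The best-response dynamics mentioned above can be illustrated generically with a small Python sketch in which each source repeatedly switches to its cheapest route given the others' choices, until no source can improve. The cost function (a simple blend of per-route energy and congestion) and all numbers are placeholder assumptions, not the routing metric proposed in the paper.

```python
# Hypothetical example: 3 sources, each with 2 candidate routes.
ROUTES = {0: ["A", "B"], 1: ["A", "C"], 2: ["B", "C"]}
ENERGY = {"A": 2.0, "B": 1.5, "C": 1.0}   # illustrative per-route energy cost

def cost(src, route, profile):
    """Toy cost: route energy plus a congestion penalty for shared routes."""
    congestion = sum(1 for s, r in profile.items() if s != src and r == route)
    return ENERGY[route] + 2.0 * congestion   # weight 2.0 is an arbitrary choice

def best_response_dynamics(max_rounds=50):
    profile = {s: routes[0] for s, routes in ROUTES.items()}   # arbitrary start
    for _ in range(max_rounds):
        changed = False
        for src, routes in ROUTES.items():
            best = min(routes, key=lambda r: cost(src, r, profile))
            if best != profile[src]:
                profile[src] = best
                changed = True
        if not changed:       # no player can improve: local equilibrium reached
            return profile
    return profile

print(best_response_dynamics())
```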

Keywords: correlated data aggregation, energy efficiency, game theory, relay-augmented routing, throughput maximization, wireless sensor networks

Procedia PDF Downloads 53
2357 Impacts of Aquaculture Farms on the Mangroves Forests of Sundarbans, India (2010-2018): Temporal Changes of NDVI

Authors: Sandeep Thakur, Ismail Mondal, Phani Bhusan Ghosh, Papita Das, Tarun Kumar De

Abstract:

The Sundarbans Reserve Forest of India has been undergoing major transformations in the recent past owing to population pressure and related changes. This has brought about major changes in the spatial landscape of the region, especially in the western parts. This study attempts to assess the impacts of land cover changes on the mangrove habitats. Time series Landsat imagery was used to analyze Normalized Difference Vegetation Index (NDVI) patterns over the western parts of the Indian Sundarbans forest in order to assess the health of the mangroves in the region. The images were subjected to land use land cover (LULC) classification using sub-pixel classification techniques in ERDAS Imagine software, and the changes were mapped. The spatial proliferation of aquaculture farms during the study period was also mapped. A multivariate regression analysis was carried out between the obtained NDVI values and the LULC classes. Similarly, the observed meteorological data sets (time series rainfall and minimum and maximum temperature) were also statistically regressed against NDVI. The study demonstrated the application of NDVI in assessing the environmental status of mangroves, as the relationship between changes in the environmental variables and the remote sensing based indices facilitates an efficient evaluation of environmental variables, which can be used in coastal zone monitoring and development processes.
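
For reference, NDVI is computed from the red and near-infrared bands, and its relationship with an environmental driver can be examined with a simple least-squares fit; a minimal NumPy sketch follows (all values are illustrative, and the actual band numbers depend on the Landsat sensor used).

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    red = red.astype("float64")
    nir = nir.astype("float64")
    return (nir - red) / (nir + red + 1e-10)   # small epsilon avoids 0/0

# Toy 2x2 'image': healthy mangrove pixels have NIR >> Red reflectance
red = np.array([[0.05, 0.10], [0.30, 0.25]])
nir = np.array([[0.60, 0.55], [0.35, 0.30]])
print(ndvi(red, nir))

# The relationship between mean NDVI and an environmental driver (e.g. rainfall)
# can then be examined with an ordinary least-squares fit (values made up):
rainfall = np.array([1200.0, 1350.0, 1100.0, 1500.0])
mean_ndvi = np.array([0.42, 0.46, 0.39, 0.51])
slope, intercept = np.polyfit(rainfall, mean_ndvi, 1)
print("slope:", slope, "intercept:", intercept)
```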

Keywords: aquaculture farms, LULC, Mangrove, NDVI

Procedia PDF Downloads 163
2356 Design of Digital IIR Filter Using Opposition Learning and Artificial Bee Colony Algorithm

Authors: J. S. Dhillon, K. K. Dhaliwal

Abstract:

In almost all digital filtering applications, digital infinite impulse response (IIR) filters are preferred over finite impulse response (FIR) filters because they provide better performance, lower computational cost and smaller memory requirements for similar magnitude specifications. However, the digital IIR filter design problem is generally multimodal with respect to the filter coefficients, and therefore reliable methods that can provide globally optimal solutions are required. The artificial bee colony (ABC) algorithm is one such recently introduced meta-heuristic optimization algorithm. In some cases, however, it searches the solution space insufficiently, resulting in a weak exchange of information, and hence is not able to return better solutions. To overcome this deficiency, an opposition-based learning strategy is incorporated into ABC, and a modified version called the oppositional artificial bee colony (OABC) algorithm is proposed in this paper. Duplication of members is avoided during the run, which also augments the exploration ability. The developed algorithm is then applied to the design of optimal and stable digital IIR filter structures, where the design of low-pass (LP) and high-pass (HP) filters is carried out. Fuzzy theory is applied to maximize satisfaction of the minimum magnitude error and stability constraints. To check the effectiveness of OABC, the results are compared with some well established filter design techniques, and it is observed that in most cases OABC returns better or at least comparable results.
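
The opposition-based learning step that distinguishes OABC from plain ABC can be sketched in Python as follows. The objective function (a simple sphere function) and the coefficient bounds are placeholders standing in for the fuzzy magnitude-error/stability objective used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def opposition(population, lower, upper):
    """Opposite point of x in [lower, upper] is lower + upper - x."""
    return lower + upper - population

def opposition_based_init(n_bees, dim, lower, upper, fitness):
    """Generate a population plus its opposites, keep the n_bees fittest."""
    pop = rng.uniform(lower, upper, size=(n_bees, dim))
    both = np.vstack([pop, opposition(pop, lower, upper)])
    scores = np.array([fitness(x) for x in both])
    return both[np.argsort(scores)[:n_bees]]

# Placeholder objective: sphere function standing in for the filter
# magnitude-error plus stability penalty actually optimized in the paper.
sphere = lambda x: float(np.sum(x ** 2))

init_pop = opposition_based_init(n_bees=20, dim=8, lower=-1.0, upper=1.0,
                                 fitness=sphere)
print(init_pop.shape)   # (20, 8) candidate coefficient vectors
```

The same opposite-point rule can also be applied during the run (generation jumping) to keep exploration strong, which is the deficiency of plain ABC the abstract points to.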

Keywords: digital infinite impulse response filter, artificial bee colony optimization, opposition based learning, digital filter design, multi-parameter optimization

Procedia PDF Downloads 459
2355 Rehabilitation and Conservation of Mangrove Forest as a Pertamina Corporate Social Responsibility Approach to Preventing Climate Damage in Indonesia

Authors: Nor Anisa

Abstract:

This paper aims to describe the use of conservation and rehabilitation of mangrove forests as an alternative approach to protecting the natural environment, ecosystems and ecology, providing community education, and supporting innovation in sustainable industrial development by oil, gas and coal companies. Globalization drives the need for energy sources such as gas, diesel and coal, non-renewable resources that are a basic need of human life, while environmental degradation and natural phenomena continue to occur in Indonesia, especially global warming, sea water pollution and the extinction of animal species. Environmental damage in Indonesia is also driven by a population explosion, which causes unemployment and the loss of residential land, and this in turn encourages the exploitation of nature and the environment. Therefore, Pertamina, as a state-owned oil and gas company, carries out its social responsibility efforts, namely conservation and rehabilitation and the management of mangrove seeds, which will provide an educational effect on the benefits of mangrove seed maintenance. The method used in this study is a qualitative method with secondary data collection techniques, where data are taken from verifiable Pertamina activity journals and websites. The paper concludes by outlining the physical, chemical, biological, social and economic benefits and functions of mangrove forest conservation in Indonesia, and how these can inform innovation in the company's corporate social responsibility (CSR) efforts in environmental conservation and social education.

Keywords: mangrove, environmental damage, conservation and rehabilitation, innovation of corporate social responsibility

Procedia PDF Downloads 121
2354 Hybrid Thresholding Lifting Dual Tree Complex Wavelet Transform with Wiener Filter for Quality Assurance of Medical Image

Authors: Hilal Naimi, Amelbahahouda Adamou-Mitiche, Lahcene Mitiche

Abstract:

The main problem in the area of medical imaging has been image denoising. The most challenging aspect of image denoising is to preserve data-carrying structures such as surfaces and edges in order to achieve good visual quality. Different algorithms with different denoising performances have been proposed in previous decades. More recently, models based on deep learning have shown great promise to outperform all traditional approaches. However, these techniques are limited by the necessity of large training sample sizes and high computational costs. This research proposes a denoising approach based on the LDTCWT (Lifting Dual Tree Complex Wavelet Transform) using hybrid thresholding with a Wiener filter to enhance image quality. The LDTCWT is a lifting-based wavelet transform that produces complex coefficients by employing a dual tree of lifting wavelet filters to obtain the real and imaginary parts. This permits the transform to achieve approximate shift invariance and directionally selective filters while reducing computation time (properties lacking in the classical wavelet transform). To develop this approach, a hybrid thresholding function is modeled by integrating the Wiener filter into the thresholding function.
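
The idea of blending wavelet-domain thresholding with a Wiener-style gain can be sketched as follows. This is a generic illustration, not the authors' exact hybrid function, and the LDTCWT decomposition itself is not reproduced; the threshold and noise-variance values are arbitrary.

```python
import numpy as np

def hybrid_shrink(coeffs, threshold, noise_var):
    """Soft-threshold detail coefficients, then apply a Wiener-like gain.

    gain = max(c^2 - noise_var, 0) / c^2 attenuates coefficients whose energy
    is close to the noise floor while preserving strong (edge-carrying) ones.
    """
    soft = np.sign(coeffs) * np.maximum(np.abs(coeffs) - threshold, 0.0)
    energy = soft ** 2
    gain = np.where(energy > 0,
                    np.maximum(energy - noise_var, 0.0) / (energy + 1e-12),
                    0.0)
    return gain * soft

# Toy detail subband: two strong edge coefficients plus small noisy ones
subband = np.array([9.0, 0.4, -0.3, 0.2, -7.5, 0.1])
print(hybrid_shrink(subband, threshold=0.5, noise_var=0.25))
```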

Keywords: lifting wavelet transform, image denoising, dual tree complex wavelet transform, wavelet shrinkage, wiener filter

Procedia PDF Downloads 147
2353 Synthesis of Zeolites from Bauxite and Kaolin: Effect of Synthesis Parameters on Competing Phases

Authors: Bright Kwakye-Awuah, Elizabeth Von-Kiti, Isaac Nkrumah, Baah Sefa-Ntiri, Craig D. Williams

Abstract:

Bauxite and kaolin from the Ghana Bauxite Company mine site were used to synthesize zeolites, with bauxite serving as the alumina source and kaolin as the silica source. Synthesis variations included varying the aging time at constant crystallization time and varying the crystallization time at constant aging time. Characterization techniques such as X-ray diffraction (XRD), scanning electron microscopy (SEM), energy dispersive X-ray analysis (EDX) and Fourier transform infrared spectroscopy (FTIR) were employed for both the raw and the synthesized samples. The results obtained showed that the transformations that occurred, and the phase of the resulting products, were governed by the aging time, crystallization time, alkaline concentration and Si/Al ratio of the system. Zeolites A, X, Y, analcime, sodalite, and ZK-14 were some of the phases achieved. Zeolite LTA was achieved with short crystallization times of 3, 5, 18 and 24 hours and a maximum aging of 24 hours. Zeolite LSX was synthesized with 24 h aging followed by 24 h hydrothermal treatment, whilst zeolite Y crystallized after 48 h of aging and 24 h crystallization. Prolonged crystallization time produced a mixed-phase product. Prolonged aging times, on the other hand, did not yield any zeolite, as the sample remained amorphous. Increasing the alkaline content of the reaction mixture above 5 M introduced a sodalite phase in the final product. The properties of the final products were comparable to zeolites synthesized from pure chemical reagents.

Keywords: bauxite, kaolin, aging, crystallization, zeolites

Procedia PDF Downloads 205
2352 Revolutionizing Financial Forecasts: Enhancing Predictions with Graph Convolutional Networks (GCN) - Long Short-Term Memory (LSTM) Fusion

Authors: Ali Kazemi

Abstract:

In volatile and interconnected international financial markets, accurately predicting market trends holds substantial value for traders and financial institutions. Traditional machine learning approaches have made significant strides in forecasting market movements; however, the complex and networked nature of financial data calls for more sophisticated methods. This study presents a method for financial market prediction that leverages the synergistic capability of Graph Convolutional Networks (GCNs) and Long Short-Term Memory (LSTM) networks. The proposed algorithm is designed to forecast the trends of stock market indices and cryptocurrency prices, utilizing a comprehensive dataset spanning from January 1, 2015, to December 31, 2023. This period, marked by sizable volatility and transformation in financial markets, provides a solid basis for training and testing the predictive model. The algorithm integrates diverse data to construct a dynamic financial graph that reflects market intricacies. Daily opening, closing, high and low prices are collected for key stock market indices (e.g., S&P 500, NASDAQ) and major cryptocurrencies (e.g., Bitcoin, Ethereum), ensuring a holistic view of market trends. Daily trading volumes are also incorporated to capture market activity and liquidity, providing critical insights into buying and selling dynamics. Furthermore, recognizing the profound influence of the macroeconomic environment on financial markets, critical macroeconomic indicators, including interest rates, inflation rates, GDP growth and unemployment rates, are integrated into the model. The GCN component learns the relational patterns among financial instruments, represented as nodes in a comprehensive market graph; edges encapsulate relationships based on co-movement patterns and sentiment correlations, enabling the model to grasp the complex network of influences governing market movements. Complementing this, the LSTM component is trained on sequences of the spatio-temporal representation learned by the GCN, enriched with historical price and volume data, allowing it to capture and predict temporal market trends accurately. In a comprehensive evaluation of the GCN-LSTM algorithm on the stock market and cryptocurrency datasets, the model demonstrated superior predictive accuracy and profitability compared to conventional and alternative machine learning benchmarks. Specifically, the model achieved a Mean Absolute Error (MAE) of 0.85%, indicating high precision in predicting day-by-day price movements. The RMSE was recorded at 1.2%, underscoring the model's effectiveness in minimizing large prediction errors, which is vital in volatile markets. Furthermore, when assessing the model's predictive performance on directional market movements, it achieved an accuracy rate of 78%, significantly outperforming the benchmark models, which averaged an accuracy of 65%. This high degree of accuracy is instrumental for strategies that predict the direction of price moves. This study showcases the efficacy of combining graph-based and sequential deep learning in financial market prediction and highlights the value of a comprehensive, data-driven evaluation framework. Our findings promise to revolutionize investment strategies and risk management practices, offering investors and financial analysts a powerful tool to navigate the complexities of modern financial markets.
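
A minimal PyTorch sketch of the kind of GCN-plus-LSTM stack described above is given below. The layer sizes, the placeholder adjacency matrix, the input shapes and the single-output head are illustrative assumptions; the authors' actual architecture, features and training procedure are not reproduced here.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One graph-convolution step: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, h):
        return torch.relu(a_hat @ self.linear(h))

class GCNLSTM(nn.Module):
    """Apply the GCN at each time step, then run an LSTM over the sequence."""
    def __init__(self, n_features, gcn_dim, lstm_dim):
        super().__init__()
        self.gcn = GCNLayer(n_features, gcn_dim)
        self.lstm = nn.LSTM(gcn_dim, lstm_dim, batch_first=True)
        self.head = nn.Linear(lstm_dim, 1)          # e.g. next-day return per node

    def forward(self, a_hat, x):                    # x: (T, N, F)
        h = torch.stack([self.gcn(a_hat, x_t) for x_t in x])   # (T, N, gcn_dim)
        h = h.permute(1, 0, 2)                      # (N, T, gcn_dim): node as "batch"
        out, _ = self.lstm(h)
        return self.head(out[:, -1, :])             # one prediction per instrument

# Toy example: 4 instruments, 30 days, 5 features (say OHLC + volume)
N, T, F = 4, 30, 5
a_hat = torch.eye(N) + 0.1                          # placeholder normalized adjacency
x = torch.randn(T, N, F)
model = GCNLSTM(n_features=F, gcn_dim=16, lstm_dim=32)
print(model(a_hat, x).shape)                        # torch.Size([4, 1])
```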

Keywords: financial market prediction, graph convolutional networks (GCNs), long short-term memory (LSTM), cryptocurrency forecasting

Procedia PDF Downloads 36
2351 Methane Oxidation to Methanol Catalyzed by Copper Oxide Clusters Supported in MIL-53(Al): A Density Functional Theory Study

Authors: Chun-Wei Yeh, Santhanamoorthi Nachimuthu, Jyh-Chiang Jiang

Abstract:

Reducing greenhouse gases or converting them into fuels and chemicals with added value is vital for the environment. Given the enhanced techniques for hydrocarbon extraction, the catalytic conversion of methane to methanol is particularly intriguing for future applications as vehicle fuel and/or bulk chemical feedstock. Metal-organic frameworks (MOFs) have received much attention recently for the oxidation of methane to methanol. In addition, the biomimetic material particulate methane monooxygenase (pMMO) has been reported to convert methane using copper oxide clusters as active sites. Inspired by these, in this study, we considered the well-known MIL-53(Al) MOF as a support for copper oxide clusters (Cu2Ox, Cu3Ox) to investigate their reactivity towards methane oxidation using Density Functional Theory (DFT) calculations. The copper oxide clusters (Cu2O2, Cu3O2) are modeled by oxidizing copper clusters (Cu2, Cu3) with two oxidizers, O2 and N2O. The initial C-H bond activation barriers on the Cu2O2/MIL-53(Al) and Cu3O2/MIL-53(Al) catalysts are 0.70 eV and 0.64 eV, respectively, and these are the rate-determining steps in the overall methane-to-methanol conversion. The desorption energy of methanol over Cu2O/MIL-53(Al) and Cu3O/MIL-53(Al) is 0.71 eV and 0.75 eV, respectively. Furthermore, to explore the prospect of catalyst reusability, we considered different oxidants and proposed different reaction pathways for completing the reaction cycle and regenerating the active copper oxide clusters. To understand the difference between the bi-copper and tri-copper systems, we also performed an electronic structure analysis. Finally, microkinetic simulations show that the reaction can proceed at room temperature.

Keywords: DFT study, copper oxide cluster, MOFs, methane conversion

Procedia PDF Downloads 58
2350 Imputing the Minimum Social Value of Public Healthcare: A General Equilibrium Model of Israel

Authors: Erez Yerushalmi, Sani Ziv

Abstract:

The rising demand for healthcare services, without a corresponding rise in public supply, has led to a debate on whether to increase private healthcare provision, especially in hospital services and second-tier healthcare. Proponents of increasing private healthcare highlight gains in efficiency, while opponents highlight the risk to social welfare. None, however, provide a monetary measure of the social value and its impact on the economy. In this paper, we impute a minimum social value of public healthcare that corresponds to indifference between gains in efficiency and losses to social welfare. Our approach resembles contingent valuation methods that introduce a hypothetical market for non-commodities, but differs from them because we use numerical simulation techniques to exploit certain market failure conditions. We develop a general equilibrium model that distinguishes between public-private healthcare services and public-private financing. Furthermore, the social value is modelled as a by-product of healthcare services. The model is then calibrated to our unique health-focused Social Accounting Matrix of Israel, and simulates the introduction of a hypothetical health-labour market, given that this market is heavily regulated in the baseline (i.e., the true situation in Israel today). For baseline parameters, we estimate the minimum social value at around 18% of public healthcare financing. The intuition is that the gain in economic welfare from improved efficiency is offset by the loss in social welfare due to a reduction in available social value. We furthermore simulate a deregulated healthcare scenario that internalizes the imputed social value and searches for the optimal weight of public and private healthcare provision.
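
The imputation logic (searching for the social-value level at which the efficiency gain and the social-welfare loss exactly offset) can be illustrated with a generic bisection sketch. The welfare functions below are toy placeholder forms, not the calibrated general equilibrium model of the paper.

```python
def net_welfare_change(share_public, social_value_weight):
    """Toy welfare difference between deregulated and baseline scenarios.

    Placeholder functional forms: an efficiency gain that grows as the public
    share falls, set against a social-welfare loss scaled by the imputed weight.
    """
    efficiency_gain = 0.5 * (1.0 - share_public)
    social_loss = social_value_weight * (1.0 - share_public)
    return efficiency_gain - social_loss

def impute_min_social_value(share_public, lo=0.0, hi=5.0, tol=1e-8):
    """Bisect on the weight until the planner is indifferent (change ~ 0)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if net_welfare_change(share_public, mid) > 0:
            lo = mid        # efficiency still dominates: raise the weight
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(impute_min_social_value(share_public=0.18))   # -> 0.5 for these toy forms
```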

Keywords: contingent valuation method (CVM), general equilibrium model, hypothetical market, private-public healthcare, social value of public healthcare

Procedia PDF Downloads 129
2349 Modified Acetamidobenzoxazolone Based Biomarker for Translocator Protein Mapping during Neuroinflammation

Authors: Anjani Kumar Tiwari, Neelam Kumari, Anil Mishra

Abstract:

The 18-kDa translocator protein (TSPO), previously called the peripheral benzodiazepine receptor, is a proven biomarker for a variety of neuroinflammatory conditions. TSPO is a tryptophan-rich, five-transmembrane-domain protein found on the outer mitochondrial membrane of steroid-synthesizing and immunomodulatory cells. In cases of neuronal damage or inflammation, the expression level of TSPO is upregulated as an immunomodulatory response. Using benzoxazolone as a basic scaffold, a series of TSPO ligands was designed and screened through in silico studies. Screening was performed on the basis of the GScore from CADD-based Schrödinger software. Synthesis was planned using a convergent methodology in six high-yielding steps. All the modified, more promising compounds were successfully synthesized and characterized by spectroscopic techniques (FTIR, NMR and HRMS). In vitro assays were performed on the synthesized ligands to determine their binding affinity in terms of Ki, and autoradiography (ARG) studies were carried out on ischemic rat brain to check the specificity and affinity of the designed radiolabelled ligand for TSPO. The in vitro binding assay showed the best binding affinity for TSPO (Ki = 6.1 ± 0.3) over the central benzodiazepine receptor (CBR, Ki > 200). ARG studies indicated higher uptake of two analogues on the lesion side compared with the non-lesion side of ischemic rat brains. Displacement experiments with unlabelled ligand minimized the difference in uptake between the two sides, indicating the specificity of the ligand for the TSPO receptor.

Keywords: TSPO, PET, imaging, Acetamidobenzoxazolone

Procedia PDF Downloads 128
2348 Power-Sharing Politics: A Panacea to Conflict Resolution and Stability in Africa

Authors: Emmanuel Dangana Monday

Abstract:

Africa as a continent has been ravaged and bedeviled by a series of political conflicts associated with politics and power-sharing maneuvering. As a result, it has become the most unstable continent in the world in terms of power distribution and stable political culture. This paper examines the efficacy of conscious and deliberate power-sharing strategies in settling or resolving political conflicts in Africa through arrangements such as the creation of states, revenue and resource allocation, and the distribution of offices. The study is concerned with the spatial impact of conflicts generated in several prominent African countries in which power-sharing could have been a solution. Ethno-regional elite groups are identified as the major actors in the struggles for the distribution of territorial, economic and political power in Africa. The struggle for power has become so intense that it has degenerated into inter- and intra-party conflicts and wars among political classes. Secondary data and deductive techniques were used in data collection and analysis. It is found that power-sharing has become an indispensable tool to curb the incessant political and power crises in Africa. It is one of the most tolerable modalities for mediating elite competition, since it reflects the interests of both the dominant and the perceived marginalized groups. The study recommends that countries and regions of political, ethnic and religious differences in Africa employ power-sharing strategies in order to avoid unnecessary political tension and the resultant crises. Interest groups should always come to the negotiation table to reach a realistic, durable and expected compromise to secure a peacefully resolved Africa.

Keywords: Africa, power-sharing, conflicts, politics and political stability

Procedia PDF Downloads 311
2347 Estimation of the Seismic Response Modification Coefficient in the Superframe Structural System

Authors: Ali Reza Ghanbarnezhad Ghazvini, Seyyed Hamid Reza Mosayyebi

Abstract:

In recent years, an earthquake has occurred approximately every five years in certain regions of Iran. To mitigate the impact of these seismic events, it is crucial to identify and thoroughly assess the vulnerability of buildings and infrastructure, ensuring their safety through principled reinforcement. By adopting new methods of risk assessment, we can effectively reduce the potential risks associated with future earthquakes. In our research, we have observed that the coefficient of behavior in the fourth chapter is 1.65 for the initial structure and 1.72 for the Superframe structure. This indicates that the Superframe structure can enhance the strength of the main structural members by approximately 10% through the utilization of super beams. Furthermore, based on the comparative analysis between the two structures conducted in this study, we have successfully designed a stronger structure with minimal changes in the coefficient of behavior. Additionally, this design has allowed for greater energy dissipation during seismic events, further enhancing the structure's resilience to earthquakes. By comprehensively examining and reinforcing the vulnerability of buildings and infrastructure, along with implementing advanced risk assessment techniques, we can significantly reduce casualties and damages caused by earthquakes in Iran. The findings of this study offer valuable insights for civil engineering professionals in the field of structural engineering, aiding them in designing safer and more resilient structures.

Keywords: modal pushover analysis, response modification factor, high-strength concrete, concrete shear walls, high-rise building

Procedia PDF Downloads 126
2346 Identification of Membrane Foulants in Direct Contact Membrane Distillation for the Treatment of Reject Brine

Authors: Shefaa Mansour, Hassan Arafat, Shadi Hasan

Abstract:

Management of reverse osmosis (RO) brine has become a major area of research due to the environmental concerns associated with it. This study examined the feasibility of a direct contact membrane distillation (DCMD) system for the treatment of RO brine. The system displayed great potential in terms of flux and salt rejection as different operating conditions, such as feed temperature, feed salinity, and feed and permeate flow rates, were varied. The highest flux of 16.7 LMH was reported with a salt rejection of 99.5%. Although DCMD has displayed potential for enhanced water recovery from highly saline solutions, one of the major drawbacks associated with the operation is the fouling of the membranes, which impairs system performance. An operational run of 77 hours for the treatment of RO brine of 56,500 ppm salinity was performed in order to investigate the impact of membrane fouling on the overall operation of the system over long-term operation. Over this time period, the flux was observed to decrease to roughly a quarter of its initial value. The fouled membrane was characterized through different techniques to identify the organic and inorganic foulants that had deposited on the membrane surface. Infrared (IR) spectroscopy was used to identify the organic foulants, while SEM images displayed the surface characteristics of the membrane. The inorganic foulants were identified using X-ray diffraction (XRD), ion chromatography (IC) and energy dispersive spectroscopy (EDS). The major foulants found on the surface of the membrane were inorganic salts such as sodium chloride and calcium sulfate.

Keywords: brine treatment, membrane distillation, fouling, characterization

Procedia PDF Downloads 421
2345 Indigenous Knowledge and Archaeological Heritage Resources in Lawra, Upper West Region, Ghana

Authors: Christiana Wulty Diku

Abstract:

This research mapped and documented archaeological heritage resources, and the indigenous knowledge associated with them, in Lawra, an understudied municipality in the Upper West Region of Ghana. Since the inception of archaeology as a discipline at the University of Ghana in the 1930s, the Lawra Municipality has rarely been investigated archaeologically. Consequently, indigenes' lack of awareness of the relevance of these resources to national development has led to the destruction of many significant archaeological sites, with agricultural and infrastructural development activities endangering countless others. Drawing on a community archaeology approach, a collaborative archaeological investigation between local groups, communities and professionals (archaeologists) was conducted to recover the lost settlement histories of the municipality and to salvage and protect endangered archaeological heritage resources and sites from agricultural, exploitative and developmental activities. This was geared towards expanding the limited research on northern Ghana and deepening our understanding of the existing symbiotic relationship between people and their heritage resources in past and present times. The study deployed ethnographic, archaeological and physical survey techniques over six field seasons from August 2013 to April 2023. This resulted in the reconstruction of the settlement history of Lawra with chronological dates, the compilation of an inventory of significant archaeological heritage resources and associated indigenous knowledge, the mitigation of endangered archaeological sites and heritage resources through surface collections, and the development of a photographic record with associated metadata for purposes of preservation and future research.

Keywords: archaeological heritage resources, indigenous knowledge, lawra municipality, community archaeology

Procedia PDF Downloads 55
2344 Total Longitudinal Displacement (tLoD) of the Common Carotid Artery (CCA) Does Not Differ between Patients with Moderate or High Cardiovascular Risk (CV) and Patients after Acute Myocardial Infarction (AMI)

Authors: P. Serpytis, K. Azukaitis, U. Gargalskaite, R. Navickas, J. Badariene, V. Dzenkeviciute

Abstract:

Purpose: Total longitudinal displacement (tLoD) of the common carotid artery (CCA) wall is a novel ultrasound marker of vascular function that can be evaluated using modified speckle tracking techniques. Decreased CCA tLoD has already been shown to be associated with diabetes and to predict one-year cardiovascular outcomes in patients with suspected coronary artery disease (CAD). The aim of our study was to evaluate whether CCA tLoD differs between patients with moderate or high cardiovascular (CV) risk and patients after recent acute myocardial infarction (AMI). Methods: 49 patients (54±6 years) with moderate or high CV risk and 42 patients (58±7 years) after recent AMI were included. All patients were non-diabetic. CCA tLoD was evaluated using GE EchoPAC speckle tracking software and expressed as the mean of both sides. Data on systolic blood pressure, total and high density lipoprotein (HDL) cholesterol levels, high sensitivity C-reactive protein (hsCRP) level, smoking status and family history of early CV events were evaluated and assessed for association with CCA tLoD. Results: CCA tLoD did not differ between patients with moderate or high CV risk and patients with very high CV risk after MI (0.265±0.128 mm vs. 0.237±0.103 mm, p>0.05). Lower tLoD was associated with lower HDL cholesterol levels (r=0.211, p=0.04) and male sex (0.228±0.1 vs. 0.297±0.134, p=0.01). Conclusions: CCA tLoD did not differ between patients with moderate or high CV risk and patients with very high CV risk after AMI. However, lower CCA tLoD was significantly associated with low HDL cholesterol levels and male sex.
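
The comparisons reported above correspond to standard statistical tests; a minimal SciPy sketch with synthetic numbers (not the study data) is shown below for readers who want to reproduce the style of analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical tLoD samples (mm) for the two groups -- illustrative only
tlod_moderate_high = rng.normal(0.265, 0.128, size=49)
tlod_post_ami      = rng.normal(0.237, 0.103, size=42)

# Group comparison (Welch's t-test, unequal variances)
t_stat, p_value = stats.ttest_ind(tlod_moderate_high, tlod_post_ami,
                                  equal_var=False)
print(f"group difference: t = {t_stat:.2f}, p = {p_value:.3f}")

# Association between tLoD and HDL cholesterol (Pearson r), again synthetic
hdl = rng.normal(1.3, 0.3, size=91)
tlod_all = 0.2 * hdl + rng.normal(0.0, 0.1, size=91)
r, p = stats.pearsonr(hdl, tlod_all)
print(f"tLoD vs HDL: r = {r:.3f}, p = {p:.3f}")
```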

Keywords: total longitudinal displacement, carotid artery, cardiovascular risk, acute myocardial infarction

Procedia PDF Downloads 374
2343 An Investigation of Customer Relationship Management of Tourism

Authors: Wanida Suwunniponth

Abstract:

This research paper aimed to develop a causal relationship model of the success factors of customer relationship management (CRM) in tourism in Thailand and to investigate the relationships among the potential factors that facilitate the success of CRM. The research used both quantitative and qualitative methods, utilizing a questionnaire and in-depth interviews. The questionnaire was used to collect data from 250 management staff in hotels located in the Bangkok area. Sampling techniques used in this research included cluster sampling according to service quality and simple random sampling. The data were analyzed using descriptive statistics and structural equation modeling (SEM). The research findings identified the factors accentuated by most respondents as important to the success of CRM, namely organization, people, information technology and the CRM process. Moreover, customer relationship management in the tourism business in Thailand was found to be successful at a very significant level. The hypothesis testing showed that the hypotheses were accepted: the factors concerning organization, people and information technology influenced both the process and the success of customer relationship management, while the CRM process factor in turn influenced its success. The findings suggest that tourism businesses in Thailand implementing CRM should pursue improvements in managerial structure, build a customer-centred corporate culture, and invest in information technology and customer analysis, in order to achieve a more efficient CRM process resulting in customer satisfaction and service retention.

Keywords: customer relationship management, causal relationship model, tourism, Thailand

Procedia PDF Downloads 317
2342 Study on the Effect of Weather Variables on the Spider Abundance in Two Ecological Zones of Ogun State, Nigeria

Authors: Odejayi Adedayo Olugbenga, Aina Adebisi

Abstract:

Weather variables (rainfall and temperature) affect the diversity and abundance of both fauna and flora species. This study compared weather variables with spider abundance in two ecological zones of Ogun State, Nigeria, namely Ago-Iwoye (rainforest) in the Ijebu axis and Aiyetoro (derived savannah) in the Yewa axis. Seven study sites chosen by simple random sampling in each ecosystem were used for the study. In each sampling area, a 60 m x 120 m land area was marked and sampled; spider collection techniques were hand picking, sweep netting, and pitfall traps. Adult spiders were identified to the species level. Species richness was estimated by a non-parametric species estimator, the diversity of spider species was assessed by the Simpson Diversity Index, and species richness was compared by one-way analysis of variance. Results revealed that spiders were more abundant in the rainforest zone than in the derived savannah ecosystem; however, the patterns of spider abundance in the rainforest zone and residential areas were similar. Spider activity tended to increase at higher temperatures according to this study. In contrast, results showed a negative correlation between rainfall and spider species abundance, in addition to a weak negative correlation between rainfall and species richness. It was concluded that heavy downpours have lethal effects on immature and sometimes mature spiders, which could lead to the extinction of some unknown spider species. Tree planting should be encouraged, as this shelters spiders.

Keywords: spider, abundance, species richness, species diversity

Procedia PDF Downloads 70
2341 Exclusive Value Adding by iCenter Analytics on Transient Condition

Authors: Zhu Weimin, Allegorico Carmine, Ruggiero Gionata

Abstract:

Decades of Baker Hughes (BH) iCenter experience have demonstrated that, in addition to conventional insights on equipment steady operating conditions, insights on transient conditions can add significant and exclusive value for anomaly detection, downtime saving, and predictive maintenance. Our work presents examples from the BH iCenter experience to introduce the advantages and features of using transient condition analytics: (i) operation under critical engine conditions, e.g., high levels or high rates of change of temperature, pressure, flow, vibration, etc., that would not be reached in normal operation; (ii) management of dedicated sub-systems or components, many of which are often bottlenecks for reliability and maintenance; (iii) indirect detection of anomalies in the absence of instrumentation; (iv) repetitive sequences: if data is properly processed, the engineering features of transients provide not only anomaly detection but also problem characterization and prognostic indicators for predictive maintenance; (v) engine variables accounting for fatigue analysis. iCenter has been developing and deploying a series of analytics based on transient conditions. They contribute exclusive added value in the following areas: (i) reliability improvement, (ii) startup reliability improvement, (iii) predictive maintenance, (iv) repair/overhaul cost reduction. Illustrative examples for each of the above areas are presented in our study, focusing on challenges and adopted techniques ranging from purely statistical approaches to the implementation of machine learning algorithms. The obtained results demonstrate how value is obtained using transient condition analytics in the BH iCenter experience.

Keywords: analytics, diagnostics, monitoring, turbomachinery

Procedia PDF Downloads 58
2340 A Comparative Evaluation of Stone Spout Management Systems in Heritage and Non-heritage Areas of the Kathmandu Valley, Nepal

Authors: Mira Tripathi, Ken Hughey, Hamish G. Rennie

Abstract:

Management of water resources is a major challenge throughout the world and in many long-established societies people still use traditional water harvesting and management techniques. Despite often being seen as efficient and cost effective, traditional methods are in decline or have been abandoned in many countries. Nevertheless, traditional approaches continue to be useful in some countries such as Nepal. The extent to which such traditional measures, in this case via stone spouts, may survive modernization, while fulfilling socio-cultural, tourism, and other needs is the focus of the research. The research develops an understanding of the socio-cultural, tourism and other values of stone spouts for the people of urban and peri-urban heritage and non-heritage areas of the Kathmandu Valley to help ongoing sustainable management of remaining spouts. Three research questions are addressed: the impacts of changes in social and cultural norms and values; development activities; and, the incremental and ongoing loss of traditional stone spout infrastructure. A meta-theory framework has been developed which synthesizes Institutional, Attachment, Central Place and Common Property theories, which form analytical lenses for the mixed-method research approach. From the exploration of the meta-theory approach, it was found that no spouts are in pristine condition but those in non-heritage areas are in better condition than those in heritage areas. “Utility value” is the main driver that still motivates people to conserve spouts.

Keywords: stone spouts, social and cultural norms and values, meta-theory, Kathmandu Valley

Procedia PDF Downloads 298
2339 Highly Efficient Iron Oxide-Sulfonated Graphene Oxide Catalyst for Esterification and Trans-Esterification Reactions

Authors: Reena D. Souza, Tripti Vats, Prem F. Siril

Abstract:

Esterification of free fatty acid (oleic acid) and transesterification of waste cooking oil (WCO) with ethanol over graphene oxide (GO), GO-Fe2O3, sulfonated GO (GO-SO3H), and Fe2O3/GO-SO3H catalysts were examined in the present study. The iron oxide supported graphene-based acid catalyst (Fe2O3/GO-SO3H) exhibited the highest catalytic activity. GO was prepared by a modified Hummers' process. The GO-Fe2O3 nanocomposites were prepared by the addition of NaOH to a solution containing GO and FeCl3. Sulfonation was done using concentrated sulfuric acid. Transmission electron microscopy (TEM) and atomic force microscopy (AFM) imaging revealed the presence of Fe2O3 particles with sizes in the range of 50-200 nm. The crystal structure was analyzed by XRD, and the defect states of graphene were characterized using Raman spectroscopy. The effects of reaction variables such as catalyst loading, ethanol to acid ratio, reaction time and temperature on the conversion of fatty acids were studied. The optimum conditions for the esterification process were a 12:1 molar ratio of alcohol to oleic acid with 5 wt% of Fe2O3/GO-SO3H at 100 °C and a reaction time of 4 h, yielding 99% ethyl oleate. This is because metal oxide supported solid acid catalysts have the advantage of possessing both strong Brønsted and Lewis acid properties. The biodiesel obtained by transesterification of WCO was characterized by 1H NMR and gas chromatography techniques. XRD patterns of the recycled catalyst showed that the catalyst structure was unchanged up to the 5th cycle, indicating the long life of the catalyst.

Keywords: Fe₂O₃/GO-SO₃H, Graphene Oxide, GO-Fe₂O₃, GO-SO₃H, WCO

Procedia PDF Downloads 260
2338 Dimension of Water Accessibility in the Southern Part of Niger State, Nigeria

Authors: Kudu Dangana, Pai H. Halilu, Osesienemo R. Asiribo-Sallau, Garba Inuwa Kuta

Abstract:

The study examined the determinants of household water accessibility in the southern part of Niger State, Nigeria. Data for the study were obtained from primary and secondary sources using questionnaires, interviews, personal observation and documents. 1,192 questionnaires were administered; the sampling techniques adopted were a combination of purposive, stratified and simple random sampling. Purposive sampling was used to determine the sample frame, the sample unit was determined using stratified sampling, and simple random sampling was used in administering the questionnaires. The results were analyzed within the scope of the WHO water accessibility indicators using descriptive statistics. The major sources of water in the area are wells, hand and electric pump boreholes, and streams; these sources account for over 90% of household water. Average per capita water consumption in the area is 22 liters per day, while the location efficiency of facilities revealed an average of 80 people per borehole. Household water accessibility is affected mainly by distance, the time spent obtaining water, the low income status of the majority of respondents, which limits access to modern water infrastructure, and to a lesser extent household size. Recommendations include that all tiers of government intensify efforts to provide water infrastructure and maintain existing facilities through budgetary provisions, and that communities organize fund-raising bazaars to improve water infrastructure in the area.

Keywords: accessibility, determined, stratified, scope

Procedia PDF Downloads 377
2337 Self-Determination among Individuals with Intellectual Disability: An Experiment

Authors: Wasim Ahmad, Bir Singh Chavan, Nazli Ahmad

Abstract:

Objectives: The present investigation attempts to determine the efficacy of training special educators in promoting self-determination among individuals with intellectual disability. Methods: The study equipped special educators with the necessary skills and knowledge to train individuals with intellectual disability to practice self-determination. Subjects: Special educators (N=25) were selected for training on self-determination among individuals with intellectual disability. After receiving the training, they intervened with selected individuals with intellectual disability (N=50). Tool: The Self-Determination Scale for Adults with Mild Mental Retardation (SDSAMR), developed by Keshwal and Thressiakutty (2010), was used; it is a reliable and valid tool used by many researchers. It has 36 items distributed across five domains, namely personal management, community participation, recreation and leisure time, choice making and problem solving. Analysis: The collected data were analyzed using statistical techniques such as the t-test, ANCOVA, and the post-hoc Tukey test. Results: The findings of the study reveal a significant difference at the 1% level between the pre- and post-test mean scores (t = 15.56) on self-determination concepts among the special educators, indicating that the training enhanced the special educators' understanding of self-determination among individuals with intellectual disability. The study also reveals that the training received by the special educators on transition planning was effective in practice, because they were able to apply the concept by training individuals with intellectual disability to be self-determined: there was a significant difference at the 1% level between the pre- and post-test mean scores (t = 16.61) on self-determination among individuals with intellectual disability. Conclusion: The training had a remarkable impact on the performance of individuals with intellectual disability in self-determination.

Keywords: experiment, individuals with intellectual disability, self-determination, special educators

Procedia PDF Downloads 321
2336 Thermally Stable Nanocrystalline Aluminum Alloys Processed by Mechanical Alloying and High Frequency Induction Heat Sintering

Authors: Hany R. Ammar, Khalil A. Khalil, El-Sayed M. Sherif

Abstract:

As-received metal powders were used to synthesize bulk nanocrystalline Al, Al-10%Cu, and Al-10%Cu-5%Ti alloys using mechanical alloying and high frequency induction heat sintering (HFIHS). The current study investigated the influence of milling time and ball-to-powder weight ratio (BPR) on the microstructural constituents and mechanical properties of the processed materials. Powder consolidation was carried out using high frequency induction heat sintering, in which the processed metal powders were sintered into a dense and strong bulk material. The sintering conditions applied in this process were as follows: heating rate of 350°C/min; sintering time of 4 minutes; sintering temperature of 400°C; applied pressure of 750 kgf/cm2 (100 MPa); cooling rate of 400°C/min; the process was carried out under a vacuum of 10⁻³ Torr. The powders and the bulk samples were characterized using XRD and FEGSEM techniques. The mechanical properties were evaluated at temperatures of 25°C, 100°C, 200°C, 300°C and 400°C to study the thermal stability of the processed alloys. The bulk nanocrystalline Al, Al-10%Cu, and Al-10%Cu-5%Ti alloys displayed extremely high hardness values even at elevated temperatures. The Al-10%Cu-5%Ti alloy displayed the highest hardness values at room and elevated temperatures, which is related to the presence of Ti-containing phases such as Al3Ti and AlCu2Ti; these phases are thermally stable and retain their high hardness values at elevated temperatures up to 400°C.

Keywords: nanocrystalline aluminum alloys, mechanical alloying, hardness, elevated temperatures

Procedia PDF Downloads 441
2335 Deployment of Beyond 4G Wireless Communication Networks with Carrier Aggregation

Authors: Bahram Khan, Anderson Rocha Ramos, Rui R. Paulo, Fernando J. Velez

Abstract:

With the growing demand for a new blend of applications, users' dependency on the internet is increasing day by day. Mobile internet users are giving more attention to their own experience, especially in terms of communication reliability, high data rates and service stability on the move. This increase in demand is causing saturation of the existing radio frequency bands. To address these challenges, researchers are investigating the best approaches. Carrier Aggregation (CA) is one of the newest innovations that seems able to fulfill future spectrum demands, and it is one of the most important features of Long Term Evolution - Advanced (LTE-Advanced). To meet the upcoming International Mobile Telecommunication Advanced (IMT-Advanced) requirements (1 Gb/s peak data rate), 3GPP introduced the CA scheme, which can sustain high data rates using aggregated frequency bandwidth of up to 100 MHz. Technical issues such as the aggregation structure, its implementations, deployment scenarios, control signaling techniques, and challenges for the CA technique in LTE-Advanced, with consideration of backward compatibility, are highlighted in this paper. A performance evaluation in macro-cellular scenarios using a simulation approach is also presented; it shows the benefits of applying CA and low-complexity multi-band schedulers for service quality and system capacity enhancement, and concludes that the enhanced multi-band scheduler is less complex than the general multi-band scheduler while performing better for cell radii longer than 1800 m (at a PLR threshold of 2%).

Keywords: component carrier, carrier aggregation, LTE-advanced, scheduling

Procedia PDF Downloads 176
2334 Cache Analysis and Software Optimizations for Faster on-Chip Network Simulations

Authors: Khyamling Parane, B. M. Prabhu Prasad, Basavaraj Talawar

Abstract:

Fast simulations are critical in reducing time to market for CMPs and SoCs. Several simulators have been used to evaluate the performance and power consumption of Networks-on-Chip, and researchers and designers rely on these simulators for design space exploration of NoC architectures. Our experiments show that simulating large NoC topologies takes hours to several days to complete. To speed up the simulations, it is necessary to investigate and optimize the hotspots in the simulator source code. Among the several simulators available, we chose Booksim2.0, as it is extensively used in the NoC community. In this paper, we analyze the cache and memory system behaviour of Booksim2.0 to accurately monitor input-dependent performance bottlenecks. Our measurements show that cache and memory usage patterns vary widely based on the input parameters given to Booksim2.0. Based on these measurements, the cache configuration with the fewest misses was identified. To further reduce cache misses, we use software optimization techniques such as removal of unused functions, loop interchange, and replacing the post-increment operator with the pre-increment operator for non-primitive data types; these reduced the cache misses by 18.52%, 5.34% and 3.91%, respectively. We also employ thread parallelization and vectorization to improve the overall performance of Booksim2.0. The OpenMP programming model and SIMD are used for parallelizing and vectorizing the more time-consuming portions of Booksim2.0. Speedups of 2.93x and 3.97x were observed for the Mesh topology with a 30 × 30 network size by employing thread parallelization and vectorization, respectively.
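
The cache-locality principle behind loop interchange can be illustrated even outside C++; the small Python/NumPy timing sketch below contrasts contiguous (row-wise) and strided (column-wise) traversal of a row-major array. Booksim2.0 itself is C++, so this is only an analogy for the access-pattern effect that the optimization exploits.

```python
import time
import numpy as np

# Loop interchange in a nutshell: traverse memory in the order it is laid out.
# For a C-ordered (row-major) array, row-by-row sums touch contiguous cache
# lines; column-by-column sums stride through memory and miss far more often.
a = np.ones((4000, 4000))          # row-major by default

def sum_by_rows(m):
    total = 0.0
    for row in m:                  # contiguous access
        total += row.sum()
    return total

def sum_by_cols(m):
    total = 0.0
    for col in m.T:                # strided access (one element per row of m)
        total += col.sum()
    return total

for fn in (sum_by_rows, sum_by_cols):
    t0 = time.perf_counter()
    fn(a)
    print(fn.__name__, f"{time.perf_counter() - t0:.3f} s")
```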

Keywords: cache behaviour, network-on-chip, performance profiling, vectorization

Procedia PDF Downloads 180
2333 Salicylic Acid Signalling in Relation to Root Colonization in Rice

Authors: Seema Garcha, Sheetal Chopra, Navraj Sarao

Abstract:

Plant hormones play a role in internal colonization by beneficial microbes and in systemic acquired resistance. They define the qualitative and quantitative nature of the root microbiome and also influence the dynamics of rhizospheric soil. The present study is an attempt to relate salicylic acid (a signal molecule) content to the qualitative nature of root endophytes at various stages in the growth of rice varieties of commercial value: Parmal 121 and Basmati 1121. Root seedlings of these varieties were raised using tissue culture techniques and then transplanted in the fields. Cultivation followed conventional agricultural methods. The field soil contained 0.39% N, 75.12 kg/hectare of phosphorus and 163.0 kg/hectare of potassium. Microfloral profiling of the root tissue was done using selective microbiological media. The salicylic acid content was estimated by HPLC (Agilent 1100 Series). The salicylic acid level of Basmati 1121 remained relatively low at the time of transplant and 90 days after transplant, and increased marginally at 60 days. A similar trend was observed for Parmal 121; however, the Parmal variety recorded 0.935 µg/g of salicylic acid at 60 days after transplant. Salicylic acid content decreased after 90 days as both rice varieties remained disease free. The endophytic root microflora was established by 60 days after transplant in both varieties, after which their populations remained constant. Rhizobium spp. dominated over Azotobacter spp. Genetic profiling of the endophytes for nitrogen-fixing ability is underway.

Keywords: plant-microbe interaction, rice, root microbiome, salicylic acid

Procedia PDF Downloads 186