Search results for: hybrid ventilation techniques
2385 Breaking the Barrier of Service Hostility: A Lean Approach to Achieve Operational Excellence
Authors: Mofizul Islam Awwal
Abstract:
Due to globalization, industries are rapidly growing throughout the world which leads to many manufacturing organizations. But recently, service industries are beginning to emerge in large numbers almost in all parts of the world including some developing countries. In this context, organizations need to have strong competitive advantage over their rivals to achieve their strategic business goals. Manufacturing industries are adopting many methods and techniques in order to achieve such competitive edge. Over the last decades, manufacturing industries have been successfully practicing lean concept to optimize their production lines. Due to its huge success in manufacturing context, lean has made its way into the service industry. Very little importance has been addressed to service in the area of operations management. Service industries are far behind than manufacturing industries in terms of operations improvement. It will be a hectic job to transfer the lean concept from production floor to service back/front office which will obviously yield possible improvement. Service processes are not as visible as production processes and can be very complex. Lack of research in this area made it quite difficult for service industries as there are no standardized frameworks for successfully implementing lean concept in service organization. The purpose of this research paper is to capture the present scenario of service industry in terms of lean implementation. Thorough analysis of past literature will be done on the applicability and understanding of lean in service structure. Classification of research papers will be done and critical factors will be unveiled for implementing lean in service industry to achieve operational excellence.Keywords: lean service, lean literature classification, lean implementation, service industry, service excellence
Procedia PDF Downloads 375
2384 Beating Heart Coronary Artery Bypass Grafting on Intermittent Pump Support
Authors: Sushil Kumar Singh, Vivek Tewarson, Sarvesh Kumar, Shobhit Kumar
Abstract:
Objective: ‘Beating heart coronary artery bypass grafting on intermittent pump support’ is a more reliable method of coronary revascularization that combines the advantages of off-pump and on-pump CABG while eliminating the disadvantages of both techniques. Methods: From January 2015 to December 2021, a new technique, “intermittent on-pump beating heart CABG”, using a suction stabilizer was applied, with aortic and venous cannulas placed electively in all patients. Patients were supported by the pump intermittently, as and when required (Group 1, n=254). Retrospective data were collected from our records of patients who underwent elective off-pump CABG performed by the same surgeon and team (Group 2, n=254). Results: A significant advantage was noted in Group 1 patients in terms of the number of grafts (3.31 ± 1.16 vs. 2.30 ± 0.66), grafting of lateral vessels (316 vs. 202), mean operating time (1.37 ± 0.23 hrs vs. 2.22 ± 0.45 hrs) and postoperative blood loss (406.30 ± 257.90 ml vs. 567.41 ± 265.20 ml). CPB support time was less than 15 minutes in the majority of patients (n=179, 70.37%), with a mean of 16.81 minutes; it was required particularly during the grafting of lateral vessels. A rise in enzyme and biomarker levels (CRP, CK-MB, Trop I, and NT-proBNP) was noted in Group 1 patients, but this did not affect their postoperative course. There was no mortality in Group 1, while four patients in Group 2 died. Conclusions: The intermittent on-pump CABG technique is a promising method of surgical revascularization for all patients requiring CABG. It has shown its superiority in terms of safety, number of grafts, operating time, and perioperative course.
Keywords: cardiopulmonary bypass, CABG, beating heart CABG, on-pump CABG
Procedia PDF Downloads 120
2383 Selective Oxidation of 6Mn-2Si Advanced High Strength Steels during Intercritical Annealing Treatment
Authors: Maedeh Pourmajidian, Joseph R. McDermid
Abstract:
Advanced high strength steels (AHSS) are revolutionizing both the steel and automotive industries due to their high specific strength and ability to absorb energy during crash events. This allows manufacturers to design vehicles with significantly increased fuel efficiency without compromising passenger safety. To maintain the structural integrity of the fabricated parts, they must be protected from corrosion damage through the continuous hot-dip galvanizing process, which is challenging due to the selective oxidation of Mn and Si on the surface of these AHSSs. The effects of process atmosphere oxygen partial pressure and small additions of Sn on the selective oxidation of a medium-Mn C-6Mn-2Si advanced high strength steel were investigated. Intercritical annealing heat treatments were carried out at 690˚C in an N2-5%H2 process atmosphere under dew points ranging from –50˚C to +5˚C. Surface oxide chemistries, morphologies, and thicknesses were determined at a variety of length scales by several techniques, including SEM, TEM+EELS, and XPS. TEM observations of the sample cross-sections revealed the transition to internal oxidation at the +5˚C dew point. EELS results suggested that the internal oxide network was composed of a multi-layer oxide structure whose chemistry varied from the oxide core towards the outer part. The combined effect of employing a known surface-active element, as a function of process atmosphere, on surface structure development and the possible impact on reactive wetting of the steel substrates by the continuous galvanizing zinc bath will be discussed.
Keywords: 3G AHSS, hot-dip galvanizing, oxygen partial pressure, selective oxidation
Procedia PDF Downloads 398
2382 Impact of Television on the Coverage of Lassa Fever Disease in Nigeria
Authors: H. Shola Adeosun, F. Ajoke Adebiyi
Abstract:
This study appraises the impact of television on the coverage of Lassa fever disease. The objectives of the study are to find out whether television is an effective tool for raising awareness about Lassa fever and whether it shapes the perception of members of the public. The research work was based on the theoretical foundations of agenda-setting theory and reinforcement theory. A survey research method was adopted to elicit data from residents of Obafemi Owode Local Government Area of Ogun State. A questionnaire and oral interviews were adopted as tools for data gathering, and simple random sampling techniques were used to draw the sample for this study. Out of the 400 questionnaires distributed to the respondents, 37 were incorrectly filled; the rest, about 92.5%, were correctly filled and returned within the stipulated time. Tables, percentages, and figures were used to analyse and interpret the data, and the hypothesis testing for this study revealed that diseases with higher media coverage were considered more serious and more representative, and were estimated to have lower incidence, than diseases less frequently covered in the media. Thus, 92% of the respondents agreed that the television coverage of Lassa fever disease they had access to led to exaggerated perceptions of personal vulnerability. The study therefore concludes that there is a need for relevant stakeholders to ensure better community health education and improved housing conditions in southwestern Nigeria, with an emphasis on slum areas, and that Nigeria needs to focus on the immediate response while preparing for the future, because a society or community is all about the people who inhabit it; every effort must therefore be geared towards their survival.
Keywords: impact, television, coverage, Lassa fever disease
Procedia PDF Downloads 212
2381 Depositional Environment and Source Potential of Devonian Source Rock, Ghadames Basin, Southern Tunisia
Authors: S. Mahmoudi, A. Belhaj Mohamed, M. Saidi, F. Rezgui
Abstract:
Depositional environment and source potential of the different organic rich levels of Devonian age (up to 990m thick) from the onshore EC-1 well (Southern Tunisia) were investigated using different geochemical techniques (Rock-Eval pyrolysis, GC-MS) of over than 130 cutting samples. The obtained results including Rock Eval Pyrolysis data and biomarker distribution (terpanes, steranes and aromatics) have been used to describe the depositional environment and to assess the thermal maturity of the Devonian organic matter. These results show that the Emsian deposits exhibit poor to fair TOC contents. The associated organic matter is composed of mixed kerogen (type II/III), as indicated by the predominance of C29 steranes over C27 and C28 homologous, that was deposited in a slightly reduced environment favoring organic matter preservation. Thermal maturity assessed from Tmax, TNR and MPI-1 values shows a mature stage of organic matter. The Middle Devonian (Eifelian) shales are rich in type II organic matter that was deposited in an open marine depositional environment. The TOC values are high and vary between 2 and 7 % indicating good to excellent source rock. The relatively high IH values (reaching 547 mg HC/g TOC) and the low values of t19/t23 ratio (down to 0.2) confirm the marine origin of the organic matter (type II). During the Upper Devonian, the organic matter was deposited under variable redox conditions, oxic to suboxic which is clearly indicated by the low C35/C34 hopanes ratio, immature to marginally mature with the vitrinite reflectance ranging from 0.5 to 0.7 Ro and Tmax value of 426°C-436 °C and the TOC values range between 0.8% to 4%.Keywords: biomarker, depositional environment, devonian, source rock
Procedia PDF Downloads 474
2380 Asymptotic Analysis of the Viscous Flow through a Pipe and the Derivation of the Darcy-Weisbach Law
Authors: Eduard Marusic-Paloka
Abstract:
The Darcy-Weisbach formula is used to compute the pressure drop of the fluid in a pipe due to friction against the wall. Because of its simplicity, the Darcy-Weisbach formula became widely accepted by engineers and is used for laminar as well as turbulent flows through pipes, once the method to compute the mysterious friction coefficient was derived, particularly in the second half of the 20th century. The formula is empirical, and our goal is to derive it from the basic conservation laws via rigorous asymptotic analysis. We consider the case of laminar flow but with a significant Reynolds number. In the case of a perfectly smooth pipe, the situation is trivial, as the Navier-Stokes system can be solved explicitly via the Poiseuille formula, leading to the friction coefficient in the form 64/Re. For the rough pipe, the situation is more complicated and some effects of the roughness appear in the friction coefficient. We start from the Navier-Stokes system in a pipe with a periodically corrugated wall and derive an asymptotic expansion for the pressure and the velocity. We use homogenization techniques and boundary layer analysis. The approximation derived by formal analysis is then justified by a rigorous error estimate in the norm of the appropriate Sobolev space, using the energy formulation and classical a priori estimates for the Navier-Stokes system. Our method leads to a formula for the friction coefficient. The formula involves the resolution of appropriate boundary layer problems, namely boundary value problems for the Stokes system in an infinite band, which need to be solved numerically. However, a theoretical analysis characterising their nature can be done without solving them.
Keywords: Darcy-Weisbach law, pipe flow, rough boundary, Navier law
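For reference, the two relations the abstract builds on can be written as follows; this is the standard textbook form rather than notation taken from the paper:

```latex
\Delta p \;=\; f\,\frac{L}{D}\,\frac{\rho\,\bar{u}^{2}}{2},
\qquad
f_{\text{laminar, smooth}} \;=\; \frac{64}{\mathrm{Re}},
\qquad
\mathrm{Re} \;=\; \frac{\rho\,\bar{u}\,D}{\mu},
```

where $\Delta p$ is the pressure drop over a pipe of length $L$ and diameter $D$, $\bar{u}$ the mean velocity, $\rho$ the density and $\mu$ the dynamic viscosity; the rough-wall analysis in the paper derives corrections to $f$ beyond this smooth-pipe Poiseuille value.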
Procedia PDF Downloads 353
2379 Analysis of Long-term Results After External Dacryocystorhinostomy Surgery in Patients Suffered from Diabetes Mellitus
Authors: N. Musayeva, N. Rustamova, N. Bagirov, S. Ibadov
Abstract:
Purpose: to analyze the long-term results of external dacryocystorhinostomy (DCR), which remains the preferred primary procedure in the surgical treatment of lacrimal duct obstruction in chronic dacryocystitis. Methodology: long-term results of external DCR (after 3 years) performed on 90 patients (90 eyes) with chronic dacryocystitis from 2018 to 2020 were evaluated at the Azerbaijan National Center of Ophthalmology named after acad. Zarifa Aliyeva. Fifteen of the patients were men and 75 were women. The average age was 45±3.2 years. Surgical operations were performed under local anesthesia. All patients had suffered from diabetes mellitus for more than 3 years. All patients underwent external DCR, and a silicone drainage tube was implanted. In the postoperative period (after 3 years), lacrimation, purulent discharge, and the condition of the scar at the operation site were assessed. Results: All patients were under observation for more than 18 months. In general, the effectiveness of the surgical operation was 93.34%. Recurrence of the disease was observed in 6 patients, and in 3 patients (3.33%) the scar at the site of the operation was rough (non-cosmetic). In 3 patients (3.33%), the surgically formed anastomosis between the lacrimal sac and the nasal bone was obstructed by scar tissue. These patients were reoperated by transcanalicular laser DCR. Conclusion: Despite the long-term (more than a hundred years) use of external DCR, it remains one of the primary techniques in the surgery of chronic dacryocystitis. Due to the high success rate and good long-term results of DCR in the treatment of chronic dacryocystitis in patients suffering from diabetes mellitus, we recommend external DCR for this group of patients.
Keywords: chronic dacryocystitis, diabetes mellitus, external dacryocystorhinostomy, long-term results
Procedia PDF Downloads 68
2378 Risk Analysis of Leaks from a Subsea Oil Facility Based on Fuzzy Logic Techniques
Authors: Belén Vinaixa Kinnear, Arturo Hidalgo López, Bernardo Elembo Wilasi, Pablo Fernández Pérez, Cecilia Hernández Fuentealba
Abstract:
The expanded use of risk assessment in legislative and corporate decision-making has increased the role of expert judgement in providing data for safety-related decision-making. Expert judgements are required in most steps of risk assessment: hazard identification, risk estimation, risk evaluation, and examination of options. This paper presents a fault tree analysis (FTA), i.e., a probabilistic failure analysis, applied to leakage of oil in a subsea production system. In standard FTA, the failure probabilities of the items of a system are treated as exact values while evaluating the failure probability of the top event. However, there is a persistent lack of data for estimating the failure probabilities of components within the drilling industry. Therefore, fuzzy theory can be used as a solution to this issue. The aim of this paper is to examine the leaks from the Zafiro West subsea oil facility by using fuzzy fault tree analysis (FFTA). As a result, the research has made theoretical and practical contributions to maritime safety and environmental protection. FTA has also traditionally been an effective strategy for identifying hazards in nuclear installations and the power industry.
Keywords: expert judgment, probability assessment, fault tree analysis, risk analysis, oil pipelines, subsea production system, drilling, quantitative risk analysis, leakage failure, top event, off-shore industry
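As an illustration of the fuzzy fault-tree arithmetic described above, the sketch below propagates triangular fuzzy failure probabilities through AND/OR gates; the event names and numerical values are hypothetical and are not taken from the Zafiro West analysis.

```python
from dataclasses import dataclass
from math import prod

@dataclass
class TriFuzzy:
    """Triangular fuzzy probability (low, mid, high)."""
    low: float
    mid: float
    high: float

def gate_and(events):
    # Intersection of independent basic events: probabilities multiply.
    return TriFuzzy(prod(e.low for e in events),
                    prod(e.mid for e in events),
                    prod(e.high for e in events))

def gate_or(events):
    # Union of independent basic events: 1 - prod(1 - p); the expression is
    # monotone in each p, so the triangular bounds combine component-wise.
    return TriFuzzy(1 - prod(1 - e.low for e in events),
                    1 - prod(1 - e.mid for e in events),
                    1 - prod(1 - e.high for e in events))

# Hypothetical basic events for an oil-leak fault tree (illustrative values only).
seal_failure  = TriFuzzy(1e-4, 5e-4, 1e-3)
valve_failure = TriFuzzy(2e-4, 8e-4, 2e-3)
sensor_missed = TriFuzzy(1e-2, 3e-2, 6e-2)

# Top event: a mechanical fault occurs AND the monitoring system misses it.
mechanical = gate_or([seal_failure, valve_failure])
top_event = gate_and([mechanical, sensor_missed])
print(top_event)  # fuzzy probability of the uncontained-leak top event
```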
Procedia PDF Downloads 190
2377 Cognitive Model of Analogy Based on Operation of the Brain Cells: Glial, Axons and Neurons
Authors: Ozgu Hafizoglu
Abstract:
Analogy is an essential tool of human cognition that enables connecting diffuse and diverse systems with attributional, deep structural, causal relations that are essential to learning, to innovation in artificial worlds, and to discovery in science. The Cognitive Model of Analogy (CMA) guides and creates information-pattern transfer within and between domains and disciplines in science. This paper demonstrates the Cognitive Model of Analogy (CMA) as an evolutionary approach to scientific research. The model puts forward the challenges of deep uncertainty about the future, emphasizing the need for flexibility of the system in order to enable the reasoning methodology to adapt to changing conditions. In this paper, the model of analogical reasoning is created based on brain cells and their fractal and operational forms within the system itself. Visualization techniques are used to show correspondences. The distinct phases of the problem-solving process are divided as follows: encoding, mapping, inference, and response. The system is shown to be relevant to brain activation, considering each of these phases with an emphasis on achieving a better visualization of the brain cells: glial cells, axons, axon terminals, and neurons, relative to the matching conditions of analogical reasoning and relational information. It is found that the encoding, mapping, inference, and response processes in four-term analogical reasoning correspond with the fractal and operational forms of brain cells: glia, axons, and neurons.
Keywords: analogy, analogical reasoning, cognitive model, brain and glials
Procedia PDF Downloads 185
2376 Relay-Augmented Bottleneck Throughput Maximization for Correlated Data Routing: A Game Theoretic Perspective
Authors: Isra Elfatih Salih Edrees, Mehmet Serdar Ufuk Türeli
Abstract:
In this paper, an energy-aware method is presented, integrating energy-efficient relay-augmented techniques for correlated data routing with the goal of optimizing bottleneck throughput in wireless sensor networks. The system tackles the dual challenge of maximizing throughput while limiting sensor network energy consumption. A unique routing metric has been developed to enable throughput maximization while minimizing energy consumption by utilizing data correlation patterns. The paper introduces a game-theoretic framework to address the NP-complete optimization problem inherent in throughput-maximizing, correlation-aware routing under energy limitations. By creating an algorithm that blends energy-aware route selection strategies with best-response dynamics, this framework provides a local solution. The suggested technique considerably raises the bottleneck throughput for each source in the network while reducing energy consumption by choosing routes that strike a compromise between throughput enhancement and energy efficiency. Extensive numerical analyses verify the efficiency of the method. The outcomes demonstrate the significant decrease in energy consumption attained by the energy-efficient relay-augmented bottleneck throughput maximization technique, in addition to confirming the anticipated throughput benefits.
Keywords: correlated data aggregation, energy efficiency, game theory, relay-augmented routing, throughput maximization, wireless sensor networks
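A minimal sketch of the kind of best-response iteration described above: each source repeatedly switches to the candidate route that maximizes a bottleneck-throughput-minus-energy utility given the other sources' current choices, stopping when no source can improve. The topology, link capacities and weighting below are invented for illustration and are not the paper's network model.

```python
# Hypothetical candidate routes (tuples of link ids), link capacities and
# a per-hop energy cost -- none of these values come from the paper.
routes = {
    "s1": [("a", "c"), ("b", "c")],
    "s2": [("a", "d"), ("b", "d")],
    "s3": [("b", "c"), ("b", "d")],
}
capacity = {"a": 10.0, "b": 8.0, "c": 12.0, "d": 6.0}
energy_per_hop = 1.0
lam = 0.1  # weight trading throughput against energy

def utility(route, choice):
    # Links are shared equally by the sources routed over them.
    load = {link: 0 for link in capacity}
    for r in choice.values():
        for link in r:
            load[link] += 1
    bottleneck = min(capacity[link] / load[link] for link in route)
    return bottleneck - lam * energy_per_hop * len(route)

def best_response_dynamics(routes, max_rounds=50):
    choice = {s: candidates[0] for s, candidates in routes.items()}  # arbitrary start
    for _ in range(max_rounds):
        changed = False
        for s, candidates in routes.items():
            best = max(candidates, key=lambda r: utility(r, {**choice, s: r}))
            if best != choice[s]:
                choice[s], changed = best, True
        if not changed:  # no source can improve: a local (Nash-like) equilibrium
            break
    return choice

print(best_response_dynamics(routes))
```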
Procedia PDF Downloads 82
2375 Impacts of Aquaculture Farms on the Mangroves Forests of Sundarbans, India (2010-2018): Temporal Changes of NDVI
Authors: Sandeep Thakur, Ismail Mondal, Phani Bhusan Ghosh, Papita Das, Tarun Kumar De
Abstract:
Sundarbans Reserve Forest of India has been undergoing major transformations in the recent past owing to population pressure and related changes. This has brought about major changes in the spatial landscape of the region, especially in the western parts. This study attempts to assess the impacts of land cover changes on the mangrove habitats. Time-series Landsat imagery was used to analyze Normalized Difference Vegetation Index (NDVI) patterns over the western parts of the Indian Sundarbans forest in order to assess the health of the mangroves in the region. The images were subjected to land use/land cover (LULC) classification using sub-pixel classification techniques in ERDAS Imagine software, and the changes were mapped. The spatial proliferation of aquaculture farms during the study period was also mapped. A multivariate regression analysis was carried out between the obtained NDVI values and the LULC classes. Similarly, the observed meteorological data sets (time-series rainfall and minimum and maximum temperature) were also statistically correlated through regression. The study demonstrated the application of NDVI in assessing the environmental status of mangroves, as the relationship between changes in the environmental variables and remote sensing based indices facilitates an efficient evaluation of environmental variables, which can be used in coastal zone monitoring and development processes.
Keywords: aquaculture farms, LULC, Mangrove, NDVI
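A minimal sketch of the NDVI computation underlying such a time-series analysis; the reflectance arrays and the 0.3 vegetation threshold are placeholders rather than values or bands from the study.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype("float64")
    red = red.astype("float64")
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# Placeholder reflectance rasters standing in for Landsat NIR and Red bands.
nir_band = np.random.rand(100, 100)
red_band = np.random.rand(100, 100)

index = ndvi(nir_band, red_band)
print("mean NDVI over the scene:", index.mean())

# A mangrove-health style summary: fraction of pixels above a vegetation threshold
# (the 0.3 cutoff is illustrative, not taken from the study).
print("vegetated fraction:", (index > 0.3).mean())
```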
Procedia PDF Downloads 182
2374 Design of Digital IIR Filter Using Opposition Learning and Artificial Bee Colony Algorithm
Authors: J. S. Dhillon, K. K. Dhaliwal
Abstract:
In almost all digital filtering applications, digital infinite impulse response (IIR) filters are preferred over finite impulse response (FIR) filters because they provide much better performance, lower computational cost and smaller memory requirements for similar magnitude specifications. However, digital IIR filters are generally multimodal with respect to the filter coefficients, and therefore reliable methods that can provide globally optimal solutions are required. The artificial bee colony (ABC) algorithm is one such recently introduced meta-heuristic optimization algorithm. However, in some cases it searches the solution space insufficiently, resulting in a weak exchange of information, and hence is not able to return better solutions. To overcome this deficiency, an opposition-based learning strategy is incorporated in ABC, and a modified version called the oppositional artificial bee colony (OABC) algorithm is proposed in this paper. Duplication of members is avoided during the run, which also augments the exploration ability. The developed algorithm is then applied to the design of optimal and stable digital IIR filter structures, where the design of low-pass (LP) and high-pass (HP) filters is carried out. Fuzzy theory is applied to maximize the satisfaction of the minimum-magnitude-error and stability constraints. To check the effectiveness of OABC, the results are compared with some well established filter design techniques, and it is observed that in most cases OABC returns better or at least comparable results.
Keywords: digital infinite impulse response filter, artificial bee colony optimization, opposition based learning, digital filter design, multi-parameter optimization
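A minimal sketch of the opposition-based initialization step that distinguishes OABC from plain ABC, paired with a generic low-pass IIR magnitude-error objective; the filter order, coefficient bounds and cutoff are placeholders, and stability handling is omitted.

```python
import numpy as np
from scipy.signal import freqz

def magnitude_error(coeffs, order=2, n_points=128, cutoff=0.25):
    """L2 error between the IIR magnitude response and an ideal low-pass target."""
    b = coeffs[: order + 1]
    a = np.concatenate(([1.0], coeffs[order + 1:]))
    w, h = freqz(b, a, worN=n_points)
    target = (w / np.pi <= cutoff).astype(float)  # ideal LP magnitude
    return np.sum((np.abs(h) - target) ** 2)

def opposition_init(pop_size, dim, low, high, objective, rng):
    """Opposition-based initialization: keep the fitter half of X and its opposites."""
    x = rng.uniform(low, high, size=(pop_size, dim))
    x_opp = low + high - x                       # opposite solutions
    both = np.vstack([x, x_opp])
    fitness = np.array([objective(c) for c in both])
    return both[np.argsort(fitness)[:pop_size]]  # best pop_size candidates

rng = np.random.default_rng(0)
dim = 5  # 3 numerator + 2 denominator coefficients for a 2nd-order filter
population = opposition_init(20, dim, -1.0, 1.0, magnitude_error, rng)
print("best initial magnitude error:", magnitude_error(population[0]))
```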
Procedia PDF Downloads 477
2373 Rehabilitation and Conservation of Mangrove Forest as Pertamina Corporate Social Responsibility Approach in Prevention Damage Climate in Indonesia
Authors: Nor Anisa
Abstract:
This paper aims to describe the use of conservation and rehabilitation of Mangrove forests as an alternative area in protecting the natural environment and ecosystems and ecology, community education and innovation of sustainable industrial development such as oil companies, gas and coal. The existence of globalization encourages energy needs such as gas, diesel and coal as an unaffected resource which is a basic need for human life while environmental degradation and natural phenomena continue to occur in Indonesia, especially global warming, sea water pollution, extinction of animal steps. The phenomenon or damage to nature in Indonesia is caused by a population explosion in Indonesia that causes unemployment, the land where the residence will disappear so that this will encourage the exploitation of nature and the environment. Therefore, Pertamina as a state-owned oil and gas company carries out its social responsibility efforts, namely to carry out conservation and rehabilitation and management of Mangrove fruit seeds which will provide an educational effect on the benefits of Mangrove seed maintenance. The method used in this study is a qualitative method and secondary data retrieval techniques where data is taken based on Pertamina activity journals and websites that can be accounted for. So the conclusion of this paper is: the benefits and function of conservation of mangrove forests in Indonesia physically, chemically, biologically and socially and economically and can provide innovation to the CSR (Corporate Social Responsibility) of the company in continuing social responsibility in the scope of environmental conservation and social education.Keywords: mangrove, environmental damage, conservation and rehabilitation, innovation of corporate social responsibility
Procedia PDF Downloads 135
2372 Generative Design of Acoustical Diffuser and Absorber Elements Using Large-Scale Additive Manufacturing
Authors: Saqib Aziz, Brad Alexander, Christoph Gengnagel, Stefan Weinzierl
Abstract:
This paper explores a generative design, simulation, and optimization workflow for the integration of acoustical diffuser and/or absorber geometry with embedded coupled Helmholtz-resonators for full-scale 3D printed building components. Large-scale additive manufacturing in conjunction with algorithmic CAD design tools enables a vast amount of control when creating geometry. This is advantageous regarding the increasing demands of comfort standards for indoor spaces and the use of more resourceful and sustainable construction methods and materials. The presented methodology highlights these new technological advancements and offers a multimodal and integrative design solution with the potential for an immediate application in the AEC-Industry. In principle, the methodology can be applied to a wide range of structural elements that can be manufactured by additive manufacturing processes. The current paper focuses on a case study of an application for a biaxial load-bearing beam grillage made of reinforced concrete, which allows for a variety of applications through the combination of additive prefabricated semi-finished parts and in-situ concrete supplementation. The semi-prefabricated parts or formwork bodies form the basic framework of the supporting structure and at the same time have acoustic absorption and diffusion properties that are precisely acoustically programmed for the space underneath the structure. To this end, a hybrid validation strategy is being explored using a digital and cross-platform simulation environment, verified with physical prototyping. The iterative workflow starts with the generation of a parametric design model for the acoustical geometry using the algorithmic visual scripting editor Grasshopper3D inside the building information modeling (BIM) software Revit. Various geometric attributes (i.e., bottleneck and cavity dimensions) of the resonator are parameterized and fed to a numerical optimization algorithm which can modify the geometry with the goal of increasing absorption at resonance and increasing the bandwidth of the effective absorption range. Using Rhino.Inside and LiveLink for Revit, the generative model was imported directly into the Multiphysics simulation environment COMSOL. The geometry was further modified and prepared for simulation in a semi-automated process. The incident and scattered pressure fields were simulated from which the surface normal absorption coefficients were calculated. This reciprocal process was repeated to further optimize the geometric parameters. Subsequently the numerical models were compared to a set of 3D concrete printed physical twin models, which were tested in a .25 m x .25 m impedance tube. The empirical results served to improve the starting parameter settings of the initial numerical model. The geometry resulting from the numerical optimization was finally returned to grasshopper for further implementation in an interdisciplinary study.Keywords: acoustical design, additive manufacturing, computational design, multimodal optimization
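A minimal sketch of the classical Helmholtz-resonator relation that such a parametric optimization drives, sweeping the neck (bottleneck) length against a target frequency; the dimensions, target frequency and end-correction factor are illustrative assumptions, not values from the paper, whose geometries are tuned numerically in COMSOL.

```python
import math

C_AIR = 343.0  # speed of sound in air at ~20 degC, m/s

def helmholtz_frequency(neck_radius, neck_length, cavity_volume):
    """Classical estimate f0 = c / (2*pi) * sqrt(A / (V * L_eff)).

    L_eff adds a standard end correction (~1.7 * radius for flanged openings);
    real 3D-printed geometries would be refined numerically, as in the paper.
    """
    area = math.pi * neck_radius ** 2
    l_eff = neck_length + 1.7 * neck_radius
    return C_AIR / (2 * math.pi) * math.sqrt(area / (cavity_volume * l_eff))

# Illustrative sweep: find a neck length that puts resonance near a 150 Hz target.
target = 150.0
cavity = 1.0e-3   # 1 litre cavity (hypothetical)
radius = 0.01     # 10 mm neck radius (hypothetical)
best_err, best_len = min(
    (abs(helmholtz_frequency(radius, L, cavity) - target), L)
    for L in [l / 1000 for l in range(5, 101)]  # 5-100 mm necks
)
print(f"neck length {best_len * 1000:.0f} mm gives f0 within {best_err:.1f} Hz of {target} Hz")
```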
Procedia PDF Downloads 159
2371 Synthesis of Zeolites from Bauxite and Kaolin: Effect of Synthesis Parameters on Competing Phases
Authors: Bright Kwakye-Awuah, Elizabeth Von-Kiti, Isaac Nkrumah, Baah Sefa-Ntiri, Craig D. Williams
Abstract:
Bauxite and kaolin from Ghana Bauxite Company mine site were used to synthesize zeolites. Bauxite served as the alumina source and kaolin the silica source. Synthesis variations include variation of aging time at constant crystallization time and variation of crystallization times at constant aging time. Characterization techniques such as X-ray diffraction (XRD), scanning electron microscopy (SEM), energy dispersive x-ray analysis (EDX) and Fourier transform infrared spectroscopy (FTIR) were employed in the characterization of the raw samples as well as the synthesized samples. The results obtained showed that the transformations that occurred and the phase of the resulting products were coordinated by the aging time, crystallization time, alkaline concentration and Si/Al ratio of the system. Zeolites A, X, Y, analcime, Sodalite, and ZK-14 were some of the phases achieved. Zeolite LTA was achieved with short crystallization times of 3, 5, 18 and 24 hours and a maximum aging of 24 hours. Zeolite LSX was synthesized with 24 hr aging followed with 24 hr hydrothermal treatment whilst zeolite Y crystallized after 48 hr of aging and 24 hr crystallization. Prolonged crystallization time produced a mixed phased product. Prolonged aging times, on the other hand, did not yield any zeolite as the sample was amorphous. Increasing the alkaline content of the reaction mixture above 5M introduced sodalite phase in the final product. The properties of the final products were comparable to zeolites synthesized from pure chemical reagents.Keywords: bauxite, kaolin, aging, crystallization, zeolites
Procedia PDF Downloads 220
2370 Revolutionizing Financial Forecasts: Enhancing Predictions with Graph Convolutional Networks (GCN) - Long Short-Term Memory (LSTM) Fusion
Authors: Ali Kazemi
Abstract:
In volatile and interconnected global financial markets, accurately predicting market trends holds substantial value for traders and financial institutions. Traditional machine learning techniques have made significant strides in forecasting market movements; however, the complex and networked nature of financial data calls for more sophisticated approaches. This study offers a groundbreaking method for financial market prediction that leverages the synergistic potential of Graph Convolutional Networks (GCNs) and Long Short-Term Memory (LSTM) networks. Our proposed algorithm is designed to forecast the trends of stock market indices and cryptocurrency prices, utilizing a comprehensive dataset spanning from January 1, 2015, to December 31, 2023. This era, marked by sizable volatility and transformation in financial markets, provides a solid basis for training and testing our predictive model. Our algorithm integrates diverse data to construct a dynamic financial graph that accurately reflects market intricacies. We collect daily opening, closing, high and low prices for key stock market indices (e.g., S&P 500, NASDAQ) and major cryptocurrencies (e.g., Bitcoin, Ethereum), ensuring a holistic view of market trends. Daily trading volumes are also incorporated to capture market activity and liquidity, providing critical insights into buying and selling dynamics. Furthermore, recognizing the profound influence of the macroeconomic environment on financial markets, we integrate key macroeconomic indicators such as interest rates, inflation rates, GDP growth, and unemployment rates into our model. The GCN component learns the relational patterns among the financial instruments represented as nodes in a comprehensive market graph. Edges in this graph encapsulate relationships based on co-movement patterns and sentiment correlations, enabling the model to grasp the complex network of influences governing market movements. Complementing this, the LSTM component is trained on sequences of the spatio-temporal representation learned by the GCN, enriched with historical price and volume data, allowing it to capture and predict temporal market trends accurately. In a comprehensive evaluation of our GCN-LSTM algorithm across the stock market and cryptocurrency datasets, the model demonstrated superior predictive accuracy and profitability compared to conventional and alternative machine learning benchmarks. Specifically, the model achieved a Mean Absolute Error (MAE) of 0.85%, indicating high precision in predicting daily price movements. The RMSE was recorded at 1.2%, underscoring the model's effectiveness in minimizing large prediction errors, which is vital in volatile markets. Furthermore, when assessing the model's performance on directional market movements, it achieved an accuracy rate of 78%, significantly outperforming the benchmark models, which averaged an accuracy of 65%. This degree of accuracy is instrumental for strategies that predict the direction of price movements. This study showcases the efficacy of combining graph-based and sequential deep learning in financial market prediction and highlights the value of a comprehensive, data-driven evaluation framework.
Our findings promise to revolutionize investment strategies and risk management practices, offering investors and financial analysts a powerful tool to navigate the complexities of modern financial markets.
Keywords: financial market prediction, graph convolutional networks (GCNs), long short-term memory (LSTM), cryptocurrency forecasting
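A minimal PyTorch sketch of the GCN-plus-LSTM pattern described above: a single graph-convolution layer applied to the asset graph at each time step, followed by an LSTM over the resulting sequence and a linear head predicting one value per asset. The graph, feature dimensions and data are random placeholders, not the study's dataset or architecture details.

```python
import torch
import torch.nn as nn

class GCNLSTM(nn.Module):
    def __init__(self, n_nodes, in_feats, gcn_feats, lstm_hidden):
        super().__init__()
        self.gcn_weight = nn.Linear(in_feats, gcn_feats, bias=False)
        self.lstm = nn.LSTM(n_nodes * gcn_feats, lstm_hidden, batch_first=True)
        self.head = nn.Linear(lstm_hidden, n_nodes)  # next-step value per asset

    def forward(self, x, adj):
        # x: (batch, time, nodes, in_feats); adj: (nodes, nodes) with self-loops
        deg = adj.sum(-1)
        d_inv_sqrt = torch.diag(deg.clamp(min=1e-6).pow(-0.5))
        a_hat = d_inv_sqrt @ adj @ d_inv_sqrt       # symmetric normalization
        h = torch.relu(a_hat @ self.gcn_weight(x))  # graph convolution per step
        b, t = h.shape[0], h.shape[1]
        out, _ = self.lstm(h.reshape(b, t, -1))     # temporal modelling
        return self.head(out[:, -1])                # prediction from last step

# Random placeholder data: 8 assets, 30 time steps, 4 features (OHLC-style).
nodes, steps, feats = 8, 30, 4
adj = (torch.rand(nodes, nodes) > 0.7).float()
adj = ((adj + adj.T + torch.eye(nodes)) > 0).float()  # symmetric with self-loops
x = torch.randn(2, steps, nodes, feats)               # batch of 2 sequences

model = GCNLSTM(nodes, feats, gcn_feats=16, lstm_hidden=32)
print(model(x, adj).shape)  # torch.Size([2, 8])
```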
Procedia PDF Downloads 66
2369 Methane Oxidation to Methanol Catalyzed by Copper Oxide Clusters Supported in MIL-53(Al): A Density Functional Theory Study
Authors: Chun-Wei Yeh, Santhanamoorthi Nachimuthu, Jyh-Chiang Jiang
Abstract:
Reducing greenhouse gases or converting them into fuels and chemicals with added value is vital for the environment. Given the enhanced techniques for hydrocarbon extraction, the catalytic conversion of methane to methanol is particularly intriguing for future applications as vehicle fuel and/or bulk chemicals. Metal-organic frameworks (MOFs) have received much attention recently for the oxidation of methane to methanol. In addition, the enzyme particulate methane monooxygenase (pMMO) has been reported to convert methane using copper oxide clusters as active sites. Inspired by these, in this study we considered the well-known MIL-53(Al) MOF as a support for copper oxide clusters (Cu2Ox, Cu3Ox) and investigated their reactivity towards methane oxidation using density functional theory (DFT) calculations. The copper oxide clusters (Cu2O2, Cu3O2) are modeled by oxidizing copper clusters (Cu2, Cu3) with two oxidizers, O2 and N2O. The initial C-H bond activation barriers on the Cu2O2/MIL-53(Al) and Cu3O2/MIL-53(Al) catalysts are 0.70 eV and 0.64 eV, respectively, and this step is rate-determining in the overall conversion of methane to methanol. The desorption energy of methanol over Cu2O/MIL-53(Al) and Cu3O/MIL-53(Al) is 0.71 eV and 0.75 eV, respectively. Furthermore, to explore the prospect of catalyst reusability, we considered different oxidants and proposed different reaction pathways for completing the catalytic cycle and regenerating the active copper oxide clusters. To understand the difference between the bi-copper and tri-copper systems, we also performed an electronic structure analysis. Finally, we carried out microkinetic simulations; the results show that the reaction can occur at room temperature.
Keywords: DFT study, copper oxide cluster, MOFs, methane conversion
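A minimal sketch relating the reported C-H activation barriers (0.70 eV and 0.64 eV) to transition-state-theory rate constants at room temperature, the kind of input a microkinetic simulation consumes; treating the electronic barriers as free-energy barriers is a simplifying assumption made here, not a statement of the paper's full microkinetic model.

```python
import math

K_B_EV = 8.617333e-5   # Boltzmann constant, eV/K
K_B_J = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607e-34        # Planck constant, J*s

def eyring_rate(barrier_ev, temperature=298.15):
    """Transition-state-theory estimate k = (kB*T/h) * exp(-Ea / (kB*T)).

    Assumes the DFT barrier approximates the activation free energy.
    """
    prefactor = K_B_J * temperature / H
    return prefactor * math.exp(-barrier_ev / (K_B_EV * temperature))

for label, ea in [("Cu2O2/MIL-53(Al)", 0.70), ("Cu3O2/MIL-53(Al)", 0.64)]:
    print(f"{label}: Ea = {ea:.2f} eV -> k ~ {eyring_rate(ea):.2e} s^-1 at 298 K")
```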
Procedia PDF Downloads 79
2368 Imputing the Minimum Social Value of Public Healthcare: A General Equilibrium Model of Israel
Authors: Erez Yerushalmi, Sani Ziv
Abstract:
The rising demand for healthcare services, without a corresponding rise in public supply, led to a debate on whether to increase private healthcare provision - especially in hospital services and second-tier healthcare. Proponents for increasing private healthcare highlight gains in efficiency, while opponents its risk to social welfare. None, however, provide a measure of the social value and its impact on the economy in terms of a monetary value. In this paper, we impute a minimum social value of public healthcare that corresponds to indifference between gains in efficiency, with losses to social welfare. Our approach resembles contingent valuation methods that introduce a hypothetical market for non-commodities, but is different from them because we use numerical simulation techniques to exploit certain market failure conditions. In this paper, we develop a general equilibrium model that distinguishes between public-private healthcare services and public-private financing. Furthermore, the social value is modelled as a by product of healthcare services. The model is then calibrated to our unique health focused Social Accounting Matrix of Israel, and simulates the introduction of a hypothetical health-labour market - given that it is heavily regulated in the baseline (i.e., the true situation in Israel today). For baseline parameters, we estimate the minimum social value at around 18% public healthcare financing. The intuition is that the gain in economic welfare from improved efficiency, is offset by the loss in social welfare due to a reduction in available social value. We furthermore simulate a deregulated healthcare scenario that internalizes the imputed value of social value and searches for the optimal weight of public and private healthcare provision.Keywords: contingent valuation method (CVM), general equilibrium model, hypothetical market, private-public healthcare, social value of public healthcare
Procedia PDF Downloads 146
2367 Ectopic Osteoinduction of Porous Composite Scaffolds Reinforced with Graphene Oxide and Hydroxyapatite Gradient Density
Authors: G. M. Vlasceanu, H. Iovu, E. Vasile, M. Ionita
Abstract:
Herein, the synthesis and characterization of chitosan-gelatin highly porous scaffold reinforced with graphene oxide, and hydroxyapatite (HAp), crosslinked with genipin was targeted. In tissue engineering, chitosan and gelatin are two of the most robust biopolymers with wide applicability due to intrinsic biocompatibility, biodegradability, low antigenicity properties, affordability, and ease of processing. HAp, per its exceptional activity in tuning cell-matrix interactions, is acknowledged for its capability of sustaining cellular proliferation by promoting bone-like native micro-media for cell adjustment. Genipin is regarded as a top class cross-linker, while graphene oxide (GO) is viewed as one of the most performant and versatile fillers. The composites with natural bone HAp/biopolymer ratio were obtained by cascading sonochemical treatments, followed by uncomplicated casting methods and by freeze-drying. Their structure was characterized by Fourier Transform Infrared Spectroscopy and X-ray Diffraction, while overall morphology was investigated by Scanning Electron Microscopy (SEM) and micro-Computer Tomography (µ-CT). Ensuing that, in vitro enzyme degradation was performed to detect the most promising compositions for the development of in vivo assays. Suitable GO dispersion was ascertained within the biopolymer mix as nanolayers specific signals lack in both FTIR and XRD spectra, and the specific spectral features of the polymers persisted with GO load enhancement. Overall, correlations between the GO induced material structuration, crystallinity variations, and chemical interaction of the compounds can be correlated with the physical features and bioactivity of each composite formulation. Moreover, the HAp distribution within follows an auspicious density gradient tuned for hybrid osseous/cartilage matter architectures, which were mirrored in the mice model tests. Hence, the synthesis route of a natural polymer blend/hydroxyapatite-graphene oxide composite material is anticipated to emerge as influential formulation in bone tissue engineering. Acknowledgement: This work was supported by the project 'Work-based learning systems using entrepreneurship grants for doctoral and post-doctoral students' (Sisteme de invatare bazate pe munca prin burse antreprenor pentru doctoranzi si postdoctoranzi) - SIMBA, SMIS code 124705 and by a grant of the National Authority for Scientific Research and Innovation, Operational Program Competitiveness Axis 1 - Section E, Program co-financed from European Regional Development Fund 'Investments for your future' under the project number 154/25.11.2016, P_37_221/2015. The nano-CT experiments were possible due to European Regional Development Fund through Competitiveness Operational Program 2014-2020, Priority axis 1, ID P_36_611, MySMIS code 107066, INOVABIOMED.Keywords: biopolymer blend, ectopic osteoinduction, graphene oxide composite, hydroxyapatite
Procedia PDF Downloads 104
2366 Modified Acetamidobenzoxazolone Based Biomarker for Translocator Protein Mapping during Neuroinflammation
Authors: Anjani Kumar Tiwari, Neelam Kumari, Anil Mishra
Abstract:
The 18-kDa translocator protein (TSPO), previously called the peripheral benzodiazepine receptor, is a proven biomarker for a variety of neuroinflammatory conditions. TSPO is a tryptophan-rich, five-transmembrane protein found on the outer mitochondrial membrane of steroid-synthesising and immunomodulatory cells. In the case of neuronal damage or inflammation, the expression level of TSPO gets upregulated as an immunomodulatory response. Using benzoxazolone as a basic scaffold, a series of TSPO ligands was designed, followed by screening through in silico studies. Synthesis was planned using a convergent methodology in six high-yielding steps. An in vitro assay was performed for the synthesized ligands to determine their binding affinity in terms of Ki. Autoradiography studies were also carried out on ischemic rat brain to check the specificity and affinity of the designed radiolabelled ligand for TSPO. Screening was performed on the basis of the GScore from the CADD-based Schrodinger software. All the modified, more promising candidate compounds were successfully synthesized and characterized by spectroscopic techniques (FTIR, NMR and HRMS). The in vitro binding assay showed a best binding affinity of Ki = 6.1 ± 0.3 for TSPO over the central benzodiazepine receptor (CBR, Ki > 200). ARG studies indicated higher uptake of two analogues on the lesion side compared with that on the non-lesion side of ischemic rat brains. Displacement experiments with the unlabelled ligand minimized the difference in uptake between the two sides, which indicates the specificity of the ligand towards the TSPO receptor.
Keywords: TSPO, PET, imaging, Acetamidobenzoxazolone
Procedia PDF Downloads 143
2365 Power-Sharing Politics: A Panacea to Conflict Resolution and Stability in Africa
Authors: Emmanuel Dangana Monday
Abstract:
Africa as a continent has been ravaged and bedeviled by series of political conflicts associated with politics and power-sharing maneuvering. As a result it has become the most unstable continent in the world in terms of power distribution and stable political culture. This paper examines the efficacy of conscious and deliberate power-sharing strategies to settle or resolve political conflicts in Africa in the arrangements of creation of states, revenue and resources allocation, and office distribution systems. The study is concerned with the spatial impact of conflicts generated in some renowned African countries in which power-sharing would have been a solution. Ethno-regional elite groups are identified as the major actors in the struggles for the distribution of territorial, economic and political powers in Africa. The struggle for power has become so intense that it has degenerated to conflicts and wars of inter and intra-political classes and parties respectively. Secondary data and deductive techniques were used in data collection and analysis. It is discovered that power-sharing has become an indispensable tool to curb the incessant political and power crisis in Africa. It is one of the finest tolerable modality of mediating elite’ competition, since it reflects the interests of both the dominant and the perceived marginalized groups. The study recommends that countries and regions of political, ethnic and religious differences in Africa should employed power-sharing strategy in order to avoid unnecessary political tension and resultant crisis. Interest groups should always come to the negotiation table to reach a realistic, durable and expected compromise to secure a peacefully resolute Africa.Keywords: Africa, power-sharing, conflicts, politics and political stability
Procedia PDF Downloads 325
2364 Estimation of the Seismic Response Modification Coefficient in the Superframe Structural System
Authors: Ali Reza Ghanbarnezhad Ghazvini, Seyyed Hamid Reza Mosayyebi
Abstract:
In recent years, an earthquake has occurred approximately every five years in certain regions of Iran. To mitigate the impact of these seismic events, it is crucial to identify and thoroughly assess the vulnerability of buildings and infrastructure, ensuring their safety through principled reinforcement. By adopting new methods of risk assessment, we can effectively reduce the potential risks associated with future earthquakes. In our research, we have observed that the coefficient of behavior in the fourth chapter is 1.65 for the initial structure and 1.72 for the Superframe structure. This indicates that the Superframe structure can enhance the strength of the main structural members by approximately 10% through the utilization of super beams. Furthermore, based on the comparative analysis between the two structures conducted in this study, we have successfully designed a stronger structure with minimal changes in the coefficient of behavior. Additionally, this design has allowed for greater energy dissipation during seismic events, further enhancing the structure's resilience to earthquakes. By comprehensively examining and reinforcing the vulnerability of buildings and infrastructure, along with implementing advanced risk assessment techniques, we can significantly reduce casualties and damages caused by earthquakes in Iran. The findings of this study offer valuable insights for civil engineering professionals in the field of structural engineering, aiding them in designing safer and more resilient structures.Keywords: modal pushover analysis, response modification factor, high-strength concrete, concrete shear walls, high-rise building
Procedia PDF Downloads 142
2363 Identification of Membrane Foulants in Direct Contact Membrane Distillation for the Treatment of Reject Brine
Authors: Shefaa Mansour, Hassan Arafat, Shadi Hasan
Abstract:
Management of reverse osmosis (RO) brine has become a major area of research due to the environmental concerns associated with it. This study worked on studying the feasibility of the direct contact membrane distillation (DCMD) system in the treatment of this RO brine. The system displayed great potential in terms of its flux and salt rejection, where different operating conditions such as the feed temperature, feed salinity, feed and permeate flow rates were varied. The highest flux of 16.7 LMH was reported with a salt rejection of 99.5%. Although the DCMD has displayed potential of enhanced water recovery from highly saline solutions, one of the major drawbacks associated with the operation is the fouling of the membranes which impairs the system performance. An operational run of 77 hours for the treatment of RO brine of 56,500 ppm salinity was performed in order to investigate the impact of fouling of the membrane on the overall operation of the system over long time operations. Over this time period, the flux was observed to have reduced by four times its initial flux. The fouled membrane was characterized through different techniques for the identification of the organic and inorganic foulants that have deposited on the membrane surface. The Infrared Spectroscopy method (IR) was used to identify the organic foulants where SEM images displayed the surface characteristics of the membrane. As for the inorganic foulants, they were identified using X-ray Diffraction (XRD), Ion Chromatography (IC) and Energy Dispersive Spectroscopy (EDS). The major foulants found on the surface of the membrane were inorganic salts such as sodium chloride and calcium sulfate.Keywords: brine treatment, membrane distillation, fouling, characterization
Procedia PDF Downloads 436
2362 Indigenous Knowledge and Archaeological Heritage Resources in Lawra, Upper West Region, Ghana
Authors: Christiana Wulty Diku
Abstract:
This research mapped and documented archaeological heritage resources and the associated indigenous knowledge in Lawra, an understudied municipality in the Upper West Region of Ghana. Since the inception of archaeology as a discipline in the 1930s at the University of Ghana, the Lawra Municipality has rarely been investigated archaeologically. Consequently, indigenes' lack of awareness of the relevance of these resources to national development has destroyed many significant archaeological sites, with agricultural and infrastructural development activities endangering countless others. Drawing on a community archaeology approach, a collaborative archaeological investigation between local groups, communities and professionals (archaeologists) was conducted to recover the lost histories of settlements in the municipality and to salvage and protect endangered archaeological heritage resources and sites from agricultural, exploitative and developmental activities. This was geared towards expanding the limited research on northern Ghana and deepening our understanding of the symbiotic relationship between people and their heritage resources in past and present times. The study deployed ethnographic, archaeological and physical survey techniques over six field seasons, from August 2013 to April 2023. This resulted in the reconstruction of the settlement history of Lawra with chronological dates, the compilation of an inventory of significant archaeological heritage resources with associated indigenous knowledge, the mitigation of endangered archaeological sites and heritage resources through surface collections, and the development of a photographic record with associated metadata for purposes of preservation and future research.
Keywords: archaeological heritage resources, indigenous knowledge, lawra municipality, community archaeology
Procedia PDF Downloads 69
2361 Total Longitudinal Displacement (tLoD) of the Common Carotid Artery (CCA) Does Not Differ between Patients with Moderate or High Cardiovascular Risk (CV) and Patients after Acute Myocardial Infarction (AMI)
Authors: P. Serpytis, K. Azukaitis, U. Gargalskaite, R. Navickas, J. Badariene, V. Dzenkeviciute
Abstract:
Purpose: Total longitudinal displacement (tLoD) of the common carotid artery (CCA) wall is a novel ultrasound marker of vascular function that can be evaluated using modified speckle tracking techniques. Decreased CCA tLoD has already been shown to be associated with diabetes and was shown to predict one year cardiovascular outcome in patients with suspected coronary artery disease (CAD) . The aim of our study was to evaluate if CCA tLoD differ between patients with moderate or high cardiovascular (CV) risk and patients after recent acute myocardial infarction (AMI). Methods: 49 patients (54±6 years) with moderate or high CV risk and 42 patients (58±7 years) after recent AMI were included. All patients were non-diabetic. CCA tLoD was evaluated using GE EchoPAC speckle tracking software and expressed as mean of both sides. Data on systolic blood pressure, total and high density lipoprotein (HDL) cholesterol levels, high sensitivity C-reactive protein (hsCRP) level, smoking status and family history of early CV events was evaluated and assessed for association with CCA tLoD. Results: tLoD of CCA did not differ between patients with moderate or high CV risk and patients with very high CV risk after MI (0.265±0.128 mm vs. 0.237±0.103 mm, p>0.05). Lower tLoD was associated with lower HDL cholesterol levels (r=0.211, p=0.04) and male sex (0.228±0.1 vs. 0.297±0.134, p=0.01). Conclusions: tLoD of CCA did not differ between patients with moderate or high CV risk and patients with very high CV risk after AMI. However, lower CCA tLoD was significantly associated with low HDL cholesterol levels and male sex.Keywords: total longitudinal displacement, carotid artery, cardiovascular risk, acute myocardial infarction
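A minimal sketch reproducing the kind of statistics quoted above from the reported summary values: an independent-samples t-test on the two group means, and a Pearson correlation on simulated data standing in for the tLoD-HDL relationship (patient-level data are not available here, and the simulated values are placeholders).

```python
import numpy as np
from scipy import stats

# Group comparison from the summary statistics reported in the abstract.
t, p = stats.ttest_ind_from_stats(mean1=0.265, std1=0.128, nobs1=49,
                                  mean2=0.237, std2=0.103, nobs2=42)
print(f"group comparison: t = {t:.2f}, p = {p:.3f}")  # expect p > 0.05

# Illustrative correlation of tLoD with HDL on simulated patient-level data
# (the abstract reports r = 0.211, p = 0.04 on the real cohort).
rng = np.random.default_rng(1)
hdl = rng.normal(1.3, 0.3, 91)
tlod = 0.1 * hdl + rng.normal(0.25, 0.11, 91)
r, p_r = stats.pearsonr(hdl, tlod)
print(f"simulated Pearson r = {r:.2f}, p = {p_r:.3f}")
```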
Procedia PDF Downloads 384
2360 An Investigation of Customer Relationship Management of Tourism
Authors: Wanida Suwunniponth
Abstract:
This research paper aimed to develop a causal relationship model of the success factors of customer relationship management (CRM) in tourism in Thailand and to investigate the relationships among the potential factors that facilitate the success of CRM. The research was conducted using both quantitative and qualitative methods, utilizing a questionnaire and in-depth interviews. The questionnaire was used to collect data from 250 management staff in hotels located within the Bangkok area. Sampling techniques used in this research included cluster sampling according to service quality and simple random sampling. The data were analyzed by means of descriptive analysis and a structural equation model (SEM). The research findings demonstrated important factors accentuated by most respondents towards the success of CRM, which were organization, people, information technology and the CRM process. Moreover, customer relationship management in the tourism business in Thailand was found to be successful at a very significant level. The hypothesis testing showed that the hypotheses were accepted, as the factors concerning organization, people and information technology had an influence on the process and the success of customer relationship management, whereas the CRM process factor influenced its success. The findings suggested that tourism businesses in Thailand implementing customer relationship management should adopt an improvement approach in terms of managerial structure and corporate culture building with a customer-centred emphasis, together with investment in information technology and customer analysis, in order to enable higher efficiency of the customer relationship management process, which would result in customer satisfaction and service retention.
Keywords: customer relationship management, casual relationship model, tourism, Thailand
Procedia PDF Downloads 330
2359 Study on the Effect of Weather Variables on the Spider Abundance in Two Ecological Zones of Ogun State, Nigeria
Authors: Odejayi Adedayo Olugbenga, Aina Adebisi
Abstract:
Weather variables (rainfall and temperature) affect the diversity and abundance of both fauna and flora species. This study compared the weather variables with spider abundance in two ecological zones of Ogun State, Nigeria namely Ago-iwoye (Rainforest) in the Ijebu axis and Aiyetoro (Derived Savannah) in the Yewa axis. Seven study sites chosen by Simple Random Sampling in each ecosystem were used for the study. In each sampling area, a 60 m x 120 m land area was marked and sampled, spider collection techniques were; hand picking, use of sweep netting, and Pitfall trap. Adult spiders were identified to the species level. Species richness was estimated by a non-parametric species estimator while the diversity of spider species was assessed by Simpson Diversity Index and Species Richness by One-way Analysis of Variance. Results revealed that spiders were more abundant in rainforest zones than in derived savannah ecosystems. However, the pattern of spider abundance in rainforest zone and residential areas were similar. During high temperatures, the activities of spiders tended to increase according to this study. In contrast, results showed that there was a negative correlation between rainfall and spider species abundance in addition to a negative and weak correlation between rainfall and species richness. It was concluded that heavy downpour has lethal effects on both immature and sometimes matured spiders, which could lead to the extinction of some unknown species of spiders. Tree planting should be encouraged, as this shelters the spider.Keywords: spider, abundance, species richness, species diversity
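A minimal sketch of the Simpson diversity calculation and a rainfall correlation of the kind used in the analysis; the monthly spider counts and rainfall values below are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy import stats

def simpson_diversity(counts):
    """Simpson's diversity index 1 - sum(p_i^2) for a vector of species counts."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    return 1.0 - np.sum(p ** 2)

# Hypothetical monthly spider counts per species at one site, and monthly rainfall (mm).
monthly_counts = [
    [12, 5, 3, 1], [10, 6, 2, 2], [8, 4, 4, 1],
    [6, 3, 1, 0], [5, 2, 1, 1], [9, 5, 3, 2],
]
rainfall = [30, 55, 90, 160, 210, 120]

diversity = [simpson_diversity(c) for c in monthly_counts]
abundance = [sum(c) for c in monthly_counts]

r_div, p_div = stats.pearsonr(rainfall, diversity)
r_abn, p_abn = stats.pearsonr(rainfall, abundance)
print(f"rainfall vs diversity: r = {r_div:.2f} (p = {p_div:.2f})")
print(f"rainfall vs abundance: r = {r_abn:.2f} (p = {p_abn:.2f})")
```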
Procedia PDF Downloads 92
2358 Exclusive Value Adding by iCenter Analytics on Transient Condition
Authors: Zhu Weimin, Allegorico Carmine, Ruggiero Gionata
Abstract:
Decades of Baker Hughes (BH) iCenter experience demonstrate that, in addition to conventional insights on steady equipment operating conditions, insights on transient conditions can add significant and exclusive value for anomaly detection, downtime saving, and predictive maintenance. Our work draws on examples from the BH iCenter experience to introduce the advantages and features of transient condition analytics: (i) operation under critical engine conditions, e.g., high levels or high rates of change of temperature, pressure, flow, vibration, etc., that would not be reached in normal operation; (ii) management of dedicated sub-systems or components, many of which are bottlenecks for reliability and maintenance; (iii) indirect detection of anomalies in the absence of instrumentation; (iv) repetitive sequences: if data is properly processed, the engineering features of transients provide not only anomaly detection but also problem characterization and prognostic indicators for predictive maintenance; (v) engine variables accounting for fatigue analysis. iCenter has been developing and deploying a series of analytics based on transient conditions, which contribute exclusive added value in the following areas: (i) reliability improvement, (ii) startup reliability improvement, (iii) predictive maintenance, and (iv) repair/overhaul cost reduction. Illustrative examples for each of these areas are presented in our study, focusing on the challenges and the adopted techniques, which range from purely statistical approaches to the implementation of machine learning algorithms. The results demonstrate how value is obtained from transient condition analytics in the BH iCenter experience.Keywords: analytics, diagnostics, monitoring, turbomachinery
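As a purely statistical illustration of the kind of transient-condition analytic described above, the Python sketch below segments a startup transient from a shaft-speed signal, extracts a few engineering features, and flags outliers with a simple z-score rule. The signal names, thresholds, sampling assumptions, and the z-score criterion are all assumptions for illustration; this is not the iCenter implementation.

import numpy as np
import pandas as pd

def extract_startup_features(rec, speed_col="speed_rpm", temp_col="exhaust_temp_c",
                             idle_rpm=500, rated_rpm=9000):
    # One DataFrame per startup, assumed sampled at 1 Hz with an integer index.
    start = rec.index[rec[speed_col] > idle_rpm][0]
    end = rec.index[rec[speed_col] >= 0.95 * rated_rpm][0]
    window = rec.loc[start:end]
    return {
        "ramp_time_s": end - start,                             # time to reach 95% rated speed
        "peak_temp_c": window[temp_col].max(),                  # peak exhaust temperature
        "max_temp_rate": window[temp_col].diff().abs().max(),   # fastest temperature change
    }

def flag_anomalous_startups(feature_table, z_threshold=3.0):
    # Flag startups whose features deviate strongly from the unit's own baseline.
    z = (feature_table - feature_table.mean()) / feature_table.std()
    return z.abs().max(axis=1) > z_threshold

# Usage sketch: build one feature row per recorded startup, then screen for outliers.
# features = pd.DataFrame([extract_startup_features(rec) for rec in startup_records])
# anomalies = flag_anomalous_startups(features)

In practice, the same transient features could also feed a machine learning model (for example, an isolation forest or a regression-based baseline), consistent with the range of techniques the abstract describes.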
Procedia PDF Downloads 742357 Development of a Context Specific Planning Model for Achieving a Sustainable Urban City
Authors: Jothilakshmy Nagammal
Abstract:
This research paper examines different case studies where Form-Based Codes have been adopted, and discusses their implementation methods in particular, in order to develop a method for formulating a new planning model. The organizing principle of Form-Based Codes, the transect, is used to zone the city into various context-specific transects. An approach is adopted to develop the new planning model, the Context Specific Planning Model (CSPM), as a tool to achieve sustainability for any city in general. A case study comparison of the planning tools used, the code processes adopted, and the control regulations implemented in thirty-two different cities is carried out. The analysis shows that there are a variety of ways to implement form-based zoning concepts: specific plans, a parallel or optional form-based code, a transect-based code/smart code, required form-based standards, or design guidelines. The case studies describe the positive and negative results of form-based zoning where it has been implemented. From the different case studies on the Form-Based Code method, it is understood that the scale at which a Form-Based Code is formulated varies from parts of the city to the whole city. In most cases, the regulating plan is prepared with the transect as the organizing principle. The implementation methods adopted in these case studies for formulating Form-Based Codes are special districts such as Transit Oriented Development (TOD) and Traditional Neighbourhood Development (TND), specific plans, and street-based codes. The implementation methods vary among mandatory, integrated, and floating. To attain sustainability, the research develops a regulating plan that uses the transect as the organizing principle for the entire city in general and formulates street-based Form-Based Codes for the selected special districts in the study area in particular. Planning is most powerful when it is embedded in the broader context of systemic change and improvement. Systemic is best thought of as holistic, contextualized, and stakeholder-owned, while systematic can be thought of as more linear, generalisable, and typically top-down or expert-driven. The systemic approach is a process based on system theory and system design principles, which are too often ill understood by the general population and policy makers. System theory embraces the importance of a global perspective, multiple components, interdependencies, and interconnections in any system. In addition, the recognition that a change in one part of a system necessarily alters the rest of the system is a cornerstone of system theory. The proposed regulating plan, which takes the transect as its organizing principle and uses Form-Based Codes to achieve sustainability of the city, has to be a hybrid code integrated within the existing system: a systemic approach with a systematic process. This approach of introducing a few form-based zones into a conventional code could be effective in the phased replacement of an existing code. It could also be an effective way of responding to the near-term pressure of physical change in "sensitive" areas of the community. How the new Context Specific Planning Model is created with this approach and method to achieve sustainability is explained in detail in this research paper.Keywords: context based planning model, form based code, transect, systemic approach
Procedia PDF Downloads 3382356 A Comparative Evaluation of Stone Spout Management Systems in Heritage and Non-heritage Areas of the Kathmandu Valley, Nepal
Authors: Mira Tripathi, Ken Hughey, Hamish G. Rennie
Abstract:
Management of water resources is a major challenge throughout the world, and in many long-established societies people still use traditional water harvesting and management techniques. Despite often being seen as efficient and cost-effective, traditional methods are in decline or have been abandoned in many countries. Nevertheless, traditional approaches continue to be useful in some countries, such as Nepal. The extent to which such traditional measures, in this case stone spouts, may survive modernization while fulfilling socio-cultural, tourism, and other needs is the focus of the research. The research develops an understanding of the socio-cultural, tourism, and other values of stone spouts for the people of urban and peri-urban heritage and non-heritage areas of the Kathmandu Valley, to support the ongoing sustainable management of the remaining spouts. Three research questions are addressed: the impacts of changes in social and cultural norms and values; development activities; and the incremental and ongoing loss of traditional stone spout infrastructure. A meta-theory framework has been developed that synthesizes Institutional, Attachment, Central Place, and Common Property theories, which form the analytical lenses for the mixed-method research approach. From the exploration of the meta-theory approach, it was found that no spouts are in pristine condition, but those in non-heritage areas are in better condition than those in heritage areas. "Utility value" is the main driver that still motivates people to conserve spouts.Keywords: stone spouts, social and cultural norms and values, meta-theory, Kathmandu Valley
Procedia PDF Downloads 311