Search results for: long term peak demand forecasting
523 Multi-Plane Wrist Movement: Pathomechanics and Design of a 3D-Printed Splint
Authors: Sigal Portnoy, Yael Kaufman-Cohen, Yafa Levanon
Abstract:
Introduction: Rehabilitation following wrist fractures often includes exercising flexion-extension movements with a dynamic splint. However, during daily activities, we combine most of our wrist movements with radial and ulnar deviations. Also, the multi-plane wrist motion, named the ‘dart throw motion’ (DTM), was found to be a more stable motion in healthy individuals, in terms of the motion of the proximal carpal bones, compared with sagittal wrist motion. The aim of this study was therefore to explore the pathomechanics of the wrist in a common multi-plane movement pattern (DTM) and design a novel splint for rehabilitation following distal radius fractures. Methods: First, a multi-axis electro-goniometer was used to quantify the plane angle of motion of the dominant and non-dominant wrists during various activities, e.g. drinking from a glass of water and answering a phone, in 43 healthy individuals. The following protocols were then implemented with a population following distal radius fracture. Two dynamic scans were performed, one of the sagittal wrist motion and one of the DTM, in a 3T magnetic resonance imaging (MRI) device, bilaterally. The scaphoid and lunate carpal bones, as well as the surface of the distal radius, were manually segmented in SolidWorks and the angles of motion of the scaphoid and lunate bones were calculated. Subsequently, a patient-specific splint was designed using 3D scans of the hand. The brace design comprises a proximal attachment to the arm and a distal envelope of the palm. An axle with two wheels is attached to the proximal part. Two wires attach the proximal part to the medial-palmar and lateral-ventral aspects of the distal part: when the wrist extends, the first wire is released and the second wire is strained towards the radius. The opposite occurs when the wrist flexes. The splint was attached to the wrist using Velcro and constrained the wrist movement to the desired calculated multi-plane of motion. Results: No significant differences were found between the multi-plane angles of the dominant and non-dominant wrists. The most common daily activities occurred at a plane angle of approximately 20° to 45° from the sagittal plane, and the MRI studies show individual angles of the plane of motion. The printed splint fitted the wrists of the subjects and constricted movement to the desired multi-plane of motion. Hooks were inserted on each part to allow the addition of springs or rubber bands for resistance training towards muscle strengthening in the rehabilitation setting. Conclusions: It has been hypothesized that activation of the wrist in a multi-plane movement pattern following distal radius fractures will accelerate the recovery of the patient. Our results show that this motion can be determined from either the dominant or non-dominant wrist. The design of the patient-specific dynamic splint is the first step towards assessing whether splinting to induce combined movement is beneficial to the rehabilitation process, compared to conventional treatment. The evaluation of the clinical benefits of this method, compared to conventional rehabilitation methods following wrist fracture, is part of a PhD project currently conducted by an occupational therapist.
Keywords: distal radius fracture, rehabilitation, dynamic magnetic resonance imaging, dart throw motion
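As an illustration of how a plane-of-motion angle of the kind quantified above can be derived from two-axis goniometer data, the following Python sketch fits the principal direction of a flexion-extension / radial-ulnar deviation trace and reports its angle from the sagittal plane. It is a minimal sketch with hypothetical variable names and simulated data, not the study's processing pipeline.

```python
import numpy as np

def dtm_plane_angle(flex_ext_deg, rad_ulnar_deg):
    """Estimate the plane-of-motion angle (degrees from the sagittal plane)
    from paired flexion-extension and radial-ulnar deviation samples, using
    the principal axis of the 2-D angle trace."""
    fe = np.asarray(flex_ext_deg) - np.mean(flex_ext_deg)
    ru = np.asarray(rad_ulnar_deg) - np.mean(rad_ulnar_deg)
    cov = np.cov(np.vstack([fe, ru]))            # 2x2 covariance of the trace
    eigvals, eigvecs = np.linalg.eigh(cov)
    principal = eigvecs[:, np.argmax(eigvals)]   # dominant direction of motion
    return np.degrees(np.arctan2(abs(principal[1]), abs(principal[0])))

# Example: a simulated dart-throw-like movement oblique to the sagittal plane
t = np.linspace(0, 2 * np.pi, 200)
fe = 40 * np.sin(t)     # flexion-extension sweep (degrees), illustrative
ru = 20 * np.sin(t)     # coupled radial-ulnar deviation (degrees), illustrative
print(round(dtm_plane_angle(fe, ru), 1))  # ~26.6 deg, within the reported 20-45 deg band
```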
Procedia PDF Downloads 299
522 RE:SOUNDING a 2000-Year-Old Vietnamese Dong Son Bronze Drum; Artist-Led Collaborations outside the Museum to Challenge the Impasse of Repatriating and Rematriating Cultural Instruments
Authors: H. A. J. Nguyen, V. A. Pham
Abstract:
RE:SOUNDING is an ongoing research project and artwork seeking to return the sound and knowledge of Dong Son bronze drums back to contemporary musicians. Colonial collections of ethnographic instruments are problematic in how they commit acts of conceptual, cultural, and acoustic silencing. The collection (or, more honestly, the plagiarism and pillaging) of these instruments has systemically separated them from living and breathing cultures. This includes diasporic communities, who have come to resettle in close proximity - but still have little access - to the museums and galleries that display their cultural objects. Despite recent attempts to 'open up' and 'recognise' the tensions and violence of these ethnographic collections, many museums continue to structurally organize and reproduce knowledge with the same procedural distance and limitations of imperial condescension. Impatient with the slowness of these museums, our diaspora-led collaborations participated in the opaque economy of the auction market to gain access and begin the process of digitally recording and archiving the actual sounds of the ancient Dong Son drum. This self-directed, self-initiated artwork not only acoustically reinvigorated an ancient instrument but redistributed these sonic materials back to contemporary musicians, composers, and their diasporic communities throughout Vietnam, South East Asia, and Australia. Our methodologies not only highlight the persistent inflexibility of museum infrastructures but demand that museums refrain from their paternalistic practice of risk-averse ownership, to seriously engage with new technologies and political formations that require all public institutions to be held accountable for the ethical and intellectual viability of their colonial collections. The integrated and practical resolve of diasporic artists and their communities is more than capable of working with new technologies to reclaim and reinvigorate what is culturally and spiritually theirs. The motivation to rematriate – as opposed to merely repatriate – the acoustic legacies of these instruments to contemporary musicians and artists is a new model for decolonial and restorative practices. Exposing the inadequacies of western scholarship that continues to treat these instruments as discrete, disembodied, and detached artifacts, these collaborative strategies have thus far produced a wealth of new knowledge – new to the west perhaps – but not that new to these, our own communities. This includes the little-acknowledged fact that the Dong Son drums were political instruments of war and technology, rather than their simplistic description in the museum and western academia as agrarian instruments of fertility and harvest. Through the collective and continued sharing of knowledge and sound materials produced from this research, these drums are gaining a contemporary relevance beyond the cultural silencing of the museum display cabinet. Acknowledgement: We acknowledge the Wurundjeri and Boon Wurrung of the Kulin Nation and the Gadigal of the Eora Nation where we began this project. We pay our respects to the Peoples, Lands, Traditional Custodians, Practices, and Creator Ancestors of these Great Nations, as well as those First Nations peoples throughout Australia, Vietnam, and Indonesia, where this research continues, and whose stolen lands and waterways were never ceded.
Keywords: acoustic archaeology, decolonisation, museum collections, rematriation, repatriation, Dong Son, experimental music, digital recording
Procedia PDF Downloads 151
521 Enhancing Athlete Training using Real Time Pose Estimation with Neural Networks
Authors: Jeh Patel, Chandrahas Paidi, Ahmed Hambaba
Abstract:
Traditional methods for analyzing athlete movement often lack the detail and immediacy required for optimal training. This project aims to address this limitation by developing a real-time human pose estimation system specifically designed to enhance athlete training across various sports. This system leverages the power of convolutional neural networks (CNNs) to provide a comprehensive and immediate analysis of an athlete’s movement patterns during training sessions. The core architecture utilizes dilated convolutions to capture crucial long-range dependencies within video frames, combined with a robust encoder-decoder architecture to further refine pose estimation accuracy. This capability is essential for precise joint localization across the diverse range of athletic poses encountered in different sports. Furthermore, by quantifying movement efficiency, power output, and range of motion, the system provides data-driven insights that can be used to optimize training programs. Pose estimation data analysis can also be used to develop personalized training plans that target specific weaknesses identified in an athlete’s movement patterns. To overcome the limitations posed by outdoor environments, the project employs strategies such as multi-camera configurations or depth sensing techniques. These approaches can enhance pose estimation accuracy in challenging lighting and occlusion scenarios. A dataset was collected from the labs of Martin Luther King at San Jose State University. The system is evaluated through a series of tests that measure its efficiency and accuracy in real-world scenarios. Results indicate a high level of precision in recognizing different poses, substantiating the potential of this technology in practical applications. Challenges such as enhancing the system’s ability to operate in varied environmental conditions and further expanding the dataset for training were identified and discussed. Future work will refine the model’s adaptability and incorporate haptic feedback to enhance the interactivity and richness of the user experience. This project demonstrates the feasibility of an advanced pose detection model and lays the groundwork for future innovations in assistive enhancement technologies.
Keywords: computer vision, deep learning, human pose estimation, U-NET, CNN
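The architecture described above (dilated convolutions feeding a U-Net-style encoder-decoder that predicts joint locations) can be sketched as follows. This is a minimal illustrative PyTorch model, not the authors' implementation; the layer widths, joint count, and heatmap output are assumptions.

```python
import torch
import torch.nn as nn

class DilatedPoseNet(nn.Module):
    """Minimal U-Net-style pose estimator: dilated convolutions widen the
    receptive field; the decoder restores resolution and outputs one heatmap
    per joint (joint coordinates follow from the per-map argmax)."""
    def __init__(self, num_joints=17):
        super().__init__()
        def block(cin, cout, dilation=1):
            return nn.Sequential(
                nn.Conv2d(cin, cout, 3, padding=dilation, dilation=dilation),
                nn.BatchNorm2d(cout), nn.ReLU(inplace=True))
        self.enc1 = block(3, 32)
        self.enc2 = block(32, 64)
        self.pool = nn.MaxPool2d(2)
        # Dilated bottleneck captures long-range context at low resolution
        self.bottleneck = nn.Sequential(
            block(64, 128, dilation=2), block(128, 128, dilation=4))
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec2 = block(128 + 64, 64)
        self.dec1 = block(64 + 32, 32)
        self.head = nn.Conv2d(32, num_joints, 1)  # per-joint heatmaps

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up(d2), e1], dim=1))
        return self.head(d1)

heatmaps = DilatedPoseNet()(torch.randn(1, 3, 256, 256))
print(heatmaps.shape)  # torch.Size([1, 17, 256, 256])
```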
Procedia PDF Downloads 55
520 Analysis of Reduced Mechanisms for Premixed Combustion of Methane/Hydrogen/Propane/Air Flames in a Geometrically Modified Combustor and Its Effects on Flame Properties
Authors: E. Salem
Abstract:
Combustion has been used for a long time as a means of energy extraction. However, in recent years, there has been a further increase in air pollution, through pollutants such as nitrogen oxides, acids, etc. In order to solve this problem, there is a need to reduce carbon and nitrogen oxides through lean burning, modified combustors and fuel dilution. A numerical investigation has been done to investigate the effectiveness of several reduced mechanisms, in terms of computational time and accuracy, for the combustion of hydrocarbon/air mixtures, pure or diluted with hydrogen, in a micro combustor. The simulations were carried out using ANSYS Fluent 19.1. To validate the results, the PREMIX and CHEMKIN codes were used to calculate 1D premixed flames based on the temperature and composition of burned and unburned gas mixtures. Numerical calculations were carried out for several hydrocarbons by changing the equivalence ratios and adding small amounts of hydrogen into the fuel blends, then analyzing the flammable limit and the reduction in NOx and CO emissions, and comparing them to experimental data. By solving the conservation equations, several global reduced mechanisms (2-9-12) were obtained. These reduced mechanisms were simulated on a 2D cylindrical tube with dimensions of 40 cm in length and 2.5 cm diameter. The mesh of the model included a suitably fine quad mesh within the first 7 cm of the tube and around the walls. After developing a proper boundary layer, several simulations were performed on hydrocarbon/air blends to visualize the flame characteristics, which were then compared with experimental data. Once the results were within an acceptable range, the geometry of the combustor was modified by changing the length and diameter, adding hydrogen by volume, and changing the equivalence ratios from lean to rich in the fuel blends, and the results on flame temperature, shape, velocity and concentrations of radicals and emissions were observed. It was determined that the reduced mechanisms provided results within an acceptable range. The variation of the inlet velocity and geometry of the tube led to an increase in the temperature and CO2 emissions; the highest temperatures were obtained in lean conditions (0.5-0.9 equivalence ratio). Addition of hydrogen into the combustor fuel blends resulted in a reduction in CO and NOx emissions and an expansion of the flammable limit, under the condition of the same laminar flow and varying equivalence ratio with hydrogen addition. The production of NO is reduced because the combustion happens in a leaner state, which helps in solving environmental problems.
Keywords: combustor, equivalence-ratio, hydrogenation, premixed flames
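As an illustration of the 1D premixed-flame calculations referenced above (the study used the PREMIX/CHEMKIN codes and ANSYS Fluent), the sketch below uses the open-source Cantera library with the GRI-Mech 3.0 mechanism to compare laminar flame speed and peak temperature for a lean methane/air mixture with and without a small hydrogen addition. The equivalence ratios and the 10% hydrogen blend fraction are illustrative assumptions, not the study's cases.

```python
import cantera as ct

def free_flame(phi, fuel, T_in=300.0, p=ct.one_atm, width=0.03):
    gas = ct.Solution("gri30.yaml")                 # GRI-Mech 3.0 (CH4/H2 chemistry)
    gas.set_equivalence_ratio(phi, fuel, "O2:1.0, N2:3.76")
    gas.TP = T_in, p
    flame = ct.FreeFlame(gas, width=width)          # 1D freely propagating premixed flame
    flame.set_refine_criteria(ratio=3, slope=0.07, curve=0.14)
    flame.solve(loglevel=0, auto=True)
    # flame.velocity[0] is the laminar flame speed (flame.u[0] in older Cantera)
    return flame.velocity[0], max(flame.T)

for phi in (0.7, 0.9):                              # lean conditions, as in the study
    su_ch4, T_ch4 = free_flame(phi, "CH4:1.0")
    su_h2,  T_h2  = free_flame(phi, "CH4:0.9, H2:0.1")   # 10% H2 by mole, illustrative
    print(f"phi={phi}: CH4 Su={su_ch4*100:.1f} cm/s, Tmax={T_ch4:.0f} K | "
          f"CH4+H2 Su={su_h2*100:.1f} cm/s, Tmax={T_h2:.0f} K")
```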
Procedia PDF Downloads 114
519 Effect of Pollution on Mangrove Forests of Nayband National Marine Park
Authors: Esmaeil Kouhgardi, Elaheh Shakerdargah
Abstract:
The mangrove ecosystem is a complex of various inter-related elements in the land-sea interface zone which is linked with other natural systems of the coastal region such as corals, sea-grass, coastal fisheries and beach vegetation. The mangrove ecosystem consists of water, muddy soil, trees, shrubs, and their associated flora, fauna and microbes. It is a very productive ecosystem sustaining various forms of life. Its waters are nursery grounds for fish, crustaceans, and mollusks and also provide habitat for a wide range of aquatic life, while the land supports a rich and diverse flora and fauna; however, pollution may affect these characteristics. Although Iran has the lowest share of Persian Gulf pollution among the eight littoral states, environmental experts are still deeply concerned about the serious consequences of the pollution in the oil-rich gulf. Prolongation of critical conditions in the Persian Gulf has endangered its aquatic ecosystem. Water purification equipment, refineries, wastewater emitted by onshore installations, especially petrochemical plants, urban sewage, population density and extensive oil operations of Arab states are factors contaminating the Persian Gulf waters. Population density has been the major cause of pollution and environmental degradation in the Persian Gulf. The Persian Gulf is a closed marine environment which is connected to open waters through only one waterway. It usually takes between three and four years for the gulf's water to be completely replaced. Therefore, any pollution entering the water will remain there for a relatively long time. Presently, the high temperature and excessive salt level in the water have exposed the marine creatures to extra threats, which means they have to survive very tough conditions. The natural environment of the Persian Gulf is very rich, with good fish grounds, extensive coral reefs and pearl oysters in abundance, but it has come increasingly under pressure due to heavy industrialization and in particular the repeated major oil spillages associated with the various recent wars fought in the region. Pollution may cause the mortality of mangrove forests by affecting the roots, leaves and soil of the area. The study showed a high correlation between industrial pollution and mangrove forest health in the south of Iran, and the increase in population, coupled with economic growth, has inevitably caused the use of mangrove lands for various purposes such as the construction of roads, ports and harbors, industries and urbanization.
Keywords: Mangrove forest, pollution, Persian Gulf, population, environment
Procedia PDF Downloads 399
518 Characteristics of the Rock Glacier Deposits in the Southern Carpathians, Romania
Authors: Petru Urdea
Abstract:
As a distinct part of the mountain system, the rock glacier system is a particular periglacial debris system. Being an open system, it works in interconnection with other subsystems like the glacial, cliff, rocky slope and talus slope subsystems, which are sources of sediments. One characteristic is that for long periods of time it acts as a storage unit for debris and ice, and temporarily for snow and water. In the Southern Carpathians, 306 rock glaciers were identified. The vast majority of these rock glaciers are talus rock glaciers (74%), and 26% are debris rock glaciers. The area occupied by granites and granodiorites contains 49% of all the rock glaciers, representing 61% of the area occupied by Southern Carpathian rock glaciers. This lithological dependence also leaves its mark on the specifics of the deposits, everything bearing the imprint of the particular way the rocks respond to the physical weathering processes, all in a periglacial regime. While in the domain of granites and granodiorites the blocks are large - of metric order, even 10 m3 - in the domain of the metamorphic rocks only gneisses can yield similar sizes. Amphibolites, amphibolitic schists, micaschists, sericite-chlorite schists and phyllites crop out in much smaller blocks, of decimetric order, mostly in the form of slabs. In the case of rock glaciers made up of large blocks, with a structure of open-work type, the density and volume of voids between the blocks are greater, while smaller debris generates more compact structures with fewer voids. All of this influences the thermal regime, associated with a certain type of air circulation during the seasons and the emergence of permafrost formation conditions. The rock glaciers are fed by rock falls, rock avalanches, debris flows and avalanches, so that the structure is heterogeneous, which is also reflected in the detailed topography of the rock glaciers. This heterogeneity is also influenced by the spatial assembly of the rock bodies in the supply area and, an element that cannot be omitted, the behavior of the rocks during periglacial weathering. The production of small gelifracts determines the filling of voids and the appearance of more compact structures, with effects on the creep process. In general, surface deposits are coarser, while those at depth are finer, their characteristics being detectable by applying geophysical methods. The electrical resistivity tomography (ERT) and georadar (GPR) investigations carried out in the Făgăraş, Retezat and Parâng Mountains, each with a different lithological specificity, allowed the identification of some differentiations, including the presence of permafrost bodies.
Keywords: rock glaciers deposits, structure, lithology, permafrost, Southern Carpathians, Romania
Procedia PDF Downloads 26
517 Measuring Urban Sprawl in the Western Cape Province, South Africa: An Urban Sprawl Index for Comparative Purposes
Authors: Anele Horn, Amanda Van Eeden
Abstract:
The emphasis on the challenges posed by continued urbanisation, especially in developing countries, has resulted in urban sprawl often being researched and analysed in metropolitan urban areas, but rarely in small and medium towns. Consequently, there exists no instrument for comparing the proportional extent of urban sprawl in metropolitan areas against that of small and medium towns. This research proposes an Urban Sprawl Index as a possible tool to comparatively analyse the extent of urban sprawl between cities and towns of different sizes. The index can also be used over the longer term by authorities developing spatial policy to track the success or failure of specific tools intended to curb urban sprawl. In South Africa, as elsewhere in the world, the last two decades witnessed a proliferation of legislation and spatial policies to limit urban sprawl and contain the physical expansion and development of urban areas, but the measurement of the successes or failures of these instruments intended to curb expansive land development has remained a largely unattainable goal, mainly as a result of the absence of an appropriate measure of proportionate comparison. As a result of the spatial political history of Apartheid, urban areas acquired a spatial form that contributed to the formation of single-core cities with far-reaching and wide-spreading peripheral development, either in the form of affluent suburbs or as a result of post-Apartheid programmes such as the Reconstruction and Development Programme (1995) which, in an attempt to address the immediate housing shortage, favoured the establishment of single-dwelling residential units for low-income communities on single plots on affordable land at the urban periphery. This invariably contributed to urban sprawl, and even though this programme has since been abandoned, the trend towards low-density residential development continues. The research area is the Western Cape Province in South Africa, which in all aspects exhibits the spatial challenges described above. In academia and popular media, the City of Cape Town (the only metropolitan authority in the province) has received the lion’s share of focus in terms of critique on urban development and spatial planning; however, the smaller towns and cities in the Western Cape arguably received much less public attention and were spared the naming and shaming of being unsustainable urban areas in terms of land consumption and physical expansion. The Urban Sprawl Index for the Western Cape (USIWC) put forward by this research enables local authorities in the Western Cape Province to measure the extent of urban sprawl proportionately and comparatively to other cities in the province, thereby acquiring a means of measuring the success of the spatial instruments employed to limit urban expansion and inefficient land consumption. In developing the USIWC, the research made use of satellite data for the reference years 2001 and 2011 and population growth data extracted from the national census for the same base years.
Keywords: urban sprawl, index, Western Cape, South Africa
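The abstract does not give the exact formulation of the USIWC, but a common way to build a size-independent sprawl measure from precisely these inputs (built-up area from satellite data and census population for 2001 and 2011) is the ratio of the land-consumption rate to the population-growth rate. The sketch below illustrates that assumption only; it is not the authors' index, and the figures are hypothetical.

```python
import math

def sprawl_index(area_t0, area_t1, pop_t0, pop_t1, years):
    """Ratio of land-consumption rate to population-growth rate.
    Values > 1 mean built-up land is expanding faster than population,
    i.e. proportionally more sprawl -- comparable across town sizes."""
    lcr = math.log(area_t1 / area_t0) / years   # land consumption rate
    pgr = math.log(pop_t1 / pop_t0) / years     # population growth rate
    return lcr / pgr

# Hypothetical figures (built-up km2, inhabitants) for a metro and a small town
print(round(sprawl_index(620, 780, 2_900_000, 3_700_000, 10), 2))  # metro, ~0.94
print(round(sprawl_index(14, 22, 38_000, 45_000, 10), 2))          # small town, ~2.67
```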
Procedia PDF Downloads 329
516 Holistic Urban Development: Incorporating Both Global and Local Optimization
Authors: Christoph Opperer
Abstract:
The rapid urbanization of modern societies and the need for sustainable urban development demand innovative solutions that meet both individual and collective needs while addressing environmental concerns. To address these challenges, this paper presents a study that explores the potential of spatial and energetic/ecological optimization to enhance the performance of urban settlements, focusing on both architectural and urban scales. The study focuses on the application of biological principles and self-organization processes in urban planning and design, aiming to achieve a balance between ecological performance, architectural quality, and individual living conditions. The research adopts a case study approach, focusing on a 10-hectare brownfield site in the south of Vienna. The site is surrounded by a small-scale built environment as an appropriate starting point for the research and design process. However, the selected urban form is not a prerequisite for the proposed design methodology, as the findings can be applied to various urban forms and densities. The methodology used in this research involves dividing the overall building mass and program into individual small housing units. A computational model has been developed to optimize the distribution of these units, considering factors such as solar exposure/radiation, views, privacy, proximity to sources of disturbance (such as noise), and minimal internal circulation areas. The model also ensures that existing vegetation and buildings on the site are preserved and incorporated into the optimization and design process. The model allows for simultaneous optimization at two scales, architectural and urban design, which have traditionally been addressed sequentially. This holistic design approach leads to individual and collective benefits, resulting in urban environments that foster a balance between ecology and architectural quality. The results of the optimization process demonstrate a seemingly random distribution of housing units that, in fact, is a densified hybrid between traditional garden settlements and allotment settlements. This urban typology is selected due to its compatibility with the surrounding urban context, although the presented methodology can be extended to other forms of urban development and density levels. The benefits of this approach are threefold. First, it allows for the determination of ideal housing distribution that optimizes solar radiation for each building density level, essentially extending the concept of sustainable building to the urban scale. Second, the method enhances living quality by considering the orientation and positioning of individual functions within each housing unit, achieving optimal views and privacy. Third, the algorithm's flexibility and robustness facilitate the efficient implementation of urban development with various stakeholders, architects, and construction companies without compromising its performance. The core of the research is the application of global and local optimization strategies to create efficient design solutions. By considering both the performance of individual units and the collective performance of the urban aggregation, we ensure an optimal balance between private and communal benefits.
By promoting a holistic understanding of urban ecology and integrating advanced optimization strategies, our methodology offers a sustainable and efficient solution to the challenges of modern urbanization.
Keywords: sustainable development, self-organization, ecological performance, solar radiation and exposure, daylight, visibility, accessibility, spatial distribution, local and global optimization
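To make the kind of unit-distribution optimization described above concrete, the following simplified sketch places housing units on a grid by greedily maximizing a combined score of solar exposure and mutual privacy while avoiding preserved vegetation cells. It is entirely illustrative; the scoring terms, weights, and greedy strategy are assumptions, not the study's computational model.

```python
def place_units(grid_w, grid_h, n_units, preserved):
    """Greedy placement: each unit takes the free cell with the best combined
    score of solar exposure (south-facing proxy) and privacy (distance to
    units already placed), skipping preserved vegetation/building cells."""
    placed = []
    free = [(x, y) for x in range(grid_w) for y in range(grid_h)
            if (x, y) not in preserved]

    def score(cell):
        x, y = cell
        solar = y / (grid_h - 1)                       # proxy: southern rows get more sun
        privacy = min((abs(x - px) + abs(y - py) for px, py in placed),
                      default=grid_w + grid_h)         # Manhattan distance to nearest unit
        return 1.0 * solar + 0.5 * privacy             # assumed weights

    for _ in range(n_units):
        best = max(free, key=score)
        placed.append(best)
        free.remove(best)
    return placed

# 20 units on a 12 x 12 site with a preserved tree cluster in the middle
trees = {(5, 5), (5, 6), (6, 5), (6, 6)}
print(place_units(12, 12, 20, trees))
```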
Procedia PDF Downloads 66
515 Evaluation of Invasive Tree Species for Production of Phosphate Bonded Composites
Authors: Stephen Osakue Amiandamhen, Schwaller Andreas, Martina Meincken, Luvuyo Tyhoda
Abstract:
Invasive alien tree species are currently being cleared in South Africa as a result of the forest and water imbalances. These species grow wild, constituting about 40% of the total forest area. They compete with the ecosystem for natural resources and are considered ecosystem engineers by rapidly changing disturbance regimes. As such, they are harvested for commercial uses, but much of the harvest is wasted because of their form and structure. The waste is being sold to local communities as fuel wood. These species can be considered as potential feedstock for the production of phosphate bonded composites. The presence of bark in wood-based composites leads to undesirable properties, and debarking as an option can be cost implicative. This study investigates the effect of processing these invasive species without debarking on some fundamental properties of wood-based panels. Some invasive alien tree species were collected from EC Biomass, Port Elizabeth, South Africa. They include Acacia mearnsii (Black wattle), A. longifolia (Long-leaved wattle), A. cyclops (Red-eyed wattle), A. saligna (Golden-wreath wattle) and Eucalyptus globulus (Blue gum). The logs were chipped as received. The chips were hammer-milled and screened through a 1 mm sieve. The wood particles were conditioned and the quantity of bark in the wood was determined. The binding matrix was prepared using a reactive magnesia, phosphoric acid and class S fly ash. The materials were mixed and poured into a metallic mould. The composite within the mould was compressed at room temperature at a pressure of 200 KPa. After initial setting, which took about 5 minutes, the composite board was demoulded and air-cured for 72 h. The cured product was thereafter conditioned at 20°C and 70% relative humidity for 48 h. Tests of physical and strength properties were conducted on the composite boards. The effect of binder formulation and fly ash content on the properties of the boards was studied using fitted response surface methodology, according to a central composite experimental design (CCD) at a fixed wood loading of 75% (w/w) of total inorganic contents. The results showed that a phosphate/magnesia ratio of 3:1 and a fly ash content of 10% were required to obtain a product of good properties and sufficient strength for intended applications. The proposed products can be used for ceilings, partitioning and insulating wall panels.
Keywords: invasive alien tree species, phosphate bonded composites, physical properties, strength
Procedia PDF Downloads 295
514 Solutions for Thickening the Sludge from Wastewater Treatment by a Rotor with Bars
Authors: Victorita Radulescu
Abstract:
Introduction: Sewage treatment plants, in the second stage, are formed by tanks whose main purpose is the formation of suspensions with the highest possible solid concentration values. The paper presents a solution for rapid concentration of the slurry and sludge, with the main purpose of minimizing the size of the tanks as much as possible. The solution is based on a rotor with bars, tested in two different areas of industrial activity: the remediation of wastewater from the oil industry and, in the last year, the mining industry. Basic Methods: A thickening system with vertical bars was designed, built and tested that manages to reduce sludge moisture content from 94% to 87%. The design was based on the hypothesis that the streamlines of the vortices detached from the rotor with vertical bars accelerate, under certain conditions, the sludge thickening. The sludge is moved to the lateral sides and, in time, becomes sediment. The vortices formed with a vertical axis in the viscous fluid, under the action of the lift, drag, weight, and inertia forces, participate in a rapid aggregation of the particles, thus accelerating the sludge concentration. An interdependence appears between the Re number of the vortex flow induced by the vertical bars and the extent of the hydraulic compaction phenomenon, resulting from an accelerated process of sedimentation; therefore, the rotor's dimensions are designed according to the physico-chemical characteristics of the resulting sludge. Major Findings/Results: Based on the experimental measurements, a numerical simulation of the hydraulic rotor was performed, so as to ensure the necessary vortices. The experimental measurements were performed to determine the optimal height and density of the bars for the sludge thickening system, to keep the tank dimensions as small as possible. The thickening/settling time was reduced by 24% compared to the conventionally used systems. At present, thickeners aim to reduce the intermediate stage of water treatment, using primary and secondary settling, but they require quite a long time, on the order of 10-15 hours. By using this system, there are no intermediary steps; the thickening is done automatically when the vortices are created. Conclusions: The experimental tests were carried out in the wastewater treatment plant of the oil refinery at Brazi, near the city of Ploiesti. The results prove its efficiency in reducing the time for compacting the sludge and in lowering the humidity of the evacuated sediments. The utilization of this equipment is now being extended, and it is being tested in the mining industry, with significant results, at the Lupeni mine in the Jiu Valley.
Keywords: experimental tests, hydrodynamic modeling, rotor efficiency, wastewater treatment
Procedia PDF Downloads 118
513 In Support of Sustainable Water Resources Development in the Lower Mekong River Basin: Development of Guidelines for Transboundary Environmental Impact Assessment
Authors: Kongmeng Ly
Abstract:
The management of transboundary river basins across developing countries, such as the Lower Mekong River Basin (LMB), is frequently challenging given the development and conservation divergences of the basin countries. Driven by needs to sustain economic performance and reduce poverty, the LMB countries (Cambodia, Lao PDR, Thailand, Viet Nam) are embarking on significant land use changes in the form of hydropower dams to fulfill their energy requirements. This pathway could lead to irreversible changes to the ecosystem of the Mekong River if not properly managed. Given the uncertain trade-offs of hydropower development and operation, the Lower Mekong River Basin countries, through the technical support of the Mekong River Commission (MRC) Secretariat, embarked on the decade-long development of Technical Guidelines for Transboundary Environmental Impact Assessment. Through a series of workshops, seminars, national and regional consultations, and pilot studies, and further development following the recommendations generated through legal and institutional reviews undertaken over a two-decade period, the LMB countries jointly adopted the MRC Technical Guidelines for Transboundary Environmental Impact Assessment (TbEIA Guidelines). These guidelines were developed with particular regard to the experience gained from MRC-supported consultations and technical reviews of the Xayaburi Dam Project, Don Sahong Hydropower Project, Pak Beng Hydropower Project, and lessons learned from the Srepok River and Se San River case studies commissioned by the MRC under the generous support of development partners around the globe. As adopted, the TbEIA Guidelines have been designed as a supporting mechanism to the national EIA legislation, processes and systems in each Member Country. In recognition of the already agreed mechanisms, the TbEIA Guidelines build on and supplement the agreements stipulated in the 1995 Agreement on the Cooperation for the Sustainable Development of the Mekong River Basin and its Procedural Rules, in addressing potential transboundary environmental impacts of development projects and ensuring mutual benefits from the Mekong River and its resources. Since their adoption in 2022, the TbEIA Guidelines have already been voluntarily implemented by Lao PDR on its under-development Sekong A Downstream Hydropower Project, located on the Sekong River – a major tributary of the Mekong River. While this implementation is ongoing, with results expected in early 2024, the implementation thus far has strengthened cooperation among concerned Member Countries, with multiple successful open dialogues organized at national and regional levels. It is hoped that lessons learnt from this application will lead to a wider application of the TbEIA Guidelines for future water resources development projects in the LMB.
Keywords: transboundary, EIA, lower mekong river basin, mekong river
Procedia PDF Downloads 37
512 Immobilization of Superoxide Dismutase Enzyme on Layered Double Hydroxide Nanoparticles
Authors: Istvan Szilagyi, Marko Pavlovic, Paul Rouster
Abstract:
Antioxidant enzymes are the most efficient defense systems against reactive oxygen species, which cause severe damage in living organisms and industrial products. However, their supplementation is problematic due to their high sensitivity to environmental conditions. Immobilization on carrier nanoparticles is a promising research direction towards the improvement of their functional and colloidal stability. In that way, their applications in biomedical treatments and manufacturing processes in the food, textile and cosmetic industries can be extended. The main goal of the present research was to prepare and formulate antioxidant bionanocomposites composed of superoxide dismutase (SOD) enzyme, anionic clay (layered double hydroxide, LDH) nanoparticles and heparin (HEP) polyelectrolyte. To characterize the structure and the colloidal stability of the obtained compounds in suspension and solid state, electrophoresis, dynamic light scattering, transmission electron microscopy, spectrophotometry, thermogravimetry, X-ray diffraction, infrared and fluorescence spectroscopy were used as experimental techniques. The LDH-SOD composite was synthesized by enzyme immobilization on the clay particles via electrostatic and hydrophobic interactions, which resulted in a strong adsorption of the SOD on the LDH surface, i.e., no enzyme leakage was observed once the material was suspended in aqueous solutions. However, the LDH-SOD showed only limited resistance against salt-induced aggregation, and large irregularly shaped clusters formed within a short time interval even at lower ionic strengths. Since sufficiently high colloidal stability is a key requirement in most of the applications mentioned above, the nanocomposite was coated with HEP polyelectrolyte to develop highly stable suspensions of primary LDH-SOD-HEP particles. HEP is a natural anticoagulant with one of the highest negative line charge densities among the known macromolecules. The experimental results indicated that it strongly adsorbed on the oppositely charged LDH-SOD surface, leading to charge inversion and to the formation of negatively charged LDH-SOD-HEP. The obtained hybrid materials formed stable suspensions even under extreme conditions, where classical colloid chemistry theories predict rapid aggregation of the particles and unstable suspensions. Such a stabilization effect originated from electrostatic repulsion between the particles of the same sign of charge as well as from steric repulsion due to the osmotic pressure raised during the overlap of the polyelectrolyte chains adsorbed on the surface. In addition, the SOD enzyme kept its structural and functional integrity during the immobilization and coating processes and hence, the LDH-SOD-HEP bionanocomposite possessed excellent activity in decomposition of superoxide radical anions, as revealed in biochemical test reactions. In conclusion, due to the improved colloidal stability and the good efficiency in scavenging superoxide radical ions, the developed enzymatic system is a promising antioxidant candidate for biomedical or other manufacturing processes, wherever the aim is to decompose reactive oxygen species in suspensions.
Keywords: clay, enzyme, polyelectrolyte, formulation
Procedia PDF Downloads 268
511 Polymeric Composites with Synergetic Carbon and Layered Metallic Compounds for Supercapacitor Application
Authors: Anukul K. Thakur, Ram Bilash Choudhary, Mandira Majumder
Abstract:
In this technologically driven world, it is requisite to develop better, faster and smaller electronic devices for various applications to keep pace with fast developing modern life. In addition, it is also required to develop sustainable and clean sources of energy in this era where the environment is being threatened by pollution and its severe consequences. Supercapacitors have gained tremendous attention in recent years because of their various attractive properties: they are essentially maintenance-free, offer high specific power and power density, excellent pulse charge/discharge characteristics and a long cycle life, require a very simple charging circuit, and operate safely. Binary and ternary composites of conducting polymers with carbon and other layered transition metal dichalcogenides have shown tremendous progress in the last few decades. Compared with bulk conducting polymer, these days conducting polymers have gained more attention because of their high electrical conductivity, large surface area, short ion transport length and superior electrochemical activity. These properties make them very suitable for several energy storage applications. On the other hand, carbon materials have also been studied intensively, owing to their rich specific surface area, very light weight, excellent chemical-mechanical properties and a wide operating temperature range. These have been extensively employed in the fabrication of carbon-based energy storage devices and also as an electrode material in supercapacitors. Incorporation of carbon materials into the polymers increases the electrical conductivity of the polymeric composite so formed, due to the high electrical conductivity, high surface area and interconnectivity of the carbon. Further, polymeric composites based on layered transition metal dichalcogenides such as molybdenum disulfide (MoS2) are also considered important because they are thin indirect band gap semiconductors with band gaps of around 1.2 to 1.9 eV. Amongst the various 2D materials, MoS2 has received much attention because of its unique structure consisting of a graphene-like hexagonal arrangement of Mo and S atoms stacked layer by layer to give S-Mo-S sandwiches with weak van der Waals forces between them. It shows higher intrinsic fast ionic conductivity than oxides and higher theoretical capacitance than graphite.
Keywords: supercapacitor, layered transition-metal dichalcogenide, conducting polymer, ternary, carbon
Procedia PDF Downloads 256
510 Achieving Net Zero Energy Building in a Hot Climate Using Integrated Photovoltaic and Parabolic Trough Collectors
Authors: Adel A. Ghoneim
Abstract:
In most existing buildings in hot climates, cooling loads lead to high primary energy consumption and consequently high CO2 emissions. These can be substantially decreased with integrated renewable energy systems. Kuwait is characterized by its long, dry, hot summer and short, warm winter. Kuwait receives an annual total radiation of more than 5280 MJ/m2 with approximately 3347 h of sunshine. Solar energy systems consisting of PV modules and parabolic trough collectors are considered to satisfy the electricity consumption, domestic water heating, and cooling loads of an existing building. This paper presents the results of an extensive program of energy conservation and energy generation using integrated photovoltaic (PV) modules and parabolic trough collectors (PTC). The program was conducted on an existing institutional building with the intention of converting it into a Net-Zero Energy Building (NZEB) or a near Net-Zero Energy Building (nNZEB). The program consists of two phases; the first phase is concerned with energy auditing and energy conservation measures at minimum cost, and the second phase considers the installation of photovoltaic modules and parabolic trough collectors. The 2-storey building under consideration is the Applied Sciences Department at the College of Technological Studies, Kuwait. Single-effect lithium bromide-water absorption chillers are implemented to provide the air conditioning load of the building. A numerical model is developed to evaluate the performance of parabolic trough collectors in the Kuwait climate. The transient simulation program TRNSYS is adapted to simulate the performance of different solar system components. In addition, a numerical model is developed to assess the environmental impacts of building integrated renewable energy systems. Results indicate that efficient energy conservation can play an important role in converting the existing buildings into NZEBs, as it saves a significant portion of the annual energy consumption of the building. The first phase results in an energy conservation of about 28% of the building consumption. In the second phase, the integrated PV completely covers the lighting and equipment loads of the building. On the other hand, parabolic trough collectors with an optimum area of 765 m2 can satisfy a significant portion of the cooling load, i.e., about 73% of the total building cooling load. The annual avoided CO2 emission is evaluated at the optimum conditions to assess the environmental impacts of renewable energy systems. The total annual avoided CO2 emission is about 680 metric tons/year, which confirms the environmental impacts of these systems in Kuwait.
Keywords: building integrated renewable systems, Net-Zero energy building, solar fraction, avoided CO2 emission
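As a worked illustration of how an avoided-emission figure of this kind is typically obtained (annual renewable generation or displaced consumption multiplied by a grid emission factor), the short sketch below shows the calculation. The energy split and the emission factor are placeholder assumptions for illustration, not values reported in the study.

```python
# Hedged back-of-the-envelope illustration of an avoided-CO2 calculation.
pv_energy_mwh = 300.0        # assumed annual PV yield covering lighting/equipment loads
ptc_cooling_mwh = 500.0      # assumed grid electricity displaced by absorption cooling
grid_factor = 0.6            # assumed t CO2 per MWh for a fossil-fuelled grid

avoided_t = (pv_energy_mwh + ptc_cooling_mwh) * grid_factor
print(f"Avoided CO2: {avoided_t:.0f} t/year")   # same order of magnitude as the ~680 t/year reported
```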
Procedia PDF Downloads 611
509 The Feasibility of Ratification of the United Nations Convention on Contracts for the International Sale of Goods by Islamic Countries: Saudi Arabia as a Case
Authors: Ibrahim M. Alwehaibi
Abstract:
Recently, the windows of globalization have opened wide, which has increased trade between Western countries and Muslim nations. Sale of goods contracts are among the most common business transactions in the world. This commercial exchange has faced many obstacles. One of the most significant obstacles is the conflict between laws. Thus, the United Nations created the Convention on Contracts for the International Sale of Goods (CISG). Some Islamic countries have ratified the CISG, while other Islamic countries have concerns about the feasibility of ratifying the CISG, and many businessmen have concerns about the application of the convention. The concerns relate to the conflict between the CISG and Sharia, and the long debate about the success, ambiguity, and stability of the CISG. Therefore, this research will examine the feasibility for Muslim countries and Muslim businessmen of adopting the CISG through the following steps: First, this research will introduce Sharia law (Islamic contract law) and the CISG and provide backgrounds of both laws. Second, this research will compare the provisions of the CISG and Sharia, identify the conflicts, and provide possible solutions for them. Third, this study will examine the advantages and disadvantages of adopting the CISG and examine the success of the CISG. Fourth, this study will explore the current situation in Islamic countries by taking Saudi Arabia as a case, explore how the application of Sharia law works and the possibility of enforcing the CISG, and explore the current practice of foreign sales in Saudi Arabia. The research finds that there are some conflicts between the CISG and Sharia law. The most notable conflicts are interest and uncertainty in considerations. Also, this research finds that it seems that ratification of the CISG is not beneficial for Muslim countries because the convention has not reached its goal, which is uniformity of laws. Moreover, the CISG has been excluded and ignored by businessmen and some courts. Additionally, this research finds that it could be possible to enforce the CISG in Saudi Arabia, provided that there is no conflict between the enforced provision and Sharia law. This study follows comparative and analytical methodologies to reach its findings. The researcher analyzes the provisions of the CISG, compares them with Sharia rules, and identifies the conflicts and compatibilities. In fact, the CISG has 101 articles, so a comprehensive comparison of all its articles with Sharia is difficult. Thus, in order to deeply analyze all aspects of this issue, this study will exclude some areas of contract which have been discussed by other researchers, such as delivery of goods, conformity, and the mirror image rule. The comparative section of this study will focus on the articles of greatest concern that conflict, or may conflict, with Sharia, which concern interest, uncertainty, statute of limitations, specific performance, and passing of risk.
Keywords: Sharia, CISG, Contracts for International Sale of Goods, contracts, sale of goods, Saudi Arabia
Procedia PDF Downloads 151
508 Torn Between the Lines of Border: The Pakhtuns of Pakistan and Afghanistan in Search of Identity
Authors: Priyanka Dutta Chowdhury
Abstract:
A globalized, connected world, calling loudly for a composite culture, has still not been able to erase the pain of a desired nationalism based on cultural identity. In the South Asian region, the random drawing of boundaries without taking the ethnic aspect into consideration has always challenged the very basis of the existence of certain groups. The urge to reunify with fellow brothers on both sides of the border has always led to chaos and schism in the countries of this region. Sometimes this became a tool to bargain with the state and find a favorable position in the power structure on the basis of cultural identity. In Pakistan and Afghanistan, the Pakhtuns, who are divided across the border of the two countries, have, from the inception of Pakistan, posed various challenges and hampered the growth of a consolidated nation. The Pakhtuns or Pashtuns of both Pakistan and Afghanistan have a strong cultural affinity which blurs their physical distancing and calls for a nationalism based on this ethnic affiliation. Both sides wanted to create Pakhtunistan, unifying all the Pakhtuns of the region. For long, this group has refused to accept the Durand Line separating the two. This was an area of concern especially for the Pakhtuns of Pakistan, torn between the decision either to join Afghanistan, create a nation of their own, or be a part of Pakistan. This ethnic issue became a bone of contention between the two countries. Later, though well absorbed and recognized in the respective countries, they have fought for their identity and claimed a dominant position in the politics of the nations. Because of the porous borders, an influx of refugees was often seen, especially during the Afghan wars, and later many extremist groups were born from them, especially the Taliban. In the recent string of events, when the Taliban, who are mostly Pakhtuns ethnically, came to power in Afghanistan, a wave of sympathy arose in Pakistan. This strengthened the position of the religious Pakhtuns across the border. It is to be noted here that a fragmented Pakhtun identity, split between the religious and the secular, was clearly visible, each voicing for its place in the political hierarchy of the country with a vision distinct from the other, especially in Pakistan. In this context, the paper tries to evaluate the reasons for this cultural turmoil between the countries and this ethnic group. It also aims to analyze how identity politics still holds its relevance in the contemporary world. Additionally, the recent trend of fragmented identity points towards instrumentalization of this ethnic group, which is engaged in the bargaining process with the state for a robust position in the power structure. In the end, the paper aims to deduce, from the theoretical conditions of identity politics, whether this is a primordial or a situational tool for gaining visibility in the power structure of the contemporary world.
Keywords: cultural identity, identity politics, instrumentalization of identity, Pakhtuns, power structure
Procedia PDF Downloads 82
507 Impact of Microwave and Air Velocity on Drying Kinetics and Rehydration of Potato Slices
Authors: Caiyun Liu, A. Hernandez-Manas, N. Grimi, E. Vorobiev
Abstract:
Drying is one of the most used methods for food preservation; it extends the shelf life of food and makes its transportation, storage and packaging easier and more economical. The most commonly used method is hot air drying. However, its disadvantages are low energy efficiency and long drying times. Because of the high temperature during hot air drying, undesirable changes in pigments, vitamins and flavoring agents occur, which result in degradation of the quality parameters of the product. The drying process can also cause shrinkage, case hardening, dark color, browning, loss of nutrients and other defects. Recently, new processes were developed in order to avoid these problems. For example, the application of pulsed electric fields provokes cell membrane permeabilisation, which increases the drying kinetics and the moisture diffusion coefficient. Microwave drying technology also has several advantages over conventional hot air drying, such as higher drying rates and thermal efficiency, shorter drying time, and significantly improved product quality and nutritional value. Rehydration kinetics is a very important characteristic of dried products. Current research has indicated that the rehydration ratio and the coefficient of rehydration are dependent on the processing conditions of drying. The present study compares the efficiency of two processes (1: room temperature air drying, 2: microwave/air drying) in terms of drying rate, product quality and rehydration ratio. In this work, potato slices (≈2.2 g) with a thickness of 2 mm and diameter of 33 mm were placed in the microwave chamber and dried. Drying kinetics and drying rates of the different methods were determined. The process parameters studied included inlet air velocity (1 m/s, 1.5 m/s, 2 m/s) and microwave power (50 W, 100 W, 200 W and 250 W). The evolution of temperature during microwave drying was measured. The drying power had a strong effect on drying rate, and the microwave/air drying resulted in a 93% decrease in the drying time when the air velocity was 2 m/s and the microwave power was 250 W. Based on the Lewis model, drying rate constants (kDR) were determined. An increase from kDR = 0.0002 s⁻¹ to kDR = 0.0032 s⁻¹ was observed for air drying at 2 m/s and microwave/air drying (at 2 m/s and 250 W), respectively. The effective moisture diffusivity was calculated by using Fick's law. The results show an increase of effective moisture diffusivity from 7.52×10⁻¹¹ to 2.64×10⁻⁹ m²/s for air drying at 2 m/s and microwave/air drying (at 2 m/s and 250 W), respectively. The temperature of the potato slices increased for higher microwave power, but decreased for higher air velocity. The rehydration ratio, defined as the weight of the sample after rehydration divided by the weight of the dried sample, was determined at different water temperatures (25℃, 50℃, 75℃). The rehydration ratio increased with the water temperature and reached its maximum at the following conditions: 200 W for the microwave power, 2 m/s for the air velocity and 75°C for the water temperature. The present study shows the interest of microwave drying for food preservation.
Keywords: drying, microwave, potato, rehydration
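The Lewis-model rate constant and the Fick-law effective diffusivity quoted above follow from standard thin-layer drying relations. The sketch below shows the calculation for a 2 mm slab; it is a minimal sketch using illustrative moisture-ratio data, not the study's measurements.

```python
import numpy as np

# Lewis (exponential) thin-layer model: MR(t) = exp(-k t)
t = np.array([0, 300, 600, 900, 1200], dtype=float)   # s, illustrative sampling times
mr = np.array([1.0, 0.40, 0.15, 0.06, 0.02])           # moisture ratio, illustrative
k_dr = -np.polyfit(t, np.log(mr), 1)[0]                # slope of ln(MR) vs t
print(f"k_DR = {k_dr:.4f} 1/s")                        # ~0.0032 1/s for these data

# Fick's law for an infinite slab (first series term):
#   MR = (8/pi^2) * exp(-pi^2 * Deff * t / (4 L^2)),  L = half-thickness
L = 0.001                                              # m (half of the 2 mm slice)
slope = np.polyfit(t[1:], np.log(mr[1:]), 1)[0]        # falling-rate part of the curve
d_eff = -slope * 4 * L**2 / np.pi**2
print(f"Deff = {d_eff:.2e} m^2/s")                     # order of 1e-9 m^2/s here
```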
Procedia PDF Downloads 269
506 Impact of Increased Radiology Staffing on After-Hours Radiology Reporting Efficiency and Quality
Authors: Peregrine James Dalziel, Philip Vu Tran
Abstract:
Objective / Introduction: Demand for radiology services from Emergency Departments (ED) continues to increase, with greater demands placed on radiology staff providing reports for the management of complex cases. Queuing theory indicates that wide variability of process time with the random nature of request arrival increases the probability of significant queues. This can lead to delays in the time-to-availability of radiology reports (TTA-RR) and potentially impaired ED patient flow. In addition, the greater “cognitive workload” of greater volume may lead to reduced productivity and increased errors. We sought to quantify the potential ED flow improvements obtainable from increased radiology providers serving 3 public hospitals in Melbourne, Australia. We sought to assess the potential productivity gains, quality improvement and cost-effectiveness of increased labor inputs. Methods & Materials: The Western Health Medical Imaging Department moved from single-resident coverage on weekend days (8:30 am-10:30 pm) to a limited period of two-resident coverage (1 pm-6 pm) on both weekend days. The TTA-RR for weekend CT scans was calculated from the PACS database for the 8-month period symmetric around the date of the staffing change. A multivariate linear regression model was developed to isolate the improvement in TTA-RR between the two 4-month periods. Daily and hourly scan volume at the time of each CT scan was calculated to assess the impact of varying department workload. To assess any improvement in report quality/errors, a random sample of 200 studies was assessed to compare the average number of clinically significant over-read addendums to reports between the 2 periods. Cost-effectiveness was assessed by comparing the marginal cost of additional staffing against a conservative estimate of the economic benefit of improved ED patient throughput, using the Australian national insurance rebate for private ED attendance as a revenue proxy. Results: The primary resident on call and the type of scan accounted for most of the explained variability in time to report availability (R2=0.29). Increasing daily volume and hourly volume were associated with increased TTA-RR (1.5 min (p<0.01) and 4.8 min (p<0.01), respectively, per additional scan ordered within each time frame). Reports were available 25.9 minutes sooner on average in the 4 months post-implementation of double coverage (p<0.01), with an additional 23.6-minute improvement when 2 residents were on-site concomitantly (p<0.01). The aggregate average improvement in TTA-RR was 24.8 hours per weekend day. This represents the increased decision-making time available to ED physicians and potential improvement in ED bed utilisation. 5% of reports from the intervention period contained clinically significant addendums vs 7% in the single-resident period, but this was not statistically significant (p=0.7). The marginal cost was less than the anticipated economic benefit, based on assuming a 50% capture of the improved TTA-RR in patient disposition and using the lowest available national insurance rebate as a proxy for economic benefit. Conclusion: TTA-RR improved significantly during the period of increased staff availability, both during the specific period of increased staffing and throughout the day. Increased labor utilisation is cost-effective compared with the potential improved productivity for ED cases requiring CT imaging.
Keywords: workflow, quality, administration, CT, staffing
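A minimal sketch of the kind of multivariate linear model described above (TTA-RR regressed on the staffing period plus workload covariates) is shown below using statsmodels. The variable names and the synthetic data are placeholders, not the hospital dataset; the coefficients used to generate the data simply echo the magnitudes reported in the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "double_cover": rng.integers(0, 2, n),           # 1 = two residents on site (assumed coding)
    "daily_volume": rng.poisson(60, n),               # scans ordered that day, synthetic
    "hourly_volume": rng.poisson(6, n),                # scans ordered that hour, synthetic
    "scan_type": rng.choice(["head", "abdo", "trauma"], n),
})
# Synthetic TTA-RR in minutes: busier periods slower, double coverage faster
df["tta_rr"] = (60 + 1.5 * df.daily_volume + 4.8 * df.hourly_volume
                - 25 * df.double_cover + rng.normal(0, 15, n))

model = smf.ols("tta_rr ~ double_cover + daily_volume + hourly_volume + C(scan_type)",
                data=df).fit()
print(model.params[["double_cover", "daily_volume", "hourly_volume"]])
```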
Procedia PDF Downloads 112505 Phage Therapy as a Potential Solution in the Fight against Antimicrobial Resistance
Authors: Sanjay Shukla
Abstract:
Excessive use of antibiotics is a major problem in the treatment of wounds and other chronic infections, and antibiotic treatment is frequently non-curative, so alternative treatments are necessary. Phage therapy is considered one of the most effective approaches to treat multi-drug resistant bacterial pathogens. Infections caused by Staphylococcus aureus are very efficiently controlled with phage cocktails containing several individual phage lysates that together infect a majority of known pathogenic S. aureus strains. The aim of the current study was to investigate the efficiency of a purified phage cocktail for prophylactic as well as therapeutic application in a mouse model and in large animals with chronic septic wound infections. A total of 150 sewage samples were collected from various livestock farms and subjected to bacteriophage isolation by the double agar layer method. Of the 150 samples, 27 showed plaque formation, indicating lytic activity against S. aureus in the double agar overlay method. Under transmission electron microscopy (TEM), the recovered bacteriophage isolates showed a hexagonal structure with tail fibers. The bacteriophage (ØVS) had icosahedral symmetry, with a head 52.20 nm in diameter and a long tail of 109 nm. The head and tail were held together by a connector, and the phage can be classified as a member of the family Myoviridae in the order Caudovirales. The recovered bacteriophage showed antibacterial activity against S. aureus in vitro. A cocktail of phage lysates (ØVS1, ØVS5, ØVS9 and ØVS27) was tested for in vivo antibacterial activity as well as its safety profile. Results of the mouse experiment indicated that the bacteriophage lysate was very safe, with no abscess formation, indicating its safety in a living system. The mice were also prophylactically protected against S. aureus when the bacteriophage cocktail was administered just before S. aureus challenge, indicating that the lysates are good prophylactic agents. Mice inoculated with S. aureus recovered completely after bacteriophage administration, a 100% recovery rate that compares very favourably with conventional therapy. In the present study, ten chronic wound cases were treated with the phage lysate, with regular follow-up over ten days (at 0, 5 and 10 d). Six of the ten cases showed complete wound recovery within 10 d. The efficacy of bacteriophage therapy was therefore 60%, which is very good compared to conventional antibiotic therapy for chronic septic wound infections. Thus, the application of lytic phage in a single dose proved to be an innovative and effective therapy for the treatment of chronic septic wounds. Keywords: phage therapy, phage lysate, antimicrobial resistance, S. aureus
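As a side note on the quantification behind the double agar layer method, the helper below computes a phage lysate titre from a plaque count, dilution factor, and plated volume; the values shown are hypothetical and not from the study.

```python
# A small illustrative helper (not from the study) for quantifying a phage
# lysate titred by the double agar layer method: titre in PFU/mL is the
# plaque count divided by (dilution factor x volume plated).
def titre_pfu_per_ml(plaque_count: int, dilution: float, volume_ml: float) -> float:
    """Plaque-forming units per mL of the undiluted lysate."""
    return plaque_count / (dilution * volume_ml)

# e.g. 74 plaques from 0.1 mL of a 10^-6 dilution -> 7.4e8 PFU/mL
print(f"{titre_pfu_per_ml(74, 1e-6, 0.1):.2e} PFU/mL")
```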
Procedia PDF Downloads 118504 Particle Separation Using Individually-Controlled Magnetic Soft Artificial Cilia
Authors: Yau-Luen Ng, Nathan Banka, Santosh Devasia
Abstract:
In this paper, a method based on soft artificial cilia is introduced to separate particles based on size and mass. In nature, cilia are used for fluid propulsion in the mammalian circulatory system, as well as for swimming and size-selective particle entrainment for feeding in microorganisms. Inspired by biological cilia, an array of artificial cilia was fabricated using polydimethylsiloxane (PDMS) to simulate the actual motion. A row of four individually-controlled magnetic artificial cilia in a semi-circular channel is actuated by the magnetic fields from four permanent magnets. Each cilium is a slender rectangular cantilever approximately 13 mm long made from a composite of PDMS and carbonyl iron particles. A time-varying magnetic force is achieved by periodically varying the out-of-plane distance from the permanent magnets to the cilia, resulting in large-amplitude deflections of the cilia that can be used to drive fluid motion. Previous results have shown that this system of individually-controlled magnetic cilia can generate effective mixing flows; the purpose of the present work is to apply the individual cilia control to a particle separation task. Based on the observed beating patterns of cilia arrays in nature, the experimental beating patterns were selected as a metachronal wave, in which a fixed phase lead or lag is imposed between adjacent cilia. Additionally, the beating frequency was varied. For each set of experimental parameters, the channel was filled with water and polyethylene microspheres were introduced at the center of the cilia array. Two types of particles were used: large red microspheres with density 0.9971 g/cm³ and 850-1000 μm average diameter, and small blue microspheres with density 1.06 g/cm³ and diameter 30 μm. At low beating frequencies, all particles were propelled in the mean flow direction. However, the large particles were observed to reverse direction above about 4.8 Hz, whereas reversal of the small-particle transport direction did not occur until 6 Hz. Between these two transition frequencies, the large and small particles can be separated as they move in opposite directions. The experimental results show that selecting an appropriate cilia beating pattern can lead to selective transport of neutrally-buoyant particles based on their size. Importantly, the separation threshold can be chosen dynamically by adjusting the actuation frequency. However, further study is required to determine the range of particle sizes that can be effectively separated for a given system geometry. Keywords: magnetic cilia, particle separation, tunable separation, soft actuators
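A minimal sketch of the metachronal actuation described above: each of the four cilia is driven at the same frequency with a fixed phase lag relative to its neighbour. The amplitude, frequency, and phase-lag values are assumptions for illustration.

```python
# A minimal sketch, under assumed parameters, of the metachronal beating
# pattern: each cilium i in a row of four is driven at the same frequency
# with a fixed phase lag relative to its neighbour.
import numpy as np

N_CILIA = 4
FREQ_HZ = 4.8          # beating frequency near the reported large-particle reversal
PHASE_LAG = np.pi / 3  # fixed phase lead/lag imposed between adjacent cilia (assumed)
AMPLITUDE = 1.0        # normalised tip deflection

def cilium_deflection(i: int, t: np.ndarray) -> np.ndarray:
    """Normalised deflection of cilium i at times t (metachronal wave)."""
    return AMPLITUDE * np.sin(2 * np.pi * FREQ_HZ * t - i * PHASE_LAG)

t = np.linspace(0.0, 1.0, 1000)                       # one second of actuation
signals = np.array([cilium_deflection(i, t) for i in range(N_CILIA)])
print(signals.shape)                                  # (4, 1000): one drive signal per cilium
```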
Procedia PDF Downloads 199503 Investigations of Effective Marketing Metric Strategies: The Case of St. George Brewery Factory, Ethiopia
Authors: Mekdes Getu Chekol, Biniam Tedros Kahsay, Rahwa Berihu Haile
Abstract:
The main objective of this study is to investigate marketing strategy practice in the case of the St. George Brewery Factory in Addis Ababa. One of the core activities for a business to stay competitive is having a well-developed marketing strategy. The study assessed how marketing strategies were practiced in the company to achieve its goals, in line with segmentation, target market, positioning, and the marketing mix elements, in order to satisfy customer requirements. The study is conducted using primary and secondary data and both qualitative and quantitative approaches. The primary data were collected through open- and closed-ended questionnaires. Because the population is small, respondents were selected by census. The findings show that the company used all 4 Ps of the marketing mix in its marketing strategies and provided quality products at affordable prices, promoting its products through extensive and effective advertising. Product availability and accessibility are admirable, with both direct and indirect distribution channels in use. The company has identified its target customers, and its market segmentation practice is based on geographical location. Communication between the marketing department and other departments is very effective. The adjusted R² indicates that product, price, promotion, and place explain 61.6% of the variance in marketing strategy practice; the remaining 38.4% of the variation in the dependent variable is explained by factors not included in this study. All four independent variables, product, price, promotion, and place, have positive beta coefficients, indicating that the predictor variables have a positive effect on the dependent variable, marketing strategy practice. Although the company's marketing strategies are practiced effectively, the company faces some problems in implementing them: infrastructure problems, economic problems, intense competition in the market, shortage of raw materials, seasonality of consumption, socio-cultural problems, and the time and cost of creating awareness among customers. Finally, the authors suggest that the company develop a long-range view and implement a more structured approach to gathering information about potential customers, competitors' actions, and market intelligence within the industry. In addition, we recommend that future studies increase the sample size and include additional marketing factors. Keywords: marketing strategy, market segmentation, target marketing, market positioning, marketing mix
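For reference, the short helper below shows how an adjusted R² such as the 61.6% reported above follows from an ordinary R², the number of observations, and the number of predictors (here the four marketing mix elements). The R² and sample size used are hypothetical.

```python
# A small sketch of the adjusted R^2 formula behind the 61.6% figure:
# adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1), with k = 4 predictors
# (product, price, promotion, place). The R^2 and n below are hypothetical.
def adjusted_r2(r2: float, n: int, k: int) -> float:
    """Adjusted R^2 given the ordinary R^2, n observations and k predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

print(round(adjusted_r2(r2=0.63, n=120, k=4), 3))  # e.g. ~0.617
```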
Procedia PDF Downloads 60502 Modelling the Antecedents of Supply Chain Enablers in Online Groceries Using Interpretive Structural Modelling and MICMAC Analysis
Authors: Rose Antony, Vivekanand B. Khanapuri, Karuna Jain
Abstract:
Online groceries have transformed the way supply chains are managed. These chains face numerous challenges, including product wastage, low margins, long breakeven periods, and low market penetration, to mention a few. E-grocery chains need to overcome these challenges in order to survive the competition. The purpose of this paper is to carry out a structural analysis of the enablers in e-grocery chains by applying Interpretive Structural Modelling (ISM) and MICMAC analysis in the Indian context. The research design is descriptive-explanatory in nature. The enablers have been identified from the literature and through semi-structured interviews conducted among managers with relevant experience in e-grocery supply chains. The experts were contacted through professional/social networks using a purposive snowball sampling technique. The interviews were transcribed, and manual coding was carried out using open and axial coding methods. The key enablers are categorized into themes, and the contextual relationships between these enablers and the performance measures are sought from industry veterans. Using ISM, a hierarchical model of the enablers is developed, and MICMAC analysis identifies their driving and dependence powers. Based on driving and dependence power, the enablers are categorized into four clusters, namely independent, autonomous, dependent, and linkage. The analysis found that information technology (IT) and manpower training act as key enablers towards reducing lead time and enhancing online service quality. Many of the enablers fall under the linkage cluster, viz., frequent software updating, branding, the number of delivery boys, order processing, benchmarking, product freshness, and customized applications for different stakeholders, showing these to be critical in online food/grocery supply chains. Considering the perishable nature of the products being handled, the impact of the enablers on product quality is also identified. Hence, the study serves as a tool to identify and prioritize the vital enablers in the e-grocery supply chain. The work is perhaps unique in identifying the complex relationships among supply chain enablers in fresh food e-groceries and linking them to performance measures. It contributes to the knowledge of supply chain management in general and e-retailing in particular. The approach focuses on fresh food supply chains in the Indian context and hence will be applicable in developing economies, where supply chains are evolving. Keywords: interpretive structural modelling (ISM), India, online grocery, retail operations, supply chain management
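A minimal sketch, with a hypothetical 5x5 reachability matrix, of the MICMAC step described above: driving power is the row sum of the final reachability matrix, dependence power is the column sum, and the two together place each enabler in the autonomous, dependent, linkage, or independent cluster.

```python
# MICMAC classification from a final ISM reachability matrix.
# The matrix, enabler names and threshold below are illustrative assumptions.
import numpy as np

enablers = ["E1", "E2", "E3", "E4", "E5"]
R = np.array([            # R[i, j] = 1 if enabler i reaches (influences) enabler j
    [1, 1, 1, 1, 1],
    [0, 1, 1, 1, 1],
    [0, 0, 1, 1, 0],
    [0, 0, 1, 1, 0],
    [1, 0, 1, 1, 1],
])
driving = R.sum(axis=1)          # row sums: how many enablers each one influences
dependence = R.sum(axis=0)       # column sums: how many enablers influence it
midpoint = (len(enablers) + 1) / 2  # simple midpoint threshold for high vs low power

for name, drv, dep in zip(enablers, driving, dependence):
    if drv >= midpoint and dep >= midpoint:
        cluster = "linkage"
    elif drv >= midpoint:
        cluster = "independent (driver)"
    elif dep >= midpoint:
        cluster = "dependent"
    else:
        cluster = "autonomous"
    print(f"{name}: driving={drv}, dependence={dep}, cluster={cluster}")
```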
Procedia PDF Downloads 204501 An Analysis of LoRa Networks for Rainforest Monitoring
Authors: Rafael Castilho Carvalho, Edjair de Souza Mota
Abstract:
As the largest contributor to the biogeochemical functioning of the Earth system, the Amazon Rainforest has the greatest biodiversity on the planet, harboring about 15% of all the world's flora. Recognition and preservation are the focus of research that seeks to mitigate drastic changes, especially anthropic ones, which irreversibly affect this biome. Functional and low-cost monitoring alternatives to reduce these impacts are a priority, such as those based on Low Power Wide Area Networks (LPWAN). Promising, reliable, secure, and with low energy consumption, LPWAN can connect thousands of IoT devices, and LoRa in particular is considered one of the most successful solutions for forest monitoring applications. Despite this, the forest environment, and in particular the Amazon Rainforest, is challenging for these technologies, requiring work to identify and validate their use in a real environment. To investigate the feasibility of deploying LPWAN for remote water quality monitoring of rivers in the Amazon region, a LoRa-based test bed consisting of a LoRa transmitter and a LoRa receiver was set up; both parts were implemented with Arduino and the SX1276 LoRa chip. The experiment was carried out at the Federal University of Amazonas, which contains one of the largest urban forests in Brazil. There are several springs inside the forest, and the main goal is to collect water quality parameters and transmit the data through the forest in real time to the gateway at the university. In all, there are nine water quality parameters of interest. Even with a high collection frequency, the amount of information that must be sent to the gateway is small. However, for this application, the battery of the transmitter device is a concern, since in the real application the device must run without maintenance for long periods of time. With these constraints in mind, parameters such as Spreading Factor (SF) and Coding Rate (CR), different antenna heights, and distances were tuned to improve connectivity quality, measured by RSSI and loss rate. A handheld RF Explorer spectrum analyzer was used to obtain the RSSI values. Distances exceeding 200 m soon proved difficult for establishing communication due to the dense foliage and high humidity. The optimal combinations of SF-CR values were 8-5 and 9-5, showing the lowest packet loss rates, 5% and 17% respectively, with a signal strength of approximately -120 dBm; these are the best settings for this study so far. Rain and climate conditions imposed limitations on the equipment, and more tests are already being conducted. Subsequently, the range of the LoRa configuration must be extended using a mesh topology, especially because at least three different collection points in the same water body are required. Keywords: IoT, LPWAN, LoRa, coverage, loss rate, forest
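The SF and CR settings above directly determine packet airtime, a key driver of the transmitter's battery budget. The sketch below applies the standard Semtech SX1276 time-on-air formula to the two best-performing settings; bandwidth, preamble length, and payload size are assumptions, and "CR 5" is read as coding rate 4/5.

```python
# A hedged sketch of the standard LoRa time-on-air calculation (Semtech SX1276
# formula) for the SF/CR settings reported above. Bandwidth, preamble length
# and payload size are assumed values.
import math

def lora_time_on_air(sf: int, cr_denom: int, payload_bytes: int,
                     bw_hz: float = 125e3, preamble_syms: int = 8,
                     explicit_header: bool = True, crc_on: bool = True,
                     low_dr_opt: bool = False) -> float:
    """Packet airtime in seconds for one LoRa transmission."""
    t_sym = (2 ** sf) / bw_hz
    t_preamble = (preamble_syms + 4.25) * t_sym
    cr = cr_denom - 4                     # coding rate 4/5 -> 1, ..., 4/8 -> 4
    ih = 0 if explicit_header else 1
    de = 1 if low_dr_opt else 0
    crc = 1 if crc_on else 0
    num = 8 * payload_bytes - 4 * sf + 28 + 16 * crc - 20 * ih
    n_payload = 8 + max(math.ceil(num / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return t_preamble + n_payload * t_sym

# Airtime for the two best-performing settings with an assumed 20-byte payload
for sf, cr in [(8, 5), (9, 5)]:
    print(f"SF{sf} CR4/{cr}: {lora_time_on_air(sf, cr, 20) * 1e3:.1f} ms")
```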
Procedia PDF Downloads 86500 “laws Drifting Off While Artificial Intelligence Thriving” – A Comparative Study with Special Reference to Computer Science and Information Technology
Authors: Amarendar Reddy Addula
Abstract:
Definition of Artificial Intelligence: Artificial intelligence is the simulation of human intelligence processes by machines, especially computer systems. Typical applications of AI include expert systems, natural language processing, speech recognition, and machine vision. Artificial Intelligence (AI) is a new medium for digital business, according to a new report by Gartner. The last 10 years represent an advance period in AI's development, spurred by a confluence of factors, including the rise of big data, advancements in compute infrastructure, new machine learning techniques, the emergence of cloud computing, and the vibrant open-source ecosystem. AI now influences a broader set of use cases and users, and it is gaining popularity because this breadth improves AI's versatility, effectiveness, and adaptability. Edge AI will enable digital moments by employing AI for real-time analytics closer to data sources. Gartner predicts that by 2025, more than 50% of all data analysis by deep neural networks will occur at the edge, up from less than 10% in 2021. Responsible AI is an umbrella term for making suitable business and ethical choices when adopting AI. It requires considering business and societal value, risk, trust, transparency, fairness, bias mitigation, explainability, responsibility, safety, privacy, and regulatory compliance. Responsible AI is ever more significant amidst growing regulatory oversight, consumer expectations, and rising sustainability goals. Generative AI is the use of AI to generate new artifacts and produce innovative products. To date, generative AI efforts have concentrated on creating media content such as photorealistic images of people and objects, but it can also be used for code generation, creating synthetic data, and designing drugs and materials with specific properties. AI is the subject of a wide-ranging debate in which there is growing concern about its ethical and legal aspects. Frequently, the two are conflated and confused despite being different issues and areas of knowledge. The ethical debate raises two main problems: the first, conceptual, relates to the idea and content of ethics; the second, operational, concerns its relationship with the law. Both establish models of social behavior, but they are different in scope and nature. The juridical analysis is grounded in a non-formalistic scientific methodology. This means that it is essential to consider the nature and characteristics of AI as a primary step towards the description of its legal paradigm. In this regard, there are two main issues: the relationship between artificial and human intelligence, and the question of the unitary or distinct nature of AI. From that theoretical and practical base, the study of the legal system is carried out by examining its foundations, the governance model, and the regulatory bases. According to this analysis, throughout the work and in the conclusions, International Law is identified as the primary legal framework for the regulation of AI. Keywords: artificial intelligence, ethics & human rights issues, laws, international laws
Procedia PDF Downloads 94499 From Servicescape to Servicespace: Qualitative Research in a Post-Cartesian Retail Context
Authors: Chris Houliez
Abstract:
This study addresses the complex dynamics of the modern retail environment, focusing on how the ubiquitous nature of mobile communication technologies has reshaped the shopper experience and tested the limits of the conventional "servicescape" concept commonly used to describe retail experiences. The objective is to redefine the conceptualization of retail space by introducing an approach to space that aligns with a retail context where physical and digital interactions are increasingly intertwined. To offer a more shopper-centric understanding of the retail experience, this study draws from phenomenology, particularly Henri Lefebvre’s work on the production of space. The presented protocol differs from traditional methodologies by not making assumptions about what constitutes a retail space. Instead, it adopts a perspective based on Lefebvre’s seminal work, which posits that space is not a three-dimensional container commonly referred to as “servicescape” but is actively produced through shoppers’ spatial practices. This approach allows for an in-depth exploration of the retail experience by capturing the everyday spatial practices of shoppers without preconceived notions of what constitutes a retail space. The designed protocol was tested with eight participants during 209 hours of day-long field trips, immersing the researcher into the shopper's lived experience by combining multiple data collection methods, including participant observation, videography, photography, and both pre-fieldwork and post-fieldwork interviews. By giving equal importance to both locations and connections, this study unpacked various spatial practices that contribute to the production of retail space. These findings highlight the relative inadequacy of some traditional retail space conceptualizations, which often fail to capture the fluid nature of contemporary shopping experiences. The study's emphasis on the customization process, through which shoppers optimize their retail experience by producing a “fully lived retail space,” offers a more comprehensive understanding of consumer shopping behavior in the digital age. In conclusion, this research presents a significant shift in the conceptualization of retail space. By employing a phenomenological approach rooted in Lefebvre’s theory, the study provides a more efficient framework to understand the retail experience in the age of mobile communication technologies. Although this research is limited by its small sample size and the demographic profile of participants, it offers valuable insights into the spatial practices of modern shoppers and their implications for retail researchers and retailers alike.Keywords: shopper behavior, mobile telecommunication technologies, qualitative research, servicescape, servicespace
Procedia PDF Downloads 22498 The Recorded Interaction Task: A Validation Study of a New Observational Tool to Assess Mother-Infant Bonding
Authors: Hannah Edwards, Femke T. A. Buisman-Pijlman, Adrian Esterman, Craig Phillips, Sandra Orgeig, Andrea Gordon
Abstract:
Mother-infant bonding is a term that refers to the early emotional connectedness between a mother and her infant. Strong mother-infant bonding promotes higher-quality mother-infant interactions, including prolonged breastfeeding, secure attachment, and increased sensitive parenting and maternal responsiveness. Strengthening all such interactions leads to improved social behavior and emotional and cognitive development throughout childhood, adolescence, and adulthood. The positive outcomes observed following strong mother-infant bonding emphasize the need to screen new mothers for disrupted mother-infant bonding, and in turn the need for a robust, valid tool to assess mother-infant bonding. A recent scoping review conducted by the research team identified four tools to assess mother-infant bonding, all of which employed self-rating scales. Thus, whilst these tools demonstrated adequate validity and reliability, they rely on self-reported information from the mother and may therefore reflect a mother's perception of bonding with her infant rather than her actual behavior. Therefore, a new tool to assess mother-infant bonding has been developed. The Recorded Interaction Task (RIT) addresses shortcomings of previous tools by employing observational methods to assess bonding. The RIT focuses on the common mother-infant interaction of changing a nappy, at the target age of 2-6 months, which is visually recorded and later assessed. Thirteen maternal and seven infant behaviors are scored on the RIT Observation Scoring Sheet, and a final combined score of mother-infant bonding is determined. The aim of the current study was to assess the content validity and inter-rater reliability of the RIT. A panel of six experts with specialized expertise in bonding and infant behavior was consulted. Experts were provided with the RIT Observation Scoring Sheet, a visual recording of a nappy change interaction, and a feedback form. Experts scored the mother-infant interaction on the RIT Observation Scoring Sheet and completed the feedback form, which collected their opinions on the validity of each item on the RIT Observation Scoring Sheet and of the RIT as a whole. Twelve of the 20 items on the RIT Observation Scoring Sheet were scored 'Valid' by all (n=6) or most (n=5) experts. Two items received a 'Not valid' score from one expert. The remaining items received a mixture of 'Valid' and 'Potentially valid' scores. A few changes were made to the RIT Observation Scoring Sheet following expert feedback, including rewording of items for clarity and the exclusion of an item focusing on behavior deemed not relevant for the target infant age. The overall ICC for single-rater absolute agreement was 0.48 (95% CI 0.28 – 0.71). Experts' (n=6) ratings were less consistent for infant behavior (ICC 0.27 (-0.01 – 0.82)) than for mother behavior (ICC 0.55 (0.28 – 0.80)). Whilst previous tools employ self-report methods to assess mother-infant bonding, the RIT utilizes observational methods. The current study highlights adequate content validity and moderate inter-rater reliability of the RIT, supporting its use in future research. A convergent validity study comparing the RIT against an existing tool is currently being undertaken to confirm these results. Keywords: content validity, inter-rater reliability, mother-infant bonding, observational tool, recorded interaction task
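A minimal sketch, on synthetic ratings, of the ICC for single-rater absolute agreement reported above (ICC(2,1) in the Shrout and Fleiss scheme), computed from the usual two-way ANOVA mean squares. The ratings matrix below is simulated, not the study's data.

```python
# ICC(2,1): two-way random effects, absolute agreement, single rater,
# computed from ANOVA mean squares for an (items x raters) score matrix.
import numpy as np

def icc_2_1(x: np.ndarray) -> float:
    """ICC(2,1) for an (n_targets x k_raters) matrix of scores."""
    n, k = x.shape
    grand = x.mean()
    ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # targets (rows)
    ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters (columns)
    ss_e = ((x - grand) ** 2).sum() - (n - 1) * ms_r - (k - 1) * ms_c
    ms_e = ss_e / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

rng = np.random.default_rng(1)
items = rng.normal(2, 1, size=(20, 1))               # 20 scored items (synthetic)
ratings = items + rng.normal(0, 1, size=(20, 6))     # 6 experts, with rating noise
print(round(icc_2_1(ratings), 2))                    # moderate agreement expected
```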
Procedia PDF Downloads 180497 A Comparison of qCON/qNOX to the Bispectral Index as Indices of Antinociception in Surgical Patients Undergoing General Anesthesia with Laryngeal Mask Airway
Authors: Roya Yumul, Ofelia Loani Elvir-Lazo, Sevan Komshian, Ruby Wang, Jun Tang
Abstract:
BACKGROUND: An objective means of monitoring the anti-nociceptive effects of perioperative medications has long been desired, as a way to give anesthesiologists information regarding a patient's level of antinociception and to preclude untoward autonomic responses and reflexive muscular movements from painful stimuli intraoperatively. To this end, electroencephalogram (EEG) based tools such as BIS and qCON were designed to provide information about the depth of sedation, while qNOX was developed to indicate the degree of antinociception. The goal of this study was to compare the reliability of qCON/qNOX to BIS as specific indicators of response to nociceptive stimulation. METHODS: Sixty-two patients undergoing general anesthesia with a laryngeal mask airway (LMA) were included in this study. Institutional Review Board (IRB) approval was obtained, and informed consent was acquired prior to patient enrollment. Inclusion criteria were American Society of Anesthesiologists (ASA) class I-III, age 18 to 80 years, and either gender. Exclusion criteria included the inability to consent. Withdrawal criteria included conversion to an endotracheal tube and EEG malfunction. BIS and qCON/qNOX electrodes were placed simultaneously on all patients prior to induction of anesthesia and were monitored throughout the case, along with other perioperative data, including patient responses to noxious stimuli. All intraoperative decisions were made by the primary anesthesiologist without influence from qCON/qNOX. Student's t-test, prediction probability (PK), and ANOVA were used to statistically compare the relative ability of each index to detect nociceptive stimuli. Twenty patients were included in the preliminary analysis. RESULTS: A comparison of overall intraoperative BIS, qCON, and qNOX indices demonstrated no significant difference between the three measures (N=62, p>0.05). Meanwhile, index values for qNOX (62±18) were significantly higher than those for BIS (46±14) and qCON (54±19) immediately preceding patient responses to nociceptive stimulation in a preliminary analysis (N=20, p=0.0408). Notably, certain hemodynamic measurements demonstrated a significant increase in response to painful stimuli (MAP increased from 74±13 mm Hg at baseline to 84±18 mm Hg during noxious stimuli [p=0.032], and HR from 76±12 BPM at baseline to 80±13 BPM during noxious stimuli [p=0.078]). CONCLUSION: In this observational study, BIS and qCON/qNOX provided comparable information on patients' level of sedation throughout the course of an anesthetic. Meanwhile, increases in qNOX values demonstrated a superior correlation with an imminent response to stimulation relative to all other indices. Keywords: antinociception, BIS, general anesthesia, LMA, qCON/qNOX
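A minimal sketch, on synthetic index values, of the kind of paired comparison reported above: index readings taken immediately before responses to nociceptive stimulation, compared pairwise with Student's t-test. The means and spreads below mirror the reported figures, but the data are simulated and this is not the authors' exact analysis.

```python
# Paired comparison of EEG-derived indices immediately before nociceptive
# responses. All values are synthetic; only the reported means/SDs are reused.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 20                                   # events in the preliminary analysis
bis  = rng.normal(46, 14, n)             # BIS just before the response
qcon = rng.normal(54, 19, n)             # qCON just before the response
qnox = rng.normal(62, 18, n)             # qNOX just before the response

for name, other in [("BIS", bis), ("qCON", qcon)]:
    t, p = stats.ttest_rel(qnox, other)  # paired: same stimulation events
    print(f"qNOX vs {name}: t = {t:.2f}, p = {p:.3f}")
```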
Procedia PDF Downloads 137496 Downward Vertical Evacuation for Disabilities People from Tsunami Using Escape Bunker Technology
Authors: Febrian Tegar Wicaksana, Niqmatul Kurniati, Surya Nandika
Abstract:
Indonesia is one of the countries with the greatest number of disaster occurrences and threats, such as earthquakes, tsunamis, and volcanic eruptions, because it lies not only at the junction of three tectonic plates (the Eurasian, Indo-Australian, and Pacific plates) but also along the Ring of Fire. Recent research shows that areas along the southern coast of Java are potentially at risk of devastation by tsunami. A tsunami is a series of waves in a body of water caused by the displacement of a large volume of water, generally in an ocean. When the waves enter shallow water, they may rise to several feet or, in rare cases, tens of feet, striking the coast with devastating force. Reference parameters include the earthquake magnitude, the depth of the epicentre, the distance between the epicentre and land, the water depth at each point, the arrival time at the shore, and the growth of the waves. The interaction between these parameters produces large variation in the tsunami wave. Based on this, we can formulate the preparations needed for disaster mitigation strategies. Mitigation strategies play an important role in reducing the number of victims and the damage in the area. This reduction is directed particularly at those who are most difficult to mobilize in a tsunami disaster area, such as the elderly, the sick, and people with disabilities. Until now, the method used for rescuing people from a tsunami has been basic horizontal evacuation. This evacuation system is not optimal because it takes a long time and cannot easily be used by people with disabilities. The writers propose a vertical evacuation model with an escape bunker system. This system is chosen because downward vertical evacuation is considered faster and more efficient, especially in coastal areas without any surrounding highlands. The downward evacuation system is also better than upward evacuation because it avoids the risk of erosion of the ground around the structure, which can affect the building. The structure of the bunker and the evacuation process during, and even after, the disaster are the main priorities to be considered. The bunker must offer earthquake resistance, durability against water flow, suitable interaction with the surrounding ground, and a waterproof design. When the situation returns to normal, the victims can move to a safer place. The bunker will be located near hospitals and public places and will have a wide entrance with a large slide to assist people with disabilities. The escape bunker system is expected to reduce the number of low-mobility victims in a tsunami. Keywords: escape bunker, tsunami, vertical evacuation, mitigation, disaster management
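As a hedged illustration of two of the tsunami parameters listed above (water depth and wave growth), the sketch below uses the standard shallow-water wave speed c = sqrt(g*h) and Green's law for shoaling amplification; the depths and offshore wave height are hypothetical and not from the paper.

```python
# Textbook shallow-water relations, not the authors' model: phase speed
# c = sqrt(g * h) and Green's law H2 = H1 * (h1 / h2) ** 0.25 for wave growth
# as depth decreases. Depths and offshore height below are assumed values.
import math

G = 9.81  # gravitational acceleration, m/s^2

def wave_speed(depth_m: float) -> float:
    """Shallow-water phase speed in m/s."""
    return math.sqrt(G * depth_m)

def shoaled_height(h1_m: float, depth1_m: float, depth2_m: float) -> float:
    """Wave height after moving from depth1 to depth2 (Green's law)."""
    return h1_m * (depth1_m / depth2_m) ** 0.25

print(f"speed at 4000 m depth: {wave_speed(4000):.0f} m/s")     # ~198 m/s
print(f"speed at 10 m depth:   {wave_speed(10):.0f} m/s")       # ~10 m/s
print(f"1 m offshore wave near shore: {shoaled_height(1.0, 4000, 10):.1f} m")
```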
Procedia PDF Downloads 492495 Synaesthetic Metaphors in Persian: a Cognitive Corpus Based and Comparative Perspective
Authors: A. Afrashi
Abstract:
Introduction: Synaesthesia is a term denoting the perception, or the description of the perception, of one sense modality in terms of another. In literature, synaesthesia refers to a technique adopted by writers to present ideas, characters, or places in such a manner that they appeal to more than one sense, such as hearing, sight, or smell, at a given time. In everyday language, too, we find many examples of synaesthesia; we commonly hear phrases like 'loud colors', 'frozen silence', 'warm colors', and 'bitter cold'. Empirical cognitive studies have shown that synaesthetic representations, both in literature and in everyday language, are constrained, i.e., they do not map randomly among sensory domains. Since the beginning of the 20th century, synaesthesia has been a research domain in both literary studies and structural linguistics. However, the exploration of the cognitive mechanisms motivating synaesthesia has made it an important topic in 21st-century cognitive linguistics and literary studies. Synaesthetic metaphors are linguistic representations of those mental mechanisms, the study of which reveals invaluable facts about perception, cognition, and conceptualization. The main objective of the present research is to answer the following questions: What types of sense transfers occur in Persian synaesthetic metaphors? How are these types of sense transfers explained cognitively? What are the results of a cross-linguistic comparison of synaesthetic metaphors based on the existing observations? Methodology: The present research employs a cognitive, corpus-based method, and the theoretical framework adopted to analyze linguistic synaesthesia is the contemporary theory of metaphor, in which conceptual metaphor is the result of systematic mappings across cognitive domains. The Persian Language Database (PLDB) at the Institute for Humanities and Cultural Studies, which consists mainly of modern Persian prose, is searched for synaesthetic metaphors. For each metaphorical structure, the source and target domains are determined; the sense transfers are then identified and the types of synaesthetic metaphors recognized. Findings: Persian synaesthetic metaphors conform to the hierarchical distribution principle, according to which transfers tend to go from touch to taste to smell to sound to sight, and not vice versa. In other words, mapping from more accessible or basic concepts onto less accessible or less basic ones seems more natural. Furthermore, the most frequent target domain in Persian synaesthetic metaphors is sound. Certain characteristics of Persian synaesthetic metaphors are comparable with existing research on English, French, Hungarian, and Chinese synaesthetic metaphors. Conclusion: Cognitive, corpus-based approaches to linguistic synaesthesia are applicable to stylistics and literary criticism, and this recent research domain offers an efficient way to study cross-linguistic variation, for instance to find out which of the five senses is dominant cross-linguistically and cross-culturally as the target domain in metaphorical mappings, and which therefore dominates in conceptualization. Keywords: cognitive semantics, conceptual metaphor, synaesthesia, corpus based approach
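A minimal sketch of the corpus step described above: tally the source-to-target sense transfers extracted from annotated metaphors and check each against the touch-taste-smell-sound-sight hierarchy. The example pairs are toy data, not PLDB annotations.

```python
# Tally synaesthetic sense transfers and test them against the hierarchical
# distribution principle. The annotated pairs below are invented examples.
from collections import Counter

HIERARCHY = ["touch", "taste", "smell", "sound", "sight"]
RANK = {sense: i for i, sense in enumerate(HIERARCHY)}

# (source modality, target modality) pairs extracted from annotated metaphors,
# e.g. "warm colour" = touch -> sight, "bitter cold" = taste -> touch
annotated = [("touch", "sight"), ("touch", "sound"), ("taste", "sound"),
             ("smell", "sound"), ("taste", "touch")]

transfers = Counter(annotated)
for (src, tgt), count in transfers.items():
    upward = RANK[src] < RANK[tgt]       # a lower (more basic) sense maps upward
    print(f"{src} -> {tgt}: {count} ({'conforms to' if upward else 'violates'} hierarchy)")

# Most frequent target domain across the sample
targets = Counter(tgt for _, tgt in annotated)
print("dominant target:", targets.most_common(1)[0][0])
```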
Procedia PDF Downloads 562494 Toxicity and Biodegradability of Veterinary Antibiotic Tiamulin
Authors: Gabriela Kalcikova, Igor Bosevski, Ula Rozman, Andreja Zgajnar Gotvajn
Abstract:
Antibiotics are extensively used in human medicine and also in animal husbandry to prevent or control infections. Recently, a lot of attention has been paid to veterinary antibiotics, because their global consumption is increasing and is expected to reach 106,600 tons in 2030. Most veterinary antibiotics are introduced into the environment via animal manure, which is used as fertilizer. One such veterinary antibiotic is tiamulin. It is used in the form of a fumarate salt for the treatment of pigs and poultry, for the prophylaxis and treatment of dysentery, pneumonia, and mycoplasmal infections, but its environmental impact is practically unknown. Tiamulin has been found to be very persistent in animal manure, and it is thus expected that it can be transported into the aquatic environment during rainfall and affect various organisms. To assess its environmental impact, it is necessary to evaluate its biodegradability and its toxicity to organisms from different levels of the food chain. The aim of our study was therefore to evaluate the ready biodegradability and toxicity of tiamulin fumarate to various organisms. The bioassays used included the luminescent bacterium Vibrio fischeri, heterotrophic and nitrifying microorganisms of activated sludge, the water flea Daphnia magna, and the duckweed Lemna minor. For each species, EC₅₀ values were calculated. A biodegradability test was used to determine ready biodegradability; it provides information about the biodegradability of tiamulin under the most common environmental conditions. The results of our study showed that tiamulin affects the selected organisms differently. The most sensitive organisms were water fleas, with 48hEC₅₀ = 14.2 ± 4.8 mg/L, and duckweed, with 168hEC₅₀ = 22.6 ± 0.8 mg/L. Higher concentrations of tiamulin (from 10 mg/L) significantly affected the photosynthetic pigment content in duckweed, and concentrations above 80 mg/L caused visible chlorosis. This is in agreement with previous studies showing a significant effect of tiamulin on green algae and cyanobacteria. Tiamulin had a low effect on microorganisms: the lowest toxicity was observed for heterotrophic microorganisms (30minEC₅₀ = 1656 ± 296 mg/L), followed by Vibrio fischeri (30minEC₅₀ = 492 ± 21 mg/L), while the most sensitive were the nitrifying microorganisms (30minEC₅₀ = 183 ± 127 mg/L). The reason is most probably tiamulin's mode of action, which is effective against gram-positive bacteria, while gram-negative bacteria (e.g., Vibrio fischeri) are more tolerant. Biodegradation of tiamulin was very slow, with a long lag phase of 20 days. Maximal degradation reached 40 ± 2% after 43 days of the test; like other antibiotics (e.g., ciprofloxacin), tiamulin is not readily biodegradable. Tiamulin is a widely used antibiotic in veterinary medicine and is thus present in the environment. According to our results, tiamulin can have a negative effect on water fleas and duckweed, but the effective concentrations are several orders of magnitude higher than those found in any environmental compartment. Tiamulin has low toxicity to the tested microorganisms, but it is poorly biodegradable and thus possibly persistent in the environment. Keywords: antibiotics, biodegradability, tiamulin, toxicity
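A minimal sketch of how an EC₅₀ such as the 48hEC₅₀ for Daphnia magna can be estimated: fit a log-logistic (Hill) dose-response curve to the fraction of organisms affected at each test concentration. The concentrations and response fractions below are synthetic, not the study's measurements.

```python
# Fit a two-parameter log-logistic (Hill) curve to synthetic dose-response
# data and report the fitted EC50. Test concentrations and responses are assumed.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, ec50, slope):
    """Fraction of organisms affected at a given concentration."""
    return 1.0 / (1.0 + (ec50 / conc) ** slope)

conc = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])        # mg/L test concentrations
affected = np.array([0.05, 0.15, 0.35, 0.65, 0.90, 0.98])  # observed fractions

popt, _ = curve_fit(hill, conc, affected, p0=[15.0, 1.5])
ec50, slope = popt
print(f"estimated EC50 = {ec50:.1f} mg/L (Hill slope {slope:.1f})")
```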
Procedia PDF Downloads 186