Search results for: visual methodologies
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2860


280 Emerging Identities: A Transformative ‘Green Zone’

Authors: Alessandra Swiny, Yiorgos Hadjichristou

Abstract:

There is an ongoing geographical scar dividing the island of Cyprus and its capital, Nicosia. The currently amputated city center is accessed legally only by United Nations convoys and is infiltrated by Turkish and Greek Cypriot army scouts, illegal traders, and scavengers. On Christmas Day 1963 in Nicosia, Captain M. Hobden of the British Army took a green chinagraph pencil and ‘marked’ the division on a large-scale joint Army-RAF map. From then on, this ‘buffer zone’ was called the ‘green line.' This dividing form, which once separated the main communities of Greek and Turkish Cypriots from one another, has now been fully reclaimed by an autonomous intruder: its most captivating inhabitant today is nature. She keeps taking over; for the past fifty years, indigenous and introduced fauna and flora have thrived. Trees emerge from rooftops, and plants, bushes, and flowers grow randomly through the once bustling market streets, allowing this ‘no man’s land’ to teem with wildlife. And where are its limits? The idea of fluidity is ever present; the zone encroaches into the urban and built environment that surrounds it, and notions of ownership and permanence are questioned. Its qualities have contributed significantly to the search for new ‘identities,' expressed in the emergence of new living conditions, be they real or surreal. Without being physically reachable, it can be glimpsed through punctured peepholes and military bunker windows that act as enticing portals into an emotional and conceptual level of inhabitation. The zone is mystical and simultaneously suspended in time; it triggers people’s imagination, not just that of the two prevailing communities but also of immigrants, refugees, and visitors; it mesmerizes all who come within its proximity. The paper opens a discussion on the issues and the binary questions raised. What is natural and artificial; what is private and public; what is ephemeral and permanent?
The ‘green line’ exists in a central fringe condition and can serve to mix generations and groups of people; to mingle the functions of living, work, and social interaction; and to merge nature and the human being in a new-found synergy of human hope and survival, thus allowing new notions of place to be introduced. Questions arise, such as, “Is the impossibility of dwelling made possible by interweaving these ‘in-between conditions’ into eloquently traced spaces?” The methodologies pursued are developed through academic research, professional practice projects, and students’ research/design work. Realized projects, case studies, and other examples cited both nationally and internationally hold global and local applications. Both paths of the research deal with an explorative understanding of the impossibility of dwelling, testing the limits of its autonomy. The expected outcome of the experience evokes in the user a sense of a new urban landscape, created from human topographies that echo the voice of an emerging identity.

Keywords: urban wildlife, human topographies, buffer zone, no man’s land

Procedia PDF Downloads 171
279 Immersive Environment as an Occupant-Centric Tool for Architecture Criticism and Architectural Education

Authors: Golnoush Rostami, Farzam Kharvari

Abstract:

In recent years, developments in the field of architectural education have resulted in a shift from conventional teaching methods to alternative, state-of-the-art teaching methods and strategies. Criticism in architecture has been a key player both in the profession and in education, but it has mostly been offered by renowned individuals. Hence, not only students and other professionals but even critics themselves may not have the opportunity to experience buildings first-hand and must rely on available 2D materials, such as images and plans, which may not produce a holistic understanding and evaluation of buildings. Immersive environments, on the other hand, give students and professionals the opportunity to experience buildings virtually and to base their evaluation on experience rather than judging from 2D materials. Therefore, the aim of this study is to compare the effects of experiencing buildings in immersive environments versus through 2D materials, including images and plans, on architecture criticism and architectural education. Three buildings with parametric brick facades were studied through 2D materials and in Unreal Engine v. 24 as an immersive environment by 22 architecture students, who were selected using convenience sampling and divided into two equal groups using simple random sampling. The study used mixed methods, both quantitative and qualitative: the quantitative section was carried out with a questionnaire, and in-depth interviews were used for the qualitative section. The questionnaire measured three constructs: privacy regulation based on Altman’s theory, the sufficiency of illuminance levels in the building, and the visual status of the view (visually appealing views versus obstructions that may be caused by facades). Furthermore, participants had the opportunity to express their understanding and evaluation of the buildings in individual interviews.
The collected questionnaire data were analyzed using independent t-tests and descriptive analyses in IBM SPSS Statistics v. 26, and the interviews were analyzed using content analysis. The interviews showed that participants who experienced the buildings in the immersive environment made a more thorough and precise evaluation of the buildings than those who studied them through 2D materials. Moreover, analysis of the questionnaires revealed statistically significant differences between the two groups on the measured constructs. The outcome of this study suggests that integrating immersive environments into the profession and into architectural education as an effective and efficient tool for architecture criticism is vital, since these environments allow users to make the holistic evaluation of buildings needed for rigorous and sound criticism.
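The group comparison described above rests on an independent (two-sample) t-test of construct scores. A minimal sketch of that test follows, using Welch's formulation and entirely hypothetical Likert-scale scores (the study itself ran the analysis in SPSS, and its actual data are not reproduced here):

```python
import math

def independent_t_test(group_a, group_b):
    """Welch's two-sample t-test: returns the t statistic and the
    Welch-Satterthwaite degrees of freedom for two independent groups."""
    n1, n2 = len(group_a), len(group_b)
    m1, m2 = sum(group_a) / n1, sum(group_b) / n2
    # Unbiased sample variances
    v1 = sum((x - m1) ** 2 for x in group_a) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group_b) / (n2 - 1)
    se = math.sqrt(v1 / n1 + v2 / n2)
    t = (m1 - m2) / se
    df = (v1 / n1 + v2 / n2) ** 2 / (
        (v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1)
    )
    return t, df

# Hypothetical privacy-regulation scores (1-5 scale) for the two groups
immersive = [4.2, 4.5, 3.9, 4.8, 4.1, 4.6, 4.3, 4.0, 4.4, 4.7, 4.2]
drawings = [3.1, 3.4, 2.9, 3.6, 3.2, 3.0, 3.5, 3.3, 2.8, 3.7, 3.1]
t, df = independent_t_test(immersive, drawings)
print(f"t = {t:.2f}, df = {df:.1f}")
```

A large positive t on such data would indicate the immersive-environment group scored the construct higher; the study reports only that the differences were statistically significant.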

Keywords: immersive environments, architecture criticism, architectural education, occupant-centric evaluation, pre-occupancy evaluation

Procedia PDF Downloads 109
278 Root Cause Analysis of a Catastrophically Failed Output Pin Bush Coupling of a Raw Material Conveyor Belt

Authors: Kaushal Kishore, Suman Mukhopadhyay, Susovan Das, Manashi Adhikary, Sandip Bhattacharyya

Abstract:

In integrated steel plants, conveyor belts are widely used for transferring raw materials from one location to another. An output pin bush coupling attached to a conveyor transferring iron ore fines and fluxes failed after two years of service, leading to an operational delay of approximately 15 hours. This study focuses on the failure analysis of the coupling and recommends counter-measures to prevent such failures in the future. The investigation consisted of careful visual observation, checking of operating parameters, stress calculation and analysis, macro- and micro-fractography, material characterization (chemical and metallurgical analysis), and tensile and impact testing. The fracture originated from an unusually sharp double step, and multiple corrosion pits near the step aggravated the situation. The inner contact surface of the coupling revealed differential abrasion that created a macroscopic difference in the height of the component, pointing towards misalignment of the coupling beyond a threshold limit. In addition to these design and installation issues, the material of the coupling did not meet quality standards: it was grey cast iron with a graphite morphology intermediate between random distribution (Type A) and rosette pattern (Type B), manifesting as a marked reduction in the impact toughness and tensile strength of the component. These findings corroborated the brittle mode of fracture, which might have occurred under minor impact loading while the conveyor belt was being loaded with raw materials from a height. A simulated study was conducted to examine the effect of corrosion pits on the tensile strength and impact toughness of grey cast iron. It was observed that pitting marginally reduced tensile strength and ductility; however, there was a marked (up to 45%) reduction in impact toughness due to pitting.
Thus, it became evident that the coupling failed due to a combination of factors: inferior material, misalignment, poor step design, and corrosion pitting. Recommendations for life enhancement of the coupling included use of the tougher SG 500/7 grade, incorporation of a proper fillet radius at the step, correction of the alignment, and application of a corrosion-resistant organic coating to prevent pitting.

Keywords: brittle fracture, cast iron, coupling, double step, pitting, simulated impact tests

Procedia PDF Downloads 111
277 Death Due to Ulnar Artery Injury by Glassdoor: A Case Report

Authors: Ashok Kumar Rastogi

Abstract:

Glass is a material commonly used for doors, bottles, cookware, and containers. It can be harmful, as it is a hard and blunt material; broken glass has been associated with severe injury and is a common cause of injuries warranting visits to the hospital emergency department (ED). These injuries can be accidental or intentionally inflicted, and they can be severe, even deadly. Case history: The dead body of a 20-year-old male was found beside the road; the police were informed, and a video recording was seized during the investigation. In the recording, the person was in a drunken state (unable to walk and disoriented), wandering along a residential road. He came upon a barber shop whose door was made of glass and suddenly hit the glass door forcefully with his right hand. The door broke into multiple pieces, and multiple injuries were seen over the right hand. Observations: Multiple small and large lacerations were seen over the anterior part of the right elbow. The main injury resembled an incised wound caused by a hard and sharp object. It was noted as a bone-deep laceration of size 13 x 06 cm, placed obliquely over the anteromedial aspect of the right elbow joint, its medial end at the medial end of the elbow joint and its anterior end 04 cm below the joint, with laceration of the underlying brachialis muscle and complete transection of the ulnar artery and vein; the skin margins looked sharply cut, with irregular margins and tiny cuts at the medial lower border of the laceration. The injuries were antemortem and fresh, caused by a hard and blunt object but resembling injuries from a hard and sharp object. All organs were found pale, and the cause of death was shock and hemorrhage due to the ulnar vessel injury. Conclusion: The findings of this case report highlight the potentially lethal consequences of glass injuries, especially those involving glass doors.
The study underscores the importance of accurate interpretation and identification of wounds caused by glass, as they may resemble injuries caused by other objects. It emphasizes the challenges faced by autopsy surgeons when determining the cause and manner of death in cases where visual evidence of injury is absent or the weapon is not recovered. Ultimately, this case report serves as a reminder of the potential dangers posed by glass and of the importance of comprehensive forensic examinations.

Keywords: glassdoor, incised, wound, laceration, autopsy

Procedia PDF Downloads 59
276 Development and Application of an Intelligent Masonry Modulation in BIM Tools: Literature Review

Authors: Sara A. Ben Lashihar

Abstract:

Heritage building information modelling (HBIM) of historical masonry buildings has expanded lately to meet urgent needs for conservation and structural analysis. Masonry structures are unique features of ancient architecture worldwide, with special cultural, spiritual, and historical significance. However, there is a research gap regarding the reliability of the HBIM modeling process for these structures, which faces significant challenges due to the inherent complexity and uniqueness of their structural systems. Most of these processes are based on tracing point clouds and rarely draw on documents, archival records, or direct observation. The results of these techniques are highly abstracted models whose accuracy does not exceed LOD 200. Masonry assemblages, especially curved elements such as arches, vaults, and domes, are generally modeled with standard BIM components or in-place models, and brick textures are input graphically. Hence, further investigation is necessary to establish a methodology for automatically generating parametric masonry components, developed algorithmically according to mathematical and geometric accuracy and the validity of the survey data. The main aim of this paper is to provide a comprehensive review of the state of the art of existing research on HBIM modeling of masonry structural elements and of the latest approaches to achieving parametric models with both visual fidelity and high geometric accuracy. The review covered more than 800 articles, proceedings papers, and book chapters matching the keywords "HBIM and Masonry" from 2017 to 2021. The studies were downloaded from well-known, trusted bibliographic databases such as Web of Science, Scopus, Dimensions, and Lens. As a starting point, a scientometric analysis was carried out using VOSviewer software.
This software extracts the main keywords in these studies to retrieve relevant works and calculates the strength of the relationships between the keywords. Subsequently, an in-depth qualitative review was carried out of the studies with the highest frequency of occurrence and the strongest links to the topic, according to the VOSviewer results, focusing on the latest approaches and the future directions proposed in these studies. The findings of this paper can serve as a valuable reference for researchers and BIM specialists seeking to make more accurate and reliable HBIM models of historic masonry buildings.
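The link strengths VOSviewer computes come from keyword co-occurrence: each pair of keywords appearing in the same bibliographic record strengthens the pair's link. A minimal sketch of that counting, on hypothetical keyword records (not the review's actual corpus):

```python
from collections import Counter
from itertools import combinations

def cooccurrence_strength(documents):
    """Count pairwise keyword co-occurrences across records: each record
    in which two keywords appear together adds one unit of link strength,
    a simplified form of VOSviewer's 'total link strength'."""
    links = Counter()
    for keywords in documents:
        # Sort so each unordered pair has one canonical key
        for a, b in combinations(sorted(set(keywords)), 2):
            links[(a, b)] += 1
    return links

# Hypothetical keyword sets from a bibliographic export
records = [
    {"HBIM", "masonry", "point cloud"},
    {"HBIM", "masonry", "parametric modeling"},
    {"HBIM", "point cloud", "laser scanning"},
]
links = cooccurrence_strength(records)
print(links[("HBIM", "masonry")])  # → 2
```

Ranking pairs by this count is one way to pick the "strongest links with the topic" that the qualitative review then examines.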

Keywords: HBIM, masonry, structure, modeling, automatic, approach, parametric

Procedia PDF Downloads 144
275 Bio-Hub Ecosystems: Investment Risk Analysis Using Monte Carlo Techno-Economic Analysis

Authors: Kimberly Samaha

Abstract:

In order to attract new types of investors into the emerging Bio-Economy, new methodologies to analyze investment risk are needed. The Bio-Hub Ecosystem model was developed to address a critical area of concern within the global energy market regarding the use of biomass as a feedstock for power plants. This study looked at repurposing existing biomass-energy plants into Circular Zero-Waste Bio-Hub Ecosystems. The Bio-Hub model first targets a ‘whole-tree’ approach and then examines the circular economics of co-hosting diverse industries (wood processing, aquaculture, agriculture) in the vicinity of biomass power plant facilities. The study modeled the economics and risk strategies of cradle-to-cradle linkages to incorporate value-chain effects on capital/operational expenditures and investment risk reduction, using a proprietary techno-economic model that incorporates investment risk scenarios via the Monte Carlo methodology. The study calculated the sequential increases in profitability for each additional co-host at an operating forestry-based biomass energy plant in West Enfield, Maine. Phase I starts with a baseline of forestry biomass to electricity only and is built up in stages to include a greenhouse and a land-based shrimp farm as co-hosts, incorporating CO2 and heat waste streams from the operating power plant in an analysis of lowering and stabilizing the operating costs of the agriculture and aquaculture co-hosts. Phase II incorporated a jet-fuel biorefinery and its secondary slip-stream of biochar, which would be developed into two additional bio-products: 1) a soil-amendment compost for agriculture and 2) a biochar effluent filter for the aquaculture. The second part of the study applied the Monte Carlo risk methodology to illustrate how co-location de-risks investment in an integrated Bio-Hub versus individual investments in stand-alone energy, agriculture, or aquaculture projects.
The analyzed scenarios compared reductions in both capital and operating expenditures, which stabilize profits and reduce the investment risk associated with projects in energy, agriculture, and aquaculture. The major findings of this techno-economic modeling using the Monte Carlo technique resulted in the masterplan for the first Bio-Hub, to be built in West Enfield, Maine. In 2018, the site was designated an economic opportunity zone under a federal program, which allows capital gains tax benefits for investments on the site. Bioenergy facilities are currently at a critical juncture: they have an opportunity to be repurposed into efficient, profitable, and socially responsible investments, or to be idled and scrapped. The Bio-Hub Ecosystems techno-economic analysis model is a critical tool to expedite new standards for investments in circular zero-waste projects. Profitable projects will expedite adoption and advance the critical transition from the current ‘take-make-dispose’ paradigm inherent in the energy, forestry, and food industries to a more sustainable Bio-Economy paradigm that supports local and rural communities.
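The de-risking argument above can be illustrated with a toy Monte Carlo run: if co-location's shared heat/CO2 streams are modeled as a fixed operating-cost saving, the distribution of margin outcomes shifts upward and the probability of a loss falls. The model here and all figures are hypothetical stand-ins, not the study's proprietary model or the West Enfield data:

```python
import random

random.seed(42)  # reproducible draws for the sketch

def simulate_margin(base_margin, volatility, co_host_savings=0.0, n=10_000):
    """Monte Carlo sketch: draw annual operating margins from a normal
    distribution; co-location is modeled as a fixed cost saving that
    shifts the distribution upward. Returns (mean margin, P(loss))."""
    draws = [random.gauss(base_margin + co_host_savings, volatility)
             for _ in range(n)]
    mean = sum(draws) / n
    prob_loss = sum(1 for d in draws if d < 0) / n
    return mean, prob_loss

stand_alone = simulate_margin(base_margin=1.0, volatility=2.0)
integrated = simulate_margin(base_margin=1.0, volatility=2.0, co_host_savings=1.5)
print(f"stand-alone: mean={stand_alone[0]:.2f}, P(loss)={stand_alone[1]:.1%}")
print(f"integrated : mean={integrated[0]:.2f}, P(loss)={integrated[1]:.1%}")
```

The point of the sketch is qualitative: under these assumptions the integrated scenario shows both a higher expected margin and a lower probability of loss, which is the shape of result the study reports for the Bio-Hub versus stand-alone projects.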

Keywords: bio-economy, investment risk, circular design, economic modelling

Procedia PDF Downloads 89
274 The Coexistence of Creativity and Information in Convergence Journalism: Pakistan's Evolving Media Landscape

Authors: Misha Mirza

Abstract:

In recent years, the definition of journalism in Pakistan has changed, and so have people's mindsets and their approach to a news story. For the audience, news has become more interesting than a drama or a film. This research provides an insight into Pakistan’s evolving media landscape. It not only brings forth the outcomes of cross-platform cooperation between print and broadcast journalism but also examines the interactive data visualization techniques being used. Storytelling in journalism in Pakistan has evolved from depicting merely the truth to tweaking, fabricating, and producing docu-dramas. The study looks into how news is translated into a visual. Pakistan possesses a diverse cultural heritage, and by engaging audiences through media, this history translates into today's storytelling platforms. The paper explains how journalists are thriving in a converging media environment and provides an analysis of the narratives in television talk shows today. ‘Jack of all, master of none’ is being challenged by journalists today: one has to be a quality information gatherer and an effective storyteller at the same time. Are journalists really looking more at what sells than at what matters? Express Tribune is a very popular news platform among the youth: not only is its newspaper more attractive than its competitors', but its style of narrative and interactive web stories also lead to well-rounded news. Interviews are used as the basic methodology to gain insight into how data visualization is approached. The quest to distinguish the visualization of information from the visualization of knowledge led the author to the work of David McCandless in his book ‘Knowledge is Beautiful’. Journalism in Pakistan has evolved from information to a combination of knowledge, infotainment, and comedy. What is criticized most by society most often becomes the breaking news.
Circulation in today’s world is carried out through cultural and social networks. In recent times, we have seen many examples of people gaining overnight popularity by releasing songs with substandard lyrics or senseless videos, perhaps because creativity has taken over from information. This paper thus discusses the various platforms of convergence journalism from Pakistan’s perspective. The study concludes by showing how Pakistani truck art pop culture coexists with all the platforms of convergent journalism. The changing media landscape thus challenges the basic rules of journalism. The slapstick humor and ‘jhatka’ in Pakistani talk shows have evolved from Pakistani truck art poetry. Mobile journalism has overtaken all other mediums of journalism; however, Pakistani culture coexists with the converging landscape.

Keywords: convergence journalism in Pakistan, data visualization, interactive narrative in Pakistani news, mobile journalism, Pakistan's truck art culture

Procedia PDF Downloads 264
273 Method of Complex Estimation of Text Perusal and Indicators of Reading Quality in Different Types of Commercials

Authors: Victor N. Anisimov, Lyubov A. Boyko, Yazgul R. Almukhametova, Natalia V. Galkina, Alexander V. Latanov

Abstract:

Modern commercials presented on billboards, on TV, and on the Internet contain a great deal of information about the product or service in text form. However, this information cannot always be perceived and understood by consumers. Typical sociological focus-group studies often cannot reveal important features of how information read in text messages is interpreted and understood. In addition, there is no reliable method to determine the degree to which the information contained in a text is understood: the mere fact that a text was viewed does not mean that the consumer perceived and understood its meaning. At the same time, tools based on marketing analysis allow only an indirect estimate of the process of reading and understanding a text. Therefore, the aim of this work is to develop a valid method of recording objective indicators in real time for assessing the fact of reading and the degree of text comprehension. Psychophysiological parameters recorded during reading can form the basis for such an objective method. We studied the relationship between multimodal psychophysiological parameters and text comprehension during reading using correlation analysis. We used eye-tracking to record eye-movement parameters as an estimate of visual attention, electroencephalography (EEG) to assess cognitive load, and polygraphic indicators (skin-galvanic reaction, SGR) reflecting the emotional state of the respondent during reading. We revealed reliable interrelations between perceiving the information and the dynamics of psychophysiological parameters while reading the text in commercials. Eye-movement parameters reflected the difficulties respondents experienced in perceiving ambiguous parts of the text. EEG dynamics in the alpha band were related to the cumulative effect of cognitive load. SGR dynamics were related to the emotional state of the respondent and to the meaning of the text and the type of commercial.
Together, the EEG and polygraph parameters also reflected the mental difficulties of respondents in understanding the text and showed significant differences between cases of low and high text comprehension. We also revealed differences in psychophysiological parameters across different types of commercials (static vs. video; financial vs. cinema vs. pharmaceutics vs. mobile communication, etc.). Conclusions: Our methodology allows a multimodal evaluation of text perusal and of the quality of text reading in commercials. In general, our results indicate the possibility of designing an integral model that estimates comprehension of a commercial's text on a percentage scale based on all the markers noted.
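The correlation analysis at the core of this study pairs a psychophysiological parameter with a comprehension measure per respondent. A minimal sketch using Pearson's r on hypothetical data follows; the variable pairing (mean fixation duration vs. comprehension score) and all values are illustrative, not the study's recordings:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-respondent values: mean fixation duration (ms) and
# comprehension score (% correct on follow-up questions)
fixation_ms = [210, 250, 330, 190, 280, 360, 240, 310]
comprehension = [88, 75, 52, 90, 70, 45, 78, 60]
r = pearson_r(fixation_ms, comprehension)
print(f"r = {r:.2f}")  # strongly negative in this toy data
```

In a real analysis each modality (eye movements, alpha-band EEG, SGR) would be correlated with comprehension in this way, and the set of reliable correlations would feed the integral percentage-scale model the authors propose.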

Keywords: reading, commercials, eye movements, EEG, polygraphic indicators

Procedia PDF Downloads 145
272 Work Related Musculoskeletal Disorder: A Case Study of Office Computer Users in Nigerian Content Development and Monitoring Board, Yenagoa, Bayelsa State, Nigeria

Authors: Tamadu Perry Egedegu

Abstract:

Rapid growth in the use of electronic data has affected both employees and the workplace. Our experience shows that jobs with multiple risk factors have a greater likelihood of causing Work-Related Musculoskeletal Disorders (WRMSDs), depending on the duration, frequency, and/or magnitude of exposure to each; it is therefore important that ergonomic risk factors be considered in light of their combined effect in causing or contributing to WRMSDs. Awkward posture and long hours in front of visual display terminals can result in WRMSDs. This study investigated musculoskeletal disorders among office workers and contributes to awareness of the causes and consequences of WRMSDs arising from a lack of ergonomics training. The study was conducted using an observational cross-sectional design. A sample of 109 respondents was drawn from the target population through purposive sampling. Data sources were both primary and secondary: primary data were collected through questionnaires, and secondary data were sourced from journals, textbooks, and internet materials. Questionnaires were the main instrument for data collection and were designed in a YES/NO format according to the study objectives. Content validity approval was used to ensure that the variables were adequately covered, and the reliability of the instrument was established through the test-retest method, yielding a reliability index of 0.84. The data collected from the field were analyzed with descriptive statistics (charts, percentages, and means). The study found that the most affected body regions were the upper back, followed by the lower back, neck, wrist, shoulder, and eyes, while the least affected were the knee, calf, and ankle.
Furthermore, the prevalence of work-related musculoskeletal disorders was linked with long working hours (6-8 hrs per day), lack of back support on seats, glare on the monitor, inadequate regular breaks, and repetitive motion of the upper limbs and wrist when using the computer. Finally, based on these findings, recommendations were made to reduce the prevalence of WRMSDs among office workers.
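The descriptive analysis behind the body-region ranking reduces to tallying YES answers per region and expressing them as percentages of respondents. A small sketch on hypothetical YES/NO responses (four made-up respondents, not the study's 109):

```python
from collections import Counter

def prevalence_by_region(responses):
    """Percentage of respondents reporting discomfort in each body
    region, from per-respondent YES/NO (True/False) questionnaire items."""
    n = len(responses)
    counts = Counter()
    for answer in responses:
        for region, has_pain in answer.items():
            if has_pain:
                counts[region] += 1
    return {region: 100 * c / n for region, c in counts.items()}

# Hypothetical responses from four office workers
sample = [
    {"upper back": True, "lower back": True, "neck": False, "ankle": False},
    {"upper back": True, "lower back": False, "neck": True, "ankle": False},
    {"upper back": True, "lower back": True, "neck": True, "ankle": False},
    {"upper back": False, "lower back": True, "neck": False, "ankle": True},
]
for region, pct in sorted(prevalence_by_region(sample).items(),
                          key=lambda kv: -kv[1]):
    print(f"{region}: {pct:.0f}%")
```

Sorting regions by this percentage yields exactly the kind of most-to-least-affected ranking the study reports.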

Keywords: work related musculoskeletal disorder, Nigeria, office computer users, ergonomic risk factor

Procedia PDF Downloads 219
271 Photovoltaic Modules Fault Diagnosis Using Low-Cost Integrated Sensors

Authors: Marjila Burhanzoi, Kenta Onohara, Tomoaki Ikegami

Abstract:

Faults in photovoltaic (PV) modules should be detected as early and as fully as possible. Conventional fault-detection methods such as electrical characterization, visual inspection, infrared (IR) imaging, ultraviolet fluorescence, and electroluminescence (EL) imaging are used for this, but they either fail to detect the location or category of the fault, or they require expensive equipment and are inconvenient for on-site application. Hence, these methods are not well suited to monitoring small-scale PV systems, and low-cost, efficient inspection techniques that can be applied on site are indispensable. In this study, to establish such a technique, the correlation between faults and the magnetic flux density on the surface of crystalline PV modules is investigated. Magnetic flux on the surface of normal and faulted PV modules is measured under short-circuit and illuminated conditions using two different sensor devices. One device is made of small integrated sensors, namely a 9-axis motion-tracking sensor with an embedded 3-axis electronic compass, an IR temperature sensor, an optical laser position sensor, and a microcontroller. This device measures the X, Y, and Z components of the magnetic flux density (Bx, By, and Bz) a few mm above the surface of a PV module and outputs the data as line graphs in a LabVIEW program. The second device is made of a laser optical sensor and two magnetic line-sensor modules consisting of 16 magnetic sensors. This device scans the magnetic field on the surface of the PV module and outputs the data as a 3D surface plot of the magnetic flux intensity in a LabVIEW program. A PC equipped with LabVIEW software is used for data acquisition and analysis for both devices. To show the effectiveness of the method, measured results are compared to those of a normal reference module and to their EL images.
Through the experiments it was confirmed that the magnetic field in the faulted areas has distinct profiles that can be clearly identified in the measured plots. The measurement results showed a perfect correlation with the EL images, and using the position sensors, the exact location of faults was identified. The method was applied to different modules, and various faults were detected with it. The proposed method offers on-site measurement and real-time diagnosis, and since the device is made from simple sensors, it is low cost and convenient for owners of small-scale or residential PV systems.
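The fault-localization step described above amounts to comparing a scanned flux profile against that of a normal reference module and flagging positions where the deviation is large. A minimal sketch follows; the readings, units, and threshold are all hypothetical (the actual devices output LabVIEW line graphs and 3D surface plots, not arrays like these):

```python
def locate_fault(reference_profile, measured_profile, threshold=0.2):
    """Flag sensor positions where the measured flux deviates from the
    normal reference module by more than `threshold` (as a fraction of
    the reference value)."""
    faults = []
    for i, (ref, meas) in enumerate(zip(reference_profile, measured_profile)):
        if ref and abs(meas - ref) / abs(ref) > threshold:
            faults.append(i)
    return faults

# Hypothetical Bz readings (uT) along one scan line; the faulted module
# deviates sharply around sensor positions 3-4
reference = [5.0, 5.1, 5.0, 4.9, 5.0, 5.1, 5.0]
measured = [5.0, 5.2, 4.9, 2.1, 7.8, 5.0, 5.1]
print(locate_fault(reference, measured))  # → [3, 4]
```

Combined with the laser position sensor's coordinates, flagged indices like these would map directly to physical locations on the module surface.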

Keywords: fault diagnosis, fault location, integrated sensors, PV modules

Procedia PDF Downloads 204
270 Achieving the Status of Total Sanitation in the Rural Nepalese Context: A Case Study from Amarapuri, Nepal

Authors: Ram Chandra Sah

Abstract:

A few years ago, Nepal, a naturally very beautiful country, faced serious problems related to the practice of open defecation (having no toilet) by almost 98% of the country's people. Now the scenario has changed. The Government of Nepal set the target of achieving basic-level sanitation (toilet) facilities by 2017 AD, for which the Sanitation and Hygiene Master Plan (SHMP) was introduced in 2011 AD. Its major strengths are institutional set-up formation; local formal-authority leadership; locally formulated strategic plans; a partnership-based, harmonized, and coordinated approach to working; no blanket subsidy or support; and community and local institution/organization mobilization approaches. The Open Defecation Free (ODF) movement in the country is now in full swing. The SHMP clearly defines Total Sanitation, which is considered achieved when all the households within the relevant boundary meet six indicators: access to and regular use of toilet(s); regular use of soap and water at critical moments; regular food hygiene behavior; regular water hygiene behavior, including household-level purification of locally available drinking water; maintenance of regular personal hygiene with household-level waste management; and an overall clean environment at the concerned boundary level. Nepal has 3158 Village Development Committees (VDCs) in its rural areas; Amarapuri VDC was selected for the purpose of achieving Total Sanitation.
Based on the SHMP, different methodologies were adopted to achieve 100% coverage of each indicator: updating the Village Water Sanitation and Hygiene Coordination Committee (V-WASH-CC); forming a Total Sanitation team including one volunteer for each indicator; campaigning through settlement meetings; a midterm evaluation, which revealed the need for 45 additional ward-level volunteers (5 for each of the 9 wards); ward-wise awareness creation with the help of the volunteers; informative notice boards and hoarding boards with related messages at important locations; management of separate waste-disposal rings for decomposable and non-decomposable wastes; dissemination of related messages through different types of local cultural programs; community-level construction and management of public toilets; mobilization of local schools, offices, and health posts; and rewards and recognition for contributors. The VDC was in a very poor situation in 2010, with just 50, 30, 60, 60, 40, and 30 percent coverage of the respective indicators, yet it became the first VDC in the country to be declared as having achieved Total Sanitation. The expected result of 100 percent coverage of all the indicators was achieved in 2 years, 10 months, and 19 days. The experiences of Amarapuri were replicated successfully in different parts of the country, and many VDCs have since been declared as having achieved Total Sanitation. The community-mobilized Total Sanitation movement in Nepal has thus greatly supported the achievement of Total Sanitation across the country at minimal cost, and it is believed that the approach can be very useful for other developing or under-developed countries of the world.

Keywords: community mobilized, open defecation free, sanitation and hygiene master plan, total sanitation

Procedia PDF Downloads 181
269 Developing Primary Care Datasets for a National Asthma Audit

Authors: Rachael Andrews, Viktoria McMillan, Shuaib Nasser, Christopher M. Roberts

Abstract:

Background and objective: The National Review of Asthma Deaths (NRAD) found that asthma management and care was inadequate in 26% of cases reviewed. Major shortfalls identified were in adherence to national guidelines and standards and, particularly, in the organisation of care, including supervision and monitoring in primary care, with 70% of cases reviewed having at least one avoidable factor in this area. 5.4 million people in the UK are diagnosed with and actively treated for asthma, and approximately 60,000 are admitted to hospital with acute exacerbations each year. The majority of people with asthma receive management and treatment solely in primary care. This has therefore created concern that many people within the UK are receiving sub-optimal asthma care, resulting in unnecessary morbidity and risk of adverse outcome. NRAD concluded that a national asthma audit programme should be established to measure and improve processes, organisation, and outcomes of asthma care. Objective: To develop a primary care dataset enabling extraction of information from GP practices in Wales and providing robust data from which results and lessons could be drawn to drive service development and improvement. Methods: A multidisciplinary group of experts, including general practitioners, primary care organisation representatives, and asthma patients, was formed and used as a source of governance and guidance. A review of asthma literature, guidance, and standards took place and was used to identify areas of asthma care which, if improved, would lead to better patient outcomes. Modified Delphi methodology was used to gain consensus from the expert group on which of the areas identified were to be prioritised, and an asthma patient and carer focus group was held to seek views and feedback on areas of asthma care that were important to them.
Areas of asthma care identified by both groups were mapped to asthma guidelines and standards to inform and develop primary and secondary care datasets covering both adult and paediatric care. Dataset development consisted of expert review and a targeted consultation process in order to seek broad stakeholder views and feedback. Results: The areas of asthma care identified as requiring prioritisation by the National Asthma Audit were: (i) Prescribing, (ii) Asthma diagnosis, (iii) Asthma reviews, (iv) Personalised Asthma Action Plans (PAAPs), and (v) Primary care follow-up after discharge from hospital. Methodologies and primary care queries were developed to cover each of the areas of poor and variable asthma care identified, and the queries were designed to extract information directly from patients' electronic records. Conclusion: This paper describes the methodological approach followed to develop primary care datasets for a National Asthma Audit. It sets out the principles behind the establishment of a National Asthma Audit programme in response to a national asthma mortality review and describes the development activities undertaken. Key process elements included: (i) mapping identified areas of poor and variable asthma care to national guidelines and standards, (ii) early engagement of experts, including clinicians and patients, in the process, and (iii) targeted consultation on the queries to provide further insight into measures that were collectable, reproducible, and relevant.

Keywords: asthma, primary care, general practice, dataset development

Procedia PDF Downloads 148
268 Biotite from Contact-Metamorphosed Rocks of the Dizi Series of the Greater Caucasus

Authors: Irakli Javakhishvili, Tamara Tsutsunava, Giorgi Beridze

Abstract:

The Caucasus is a component of the Mediterranean collision belt. The Dizi series is situated within the Greater Caucasian region of the Caucasus and crops out in the core of the Svaneti anticlinorium. The series was formed under continental-slope conditions on the southern passive margin of a small ocean basin. The Dizi series crops out over about 560 square km with a thickness of 2000-2200 m. The rocks are faunally dated from the Devonian to the Triassic inclusive. The series is composed of terrigenous phyllitic schists, sandstones, and quartzite aleurolites, with lenses and interlayers of marbleized limestones. During the early Cimmerian orogeny, they underwent regional metamorphism of the chlorite-sericite subfacies of the greenschist facies. Typical minerals of the metapelites are chlorite, sericite, augite, quartz, and tourmaline, while in the basic rocks actinolite, fibrolite, prehnite, calcite, and chlorite are developed. The Dizi series is intruded by polyphase intrusions of gabbros, diorites, quartz-diorites, syenite-diorites, syenites, and granitoids. K-Ar age dating (176-165 Ma) indicates that their formation corresponds to the Bathonian orogeny. The Dizi series is well studied geologically, but the very complicated processes of its regional and contact metamorphism are insufficiently investigated. The aim of the authors was a detailed study of the contact metamorphism of the series rocks. Investigations were accomplished by applying the following methodologies: identification of key sections, collection of material, microscopic study of samples, microprobe and structural analysis of minerals, and X-ray determination of elements. The contact-metamorphosed rocks of the Dizi series formed under the influence of the Bathonian magmatites on metapelites and carbonate-enriched rocks. They are represented by hornfelses, marbles, and skarns bearing quartz, biotite, sericite, graphite, andalusite, muscovite, plagioclase, corundum, cordierite, clinopyroxene, hornblende, cummingtonite, actinolite, and tremolite.
The contact metamorphism aureole reaches 350 meters. Biotite is developed only in the contact-metamorphosed rocks and is a rather informative index mineral. In metapelites, biotite is formed as a result of the reaction between phengite, chlorite, and leucoxene, while in basites it replaces actinolite or actinolite-hornblende. To study the compositional regularities of biotites, they were investigated from both metapelites and metabasites. Overall, biotite from the basites is characterized by an increased titanium content in contrast to biotite from the metapelites, while biotites from the metapelites are distinguished by an increased amount of aluminum. In biotites, increased amounts of titanium and aluminum are observed as they approach the contact, while their magnesia content decreases. Metapelite biotites are characterized by an increased amount of alumina in the aluminum octahedra, in contrast to biotites of the basites. In biotites of metapelites, the amount of tetrahedral aluminum is 28-34% and octahedral 15-26%, while in basites tetrahedral aluminum is 28-33% and octahedral 7-21%. As a result of the study of minerals, including biotite, from the contact-metamorphosed rocks of the Dizi series, three exocontact zones with corresponding mineral assemblages were identified. It was established that contact metamorphism in the aureole of the Dizi series intrusions took place at a significantly higher temperature and lower pressure than the regional metamorphism that preceded it.

Keywords: biotite, contact metamorphism, Dizi series, the Greater Caucasus

Procedia PDF Downloads 116
267 New Territories: Materiality and Craft from Natural Systems to Digital Experiments

Authors: Carla Aramouny

Abstract:

Digital fabrication, through advancements in software and machinery, is pushing practice today towards more complexity in design, allowing for unparalleled explorations. It gives designers the immediate capacity to turn their imagined objects into physical results. Yet at no time have questions of material knowledge been more relevant and crucial, as technological advancements approach a radical re-invention of the design process. As more and more designers look towards tactile crafts for material know-how, an interest in natural behaviors has also emerged, with attempts to embed intelligence from nature into designed objects. Concerned with enhancing their immediate environment, designers today are pushing the boundaries of design by bringing in natural systems, materiality, and advanced fabrication as essential processes to produce active designs. New Territories, a yearly architecture and design course on digital design and materiality, allows students to explore processes of digital fabrication in intersection with natural systems and hands-on experiments. This paper will highlight the importance of learning from nature and from physical materiality in a digital design process, and how the simultaneous move between the digital and physical realms has become an essential design method. It will detail the work done over the course of three years on themes of natural systems, crafts, concrete plasticity, and active composite materials. The aim throughout the course is to explore the design of products and active systems, be it modular facades, intelligent cladding, or adaptable seating, by embedding current digital technologies with an understanding of natural systems and a physical know-how of material behavior. From this aim, three main themes of inquiry have emerged through the varied explorations across the three years, each one approaching materiality and digital technologies through a different lens.
The first theme crosses the study of natural systems, as precedents for intelligent formal assemblies, with traditional craft methods. The students worked on designing performative facade systems, starting from the study of relevant natural systems and a specific craft, and then using parametric modeling to develop their modular facades. The second theme looks at the crossing of craft and digital technologies through form-finding techniques and elastic material properties, bringing flexible formwork into the digital fabrication process. Students explored concrete plasticity and behaviors with natural references as they worked on the design of an exterior seating installation using lightweight concrete composites and complex casting methods. The third theme brings bio-composite material properties together with additive fabrication and environmental concerns to create performative cladding systems. Students experimented with concrete composite materials, biomaterials, and clay 3D printing to produce different cladding and tiling prototypes that actively enhance their immediate environment. This paper will thus detail the work process followed by the students under these three themes of inquiry, describing their material experimentation, their digital and analog design methodologies, and their final results. It aims to shed light on the persisting importance of material knowledge as it intersects with advanced digital fabrication, and on the significance of learning from natural systems and biological properties to embed an active performance in today's design process.

Keywords: digital fabrication, design and craft, materiality, natural systems

Procedia PDF Downloads 108
266 Microbiological Analysis on Anatomical Specimens of Cats for Use in Veterinary Surgery

Authors: Raphael C. Zero, Marita V. Cardozo, Thiago A. S. S. Rocha, Mariana T. Kihara, Fernando A. Ávila, Fabrício S. Oliveira

Abstract:

There are several fixative and preservative solutions for use on cadavers, many of them using formaldehyde as the fixative or as the preservative of anatomical parts. In some countries, such as Brazil, this toxic agent has been increasingly restricted. The objective of this study was to microbiologically identify and quantify the key agents in tanks containing 96GL ethanol or sodium chloride solutions, used respectively as fixatives and preservatives of cat cadavers. Eight adult cat cadavers, three females and five males, with an average weight of 4.3 kg, were used. After injection via the external common carotid artery (120 ml/kg; 95% 96GL ethyl alcohol and 5% pure glycerin), the cadavers were fixed in a plastic tank with 96GL ethanol for 60 days. After fixing, they were stored in a 30% sodium chloride aqueous solution for 120 days in a similar tank. Samples were collected at the start of the experiment, before the animals were placed in the ethanol tanks, and monthly thereafter. The bacterial count was performed by the pour plate method in BHI (Brain Heart Infusion) agar, and the plates were incubated aerobically and anaerobically for 24 h at 37ºC. MacConkey agar, SPS (Sulfite Polymyxin Sulfadiazine) agar, and MYP Agar Base were used to isolate the microorganisms. There was no microbial growth in the samples prior to alcohol fixation. After 30 days of fixation in the alcohol solution, total aerobe and anaerobe counts (<1.0 x 10 CFU/ml) were found, and Pseudomonas sp., Staphylococcus sp., and Clostridium sp. were the identified agents. After 60 days in the alcohol fixation solution, total aerobes (<1.0 x 10 CFU/ml) and total anaerobes (<2.2 x 10 CFU/ml) were found, and the identified agents were the same. After 30 days of storage in the 30% sodium chloride aqueous solution, total aerobes (<5.2 x 10 CFU/ml) and total anaerobes (<3.7 x 10 CFU/ml) were found, and the agents identified were Staphylococcus sp., Clostridium sp., and fungi.
After 60 days of sodium chloride storage, total aerobes (<3.0 x 10 CFU/ml) and total anaerobes (<7.0 x 10 CFU/ml) were found, and the identified agents remained the same: Staphylococcus sp., Clostridium sp., and fungi. The microbiological count was low, and visual inspection did not reveal signs of contamination in the tanks. There was no strong odor or putrefaction, which proved the technique to be microbiologically effective in fixing and preserving the cat cadavers for the four-month period in which they were provided to undergraduate veterinary medicine students for surgery practice. All experimental procedures were approved by the Municipal Legal Department (protocol 02.2014.000027-1). The project was funded by FAPESP (protocol 2015-08259-9).

Keywords: anatomy, fixation, microbiology, small animal, surgery

Procedia PDF Downloads 261
265 Living by the Maramataka: Mahi Maramataka, Indigenous Environmental Knowledge Systems and Wellbeing

Authors: Ayla Hoeta

Abstract:

The focus of this research is mahi Maramataka, 'the practices of Maramataka', as a traditional and evolving knowledge system, and its connection to whaanau oranga (wellbeing) and healing. Centering kaupapa Maaori methods and knowledge, this research explores how Maramataka can be used as a tool for oranga and healing, enabling whaanau to engage with different environments in alignment with the Maramataka flow and the optimal times indicated by the environment. Maramataka is an ancestral lunar environmental knowledge system rooted within korero tuku iho, Maaori creation stories, dating back to the beginning of time. The significance of Maramataka lies in this ancient environmental knowledge and the connecting energy flow of mauri (life force) between whenua (land), moana (ocean), and rangi (sky). The lunar component of the Maramataka is widely understood and highlights the different phases of the moon. Each moon phase is named with references to puurakau (stories) and to environmental and ecological information. Marama, meaning moon, and taka, meaning cycle, together form a lunar and environmental calendar. There are lunar phases that are optimal for specific activities, such as the Tangaroa phase, a time of abundance and productivity suited to ocean-based activities like fishing. Other periods in the Maramataka, such as Rakaunui (full moon), connect the highest tides and the highest energy of the lunar cycle, ideal for social and physical activity and particularly for planting. Other phases, like Tamatea, are unpredictable, whereas Whiro (new moon) is reflective, deep, and cautious during the darkest nights. Whaanau, particularly in urban settings, have become increasingly disconnected from the natural environment; the Maramataka has become a tool they can connect with, offering an alternative to dominant perspectives of health and an approach that is uniquely Maaori.
In doing so, this research raises awareness of oranga, or the lack of it, and of the lived experience of whaanau in Tamaki Makaurau, Aotearoa, on a journey towards the revival of Maramataka and healing. The research engages Hautu Waka as a methodology, using methods from ancient kaupapa Maaori practices based on wayfinding and attunement with the natural environment. Drawing on ancient ways of being, knowing, seeing, and doing, the Hautu Waka centres kaupapa Maaori perspectives in process design, reflection, and evaluation. The Hautu Waka consists of five interweaving phases: 1) Te Rapunga (the search) in infinite potential; 2) Te Kitenga (the seeing), observation of and attunement to tohu; 3) Te Whainga (the pursuit), deeply exploring key tohu; 4) Te Whiwhinga (the acquiring) of knowledge and clearer ideas; and 5) Te Rawenga (the celebration), reflection on and acknowledgement of the journey and achievements. This research is an expansion of my creative practice into whaanau-centred inquiry, seeking to understand the benefits of Maramataka and how it can be embodied and practised in a modern-day context to support oranga and healing. The goal is thus to work with kaupapa Maaori methodologies, to be authentic as a Maaori practitioner and researcher, and to allow an authentic indigenous approach to the exploration of Maramataka through a kaupapa Maaori lens.

Keywords: maramataka (Maaori calendar), tangata (people), taiao (environment), whenua (land), whaanau (family), hautu waka (navigation framework)

Procedia PDF Downloads 50
264 The Significance of Urban Space in Death Trilogy of Alejandro González Iñárritu

Authors: Marta Kaprzyk

Abstract:

The cinema of Alejandro González Iñárritu has not yet been subjected to much detailed analysis, which makes it exceptionally interesting research material. The purpose of this presentation is to discuss the significance of urban space in the three films of this Mexican director that form the Death Trilogy: 'Amores Perros' (2000), '21 Grams' (2003) and 'Babel' (2006). The fact that in the aforementioned movies the urban space itself becomes an additional protagonist, with its own identity, psychology, and the ability to transform and affect other characters, in itself warrants independent research and analysis. Independently, such a mode of presenting urban space has another function: it enables the director to complement the rest of the characters. The basis for the methodology of this description of cinematographic space is to treat its visual layer as a point of departure for detailed analysis. At the same time, the analysis itself is supported by recognised academic theories concerning spatial issues, which are transformed here into essential tools for describing the world (mise-en-scène) created by González Iñárritu. In 'Amores perros', Mexico City serves as the scenery, a place full of contradictions, depicted as a modern conglomerate and an urban jungle as well as a labyrinth of poverty and violence. In this work, stylistic tropes can be found in an intertextual dialogue of the director with the photographs of Nan Goldin and Mary Ellen Mark. The story recounted in '21 Grams', the most tragic piece in the trilogy, is characterised by almost hyperrealistic sadism. It takes place in Memphis, which on the screen turns into an impersonal formation full of the heterotopias described by Michel Foucault and the non-places defined by Marc Augé in his essay.
By contrast, the main urban space in 'Babel' is Tokyo, which seems to correspond perfectly with the image of places discussed by Juhani Pallasmaa in his works on the reception of architecture through 'pathological senses' in the modern (or, more adequately, postmodern) world. It is portrayed as a city full of buildings that look so surreal that they seem completely unsuitable for humans to move between. Ultimately, the aim of this paper is to demonstrate the coherence of the manner in which González Iñárritu designs urban spaces in his Death Trilogy. In particular, the author attempts to examine the imperative role of the cities that form the three specific microcosms in which the protagonists of the Mexican director live out their overwhelming tragedies.

Keywords: cinematographic space, Death Trilogy, film studies, González Iñárritu Alejandro, urban space

Procedia PDF Downloads 307
263 Call-Back Laterality and Bilaterality: Possible Screening Mammography Quality Metrics

Authors: Samson Munn, Virginia H. Kim, Huija Chen, Sean Maldonado, Michelle Kim, Paul Koscheski, Babak N. Kalantari, Gregory Eckel, Albert Lee

Abstract:

In terms of screening mammography quality, neither the portion of reports that advise call-back imaging that should be bilateral versus unilateral nor how much the unilateral call-backs may appropriately diverge from 50–50 (left versus right) is known. Many factors may affect detection laterality: display arrangement, reflections preferentially striking one display location, hanging protocols, seating positions with respect to others and displays, visual field cuts, health, etc. The call-back bilateral fraction may reflect radiologist experience (not in our data) or confidence level. Thus, laterality and bilaterality of call-backs advised in screening mammography reports could be worthy quality metrics. Here, laterality data did not reveal a concern until drilling down to individuals. Bilateral screening mammogram report recommendations by five breast imaging, attending radiologists at Harbor-UCLA Medical Center (Torrance, California) 9/1/15--8/31/16 and 9/1/16--8/31/17 were retrospectively reviewed. Recommended call-backs for bilateral versus unilateral, and for left versus right, findings were counted. Chi-square (χ²) statistic was applied. Year 1: of 2,665 bilateral screening mammograms, reports of 556 (20.9%) recommended call-back, of which 99 (17.8% of the 556) were for bilateral findings. Of the 457 unilateral recommendations, 222 (48.6%) regarded the left breast. Year 2: of 2,106 bilateral screening mammograms, reports of 439 (20.8%) recommended call-back, of which 65 (14.8% of the 439) were for bilateral findings. Of the 374 unilateral recommendations, 182 (48.7%) regarded the left breast. Individual ranges of call-backs that were bilateral were 13.2–23.3%, 10.2–22.5%, and 13.6–17.9%, by year(s) 1, 2, and 1+2, respectively; these ranges were unrelated to experience level; the two-year mean was 15.8% (SD=1.9%). The lowest χ² p value of the group's sidedness disparities years 1, 2, and 1+2 was > 0.4. 
Regarding four of the individual radiologists, the lowest p value was 0.42. However, the fifth radiologist disfavored the left, with p values of 0.21, 0.19, and 0.07, respectively; that radiologist had the greatest number of years of experience. There was a concerning 93% likelihood that the bias against left-breast findings evidenced by one of our radiologists was not random. Notably, very soon after the period under review, he retired, presented with leukemia, and died. We call for research to be done, particularly by large departments with many radiologists, into two possible new quality metrics in screening mammography: laterality and bilaterality. (Images, patient outcomes, report validity, and radiologist psychological confidence levels were not assessed. No intervention or subsequent data collection was conducted. This uncomplicated collection of data and simple appraisal were not designed, nor had there been any intention, to develop or contribute to generalizable knowledge (per U.S. DHHS 45 CFR, part 46).)
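The sidedness comparison described in the abstract amounts to a one-degree-of-freedom chi-square goodness-of-fit test against a 50-50 left/right null. A minimal sketch is below; the year-1 counts (222 left of 457 unilateral call-backs) come from the abstract, while the function name and the erfc shortcut for the 1-df survival function are our own illustrative choices, not the authors' code.

```python
import math

def chi2_p_1df(left: int, right: int) -> float:
    """p-value for a chi-square goodness-of-fit test of a 50-50
    laterality split. For 1 degree of freedom the chi-square survival
    function reduces to erfc(sqrt(x/2)), so no stats library is needed."""
    expected = (left + right) / 2.0
    chi2 = (left - expected) ** 2 / expected + (right - expected) ** 2 / expected
    return math.erfc(math.sqrt(chi2 / 2.0))

# Year-1 unilateral call-backs from the abstract: 222 left out of 457.
p = chi2_p_1df(222, 457 - 222)
```

For these counts the test yields p well above 0.4, consistent with the group-level result reported above; only per-radiologist counts (not given here) could reproduce the individual p values.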

Keywords: mammography, screening mammography, quality, quality metrics, laterality

Procedia PDF Downloads 140
262 Complex Decision Rules in Quality Assurance Processes for Quick Service Restaurant Industry: Human Factors Determining Acceptability

Authors: Brandon Takahashi, Marielle Hanley, Gerry Hanley

Abstract:

The large-scale quick-service restaurant industry is a complex business to manage optimally. With over 40 suppliers providing different ingredients for food preparation and thousands of restaurants serving over 50 unique food offerings across a wide range of regions, the company must implement a quality assurance process. Businesses want to deliver quality food efficiently, reliably, and successfully at a low cost that the public wants to buy. They also want to make sure that their food offerings are never unsafe to eat or of poor quality. A good reputation (and profitable business) developed over the years can be gone in an instant if customers fall ill eating your food. Poor quality also results in food waste, and the cost of corrective actions is compounded by the reduction in revenue. Product compliance evaluation assesses if the supplier’s ingredients are within compliance with the specifications of several attributes (physical, chemical, organoleptic) that a company will test to ensure that a quality, safe to eat food is given to the consumer and will deliver the same eating experience in all parts of the country. The technical component of the evaluation includes the chemical and physical tests that produce numerical results that relate to shelf-life, food safety, and organoleptic qualities. The psychological component of the evaluation includes organoleptic, which is acting on or involving the use of the sense organs. The rubric for product compliance evaluation has four levels: (1) Ideal: Meeting or exceeding all technical (physical and chemical), organoleptic, & psychological specifications. 
(2) Deviation from ideal but no impact on quality: not meeting or exceeding some technical and organoleptic/psychological specifications, without impact on consumer quality, and meeting all food safety requirements. (3) Acceptable: not meeting or exceeding some technical and organoleptic/psychological specifications, resulting in a reduction of consumer quality but not enough to lessen demand, and meeting all food safety requirements. (4) Unacceptable: not meeting food safety requirements, independent of meeting technical and organoleptic specifications, or meeting all food safety requirements but with product quality resulting in consumer rejection of the food offering. Sampling of products and consumer tastings within the distribution network are a second critical element of the quality assurance process and are the data sources for the statistical analyses. Each finding is not independently assessed with the rubric. For example, the chemical data will be used to back up and support any inferences about the sensory profiles of the ingredients. Certain flavor profiles may not be as apparent when mixed with other ingredients, which leads to weighing specifications differentially in the acceptability decision. Quality assurance processes are essential to achieving the balance of quality and profitability by making sure the food is safe and tastes good while identifying and remediating product quality issues before they hit the stores. Comprehensive quality assurance procedures implement human factors methodologies, and this report provides recommendations for the systemic application of quality assurance processes for quick-service restaurant services. This case study reviews the complex decision rubric and evaluates processes to ensure that the right balance of cost, quality, and safety is achieved.

Keywords: decision making, food safety, organoleptics, product compliance, quality assurance

Procedia PDF Downloads 171
261 The Influence of Hydrolyzed Cartilage Collagen on General Mobility and Wellbeing of an Active Population

Authors: Sara De Pelsmaeker, Catarina Ferreira da Silva, Janne Prawit

Abstract:

Recent studies show that enzymatically hydrolysed collagen is absorbed and distributed to joint tissues, where it has analgesic and active anti-inflammatory properties. Reviews of the associated literature also support this theory. However, these studies all use hydrolysed collagen from animal hide or skin. This study looks into the effect of daily supplementation with hydrolysed cartilage collagen (HCC), which has a different composition. A consumer study was set up using a double-blind, placebo-controlled design, with a control group taking 0.5 g of maltodextrin twice a day and an experimental group taking 0.5 g of HCC twice a day, over a trial period of 12 weeks. A follow-up phase of 4 weeks without supplementation was included in the experiment to investigate the 'wash-out' phase. As this consumer study was conducted during the lockdown periods, a dedicated app was designed to follow up with the participants. The app had the advantage that the motivation of the participants was enhanced and the drop-out rate was lower than normally seen in consumer studies. Participants were recruited via various sports and health clubs across the UK, targeting a general population of people who considered themselves in good health. Exclusion criteria were experiencing any medical conditions and taking any prescribed medication. A minimum requirement was that participants regularly engaged in some level of physical activity, and they had to log the type and duration of the activity they conducted. Weekly, participants provided feedback on their joint health and subjective pain using the validated pain-measuring instrument, the Visual Analogue Scale (VAS). The weekly reporting section in the app was designed for simplicity and based on the accuracy demonstrated in previous similar studies in tracking participants' subjective pain measures.
At the beginning of the trial, each participant indicated their baseline level of joint pain. The results of this consumer study indicated that HCC significantly improved joint health and subjective pain scores compared to the placebo group. No significant differences were found between demographic groups (age or gender). The level of activity, ranging from high-intensity training to regular walking, did not significantly influence the effect of the HCC. The results of the wash-out phase indicated that when the participants stopped the HCC supplementation, their subjective pain scores increased again to the baseline. In conclusion, the results gave a positive indication that daily supplementation with HCC can contribute to the overall mobility and wellbeing of a generally active population.

Keywords: VAS-score, food supplement, mobility, joint health

Procedia PDF Downloads 144
260 Moderating and Mediating Effects of Business Model Innovation Barriers during Crises: A Structural Equation Model Tested on German Chemical Start-Ups

Authors: Sarah Mueller-Saegebrecht, André Brendler

Abstract:

Business model innovation (BMI), as the intentional change of an existing business model (BM) or the design of a new BM, is essential to a firm's development in dynamic markets. The relevance of BMI is also evident in the ongoing COVID-19 pandemic, in which start-ups, in particular, are affected by limited access to resources. However, first studies also show that they react faster to the pandemic than established firms. BMI represents a strategy for successfully handling such threatening dynamic changes. The entrepreneurship literature shows how and when firms should utilize BMI in times of crisis and which barriers can be expected during the BMI process. Nevertheless, research merging BMI barriers and crises is still underexplored. Specifically, further knowledge about antecedents and the effect of moderators on the BMI process is necessary for advancing BMI research. The research gap addressed by this study is twofold. First, foundations exist on how different crises impact BM change intention, yet their analysis lacks the inclusion of barriers. In particular, the entrepreneurship literature lacks knowledge about the individual perception of BMI barriers, which is essential to predict managerial reactions. Moreover, internal BMI barriers have been the focal point of current research, while external BMI barriers remain virtually understudied. Second, to date, BMI research is based on qualitative methodologies; quantitative work is thus lacking to specify and confirm these qualitative findings. By focusing on the crisis context, this study contributes to the BMI literature by offering a first quantitative attempt to embed BMI barriers into a structural equation model. It measures managers' perception of BMI development and implementation barriers in the BMI process, asking the following research question: How does a manager's perception of BMI barriers influence BMI development and implementation in times of crisis?
Two distinct research streams in economic literature explain how individuals react when perceiving a threat. "Prospect Theory" claims that managers demonstrate risk-seeking tendencies when facing a potential loss, while the opposing "Threat-Rigidity Theory" suggests that managers demonstrate risk-averse behavior in the same situation. This study quantitatively tests which theory best predicts managers' BM reaction to a perceived crisis. From three in-depth interviews in the German chemical industry, 60 past BMIs were identified. The participating start-up managers gave insights into their start-up's strategic and operational functioning. Afterward, each interviewee described crises that had already affected their BM. The participants explained how they conducted BMI to overcome these crises, which development and implementation barriers they faced, and how severe they perceived them, assessed on a 5-point Likert scale. In contrast to current research, the results reveal that a higher perceived threat level of a crisis harms BM experimentation. Managers seem to conduct less BMI in times of crisis, whereby BMI development barriers dampen this relation. The structural equation model unveils a mediating role of BMI implementation barriers on the link between the intention to change a BM and the concrete BMI implementation. In conclusion, this study confirms the threat-rigidity theory.
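The mediating relation described above can be illustrated with a minimal regression-based mediation sketch. This is not the authors' model: the data are synthetic, and all effect sizes and the sample size are assumptions chosen only to mirror the reported pattern (barriers dampening the intention-implementation link).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60  # mirrors the 60 BMIs in the study; the data here are synthetic

intention = rng.normal(size=n)                                    # intention to change the BM
barriers = 0.6 * intention + rng.normal(scale=0.5, size=n)        # perceived implementation barriers (mediator)
implementation = 0.3 * intention - 0.5 * barriers + rng.normal(scale=0.5, size=n)

def ols(columns, y):
    """Ordinary least squares with an intercept; returns the coefficient vector."""
    X = np.column_stack([np.ones(len(y))] + list(columns))
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols([intention], barriers)[1]                  # path a: intention -> mediator
bc = ols([intention, barriers], implementation)    # bc[1] = direct effect c', bc[2] = path b
indirect = a * bc[2]                               # indirect (mediated) effect a*b
print(f"direct effect c' = {bc[1]:.2f}, indirect effect a*b = {indirect:.2f}")
```

With these assumed effect sizes, the indirect effect is negative, reproducing the dampening role of implementation barriers.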

Keywords: barrier perception, business model innovation, business model innovation barriers, crises, prospect theory, start-ups, structural equation model, threat-rigidity theory

Procedia PDF Downloads 77
259 Application of Multilinear Regression Analysis for Prediction of Synthetic Shear Wave Velocity Logs in Upper Assam Basin

Authors: Triveni Gogoi, Rima Chatterjee

Abstract:

Shear wave velocity (Vs) estimation is an important approach in the seismic exploration and characterization of a hydrocarbon reservoir. Various methods exist for predicting S-wave velocity when a recorded S-wave log is not available, but all of them are empirical mathematical models. The most common approach estimates shear wave velocity from P-wave velocity by applying Castagna’s equation, whose constants vary for different lithologies and geological set-ups. In this study, multiple regression analysis has been used for the estimation of S-wave velocity. The EMERGE module of the Hampson-Russell software has been used for the generation of the S-wave log. Both single-attribute and multi-attribute analyses have been carried out to generate synthetic S-wave logs in the Upper Assam basin. The Upper Assam basin, situated in North Eastern India, is one of the most important petroleum provinces of India. The present study was carried out using four wells of the study area, of which three had recorded S-wave velocity. The main objective of the present study is the prediction of shear wave velocities for wells where S-wave velocity information is not available. The three wells with S-wave velocity were first used to test the reliability of the method, and the generated S-wave log was compared with the actual S-wave log. Single-attribute analysis was carried out for these three wells within the depth range 1700-2100 m, which corresponds to the Barail Group of Oligocene age. The Barail Group, the primary producing reservoir of the basin, is the main target zone in this study. A system-generated list of attributes with varying degrees of correlation was produced, and the attribute with the highest correlation was selected for the single-attribute analysis. Crossplots between the attributes show the deviation of points from the line of best fit. 
The final result of the analysis was compared with the available S-wave log, showing a good visual fit with a correlation of 72%. Next, multi-attribute analysis was carried out for the same data using all the wells within the same analysis window. A high correlation of 85% was observed between the output log of the analysis and the recorded S-wave log. The near-perfect fit between the synthetic and recorded S-wave logs validates the reliability of the method. For further authentication, the generated S-wave data from the wells were tied to the seismic and correlated. A synthetic shear wave log was generated for well M2, where no S-wave log is available, and it shows a good correlation with the seismic. Neutron porosity, density, acoustic impedance (AI), and P-wave velocity proved to be the most significant variables in this statistical method for S-wave generation. The multilinear regression method can thus be considered a reliable technique for the generation of shear wave velocity logs in this study area.
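The multilinear workflow above can be sketched in a few lines: fit Vs as a linear function of the significant attributes (neutron porosity, density, AI, and Vp) and compare against Castagna's relation. The logs below are synthetic and the generating coefficients are assumptions, not the study's data; the Castagna constants quoted are the widely cited mudrock-line values, not necessarily those calibrated for the Upper Assam basin.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200  # synthetic samples standing in for log readings

vp = rng.uniform(2.5, 4.5, n)      # P-wave velocity, km/s
phi = rng.uniform(0.05, 0.35, n)   # neutron porosity, fraction
rho = rng.uniform(2.2, 2.65, n)    # bulk density, g/cc
ai = vp * rho                      # acoustic impedance
# assumed "true" relation, used only to generate a synthetic Vs log
vs = 0.86 * vp - 1.17 - 0.8 * phi + 0.05 * rho + rng.normal(scale=0.05, size=n)

# multilinear regression: Vs ~ intercept + Vp + porosity + density + AI
X = np.column_stack([np.ones(n), vp, phi, rho, ai])
coef, *_ = np.linalg.lstsq(X, vs, rcond=None)
vs_pred = X @ coef

corr = np.corrcoef(vs, vs_pred)[0, 1]
vs_castagna = 0.8621 * vp - 1.1724  # Castagna mudrock line, km/s (generic constants)
print(f"multilinear fit correlation: {corr:.2f}")
```

On real logs, the correlation between predicted and recorded Vs plays the role of the 72%/85% figures reported above.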

Keywords: Castagna's equation, multilinear regression, multi-attribute analysis, shear wave logs

Procedia PDF Downloads 202
258 The Development of an Anaesthetic Crisis Manual for Acute Critical Events: A Pilot Study

Authors: Jacklyn Yek, Clara Tong, Shin Yuet Chong, Yee Yian Ong

Abstract:

Background: While emergency manuals and cognitive aids (CA) have been used in high-hazard industries for decades, this has been a nascent field in healthcare. CAs can potentially offset the large cognitive load involved in crisis resource management and facilitate the efficient performance of key steps in treatment. A crisis manual was developed based on local guidelines and the latest evidence-based information and introduced to a tertiary hospital setting in Singapore. The objective of this study is to evaluate the effectiveness of the crisis manual in guiding the response to and management of critical events. Methods: 7 surgical teams were recruited to participate in a series of simulated emergencies in a high-fidelity operating room simulator over the period of April to June 2018. Each team consisted of a surgical consultant and medical officer/registrar, an anesthesia consultant and medical officer/registrar, as well as a circulating, scrub, and anesthetic nurse. Each team performed a simulated operation in which one or more crisis events occurred. The teams were randomly assigned to one of the crisis manual scenarios, and all teams were deemed to be equal in experience and knowledge. Before the simulation, teams were instructed on proper checklist use, but the use of the checklist was optional. Results: 7 simulation sessions were performed, consisting of the following scenarios: Airway Fire, Massive Transfusion Protocol, Malignant Hyperthermia, Eclampsia, and Difficult Airway. Of the 7 surgical teams, 2 made use of the crisis manual, both of which had encountered a ‘Malignant Hyperthermia’ scenario. These team members reflected that the crisis manual allowed them to work as a team, and in particular to involve the surgical doctors, who were unfamiliar with the condition and its management. 
A run chart showed a possible upward trend, suggesting that with increasing awareness and training, staff would become more likely to initiate the use of the crisis manual. Conclusion: Despite the high volume load in this tertiary hospital, certain crises remain rare and clinicians are often caught unprepared. A crisis manual is an effective, easy-to-use repository that can improve patient outcomes and encourage teamwork. With training, familiarity would allow clinicians to become increasingly comfortable with reaching for the crisis manual. More simulation training needs to be conducted to determine its effectiveness.

Keywords: crisis resource management, high fidelity simulation training, medical errors, visual aids

Procedia PDF Downloads 104
257 The Effect of Chloride Dioxide and High Concentration of CO2 Gas Injection on the Quality and Shelf-Life for Exporting Strawberry 'Maehyang' in Modified Atmosphere Condition

Authors: Hyuk Sung Yoon, In-Lee Choi, Mohammad Zahirul Islam, Jun Pill Baek, Ho-Min Kang

Abstract:

Exports of the strawberry ‘Maehyang’, cultivated in South Korea, to Southeast Asia have been increasing. Quality degradation often occurs in strawberries even during the short export period; Botrytis cinerea, in particular, is known to cause major damage to export strawberries, with the disease developing during shipping and distribution. This study was conducted to determine the sterilizing effect of chlorine dioxide (ClO2) gas and high-concentration CO2 gas injection on ‘Maehyang’ strawberries packaged in oxygen transmission rate (OTR) films. The strawberries were harvested at the 80% color-change stage and packaged with OTR film or perforated film (control). The treatments consisted of MAP with a 20,000 cc·m⁻²·day⁻¹·atm⁻¹ OTR film and gas injection into the packages: 6 mg/L ClO2, 15% CO2, and their combination. The treated strawberries were stored at 3℃ for 30 days. The fresh weight loss rate was less than 1% in all OTR film packages but more than 15% in the perforated film treatment, which showed severe deterioration of visual quality during storage. The carbon dioxide concentration within the packages reached a maximum of approximately 15% in all treatments except the control until the 21st day, within the tolerated range of maximum CO2 concentration for strawberry under recommended CA or MA conditions; however, it increased to almost 50% by the 30th day. The oxygen concentration decreased to approximately 0% in all treatments except the control over 25 days. The ethylene concentration remained steady until the 17th day, then quickly increased before dropping on the final storage day (day 30); no significant differences were observed among the gas treatments. Firmness increased in the CO2 (15%) and ClO2 (6 mg/L) + CO2 (15%) treatments during storage, possibly an effect of high CO2 concentration, which is known to reduce decay and cell wall degradation. 
Soluble solids content decreased in all treatments during storage, likely because sugars were consumed by increased respiration. Titratable acidity was similar across all treatments. The incidence of fungi was 0% in the CO2 (15%) and ClO2 (6 mg/L) + CO2 (15%) treatments, but more than 20% in the perforated film treatment. Consequently, the results indicate that chlorine dioxide (ClO2) and high-concentration CO2 inhibited fungal growth. Because the fresh weight loss rate and the incidence of fungi were lowest, the ClO2 (6 mg/L) + CO2 (15%) treatment proved to be the most efficient for sterilization. These results suggest that chlorine dioxide (ClO2) and high-concentration CO2 gas injection treatments are an effective decontamination technique for improving the safety of strawberries.

Keywords: chlorine dioxide, high concentration of CO2, modified atmosphere condition, oxygen transmission rate films

Procedia PDF Downloads 325
256 Media Impression and Its Impact on Foreign Policy Making: A Study of India-China Relations

Authors: Rosni Lakandri

Abstract:

With the development of science and technology, there has been a complete transformation in the domain of information technology. Particularly after the Second World War and the Cold War period, the role of media and communication technology in shaping political, economic, and socio-cultural proceedings across the world has been tremendous. The media performs as a channel between the governing bodies of the state and the general masses. As the international community constantly talks about the onset of the Asian Century, India and China happen to be the major players in it. Both have long civilizational histories, both are neighboring countries, both are witnessing huge economic growth, and, most important of all, both are considered the rising powers of Asia. This is not to negate the fact that the two countries went to war with each other in 1962, and that common people, and even policy makers, on both sides still view each other through this prism. A large contribution to this perception goes to the media coverage on both sides: even where there are spaces of cooperation, the negative impact of the media has tended to influence people’s opinion and each government’s perception of the other. Therefore, analysis of the media’s impression in both countries becomes important in order to know its effect on the larger implications of foreign policy toward each other. It is usually said that the media not only acts as an information provider but also as an ombudsman to the government, providing a check and balance on governments in taking proper decisions for the people of the country. In attempting to test this hypothesis, we have to ask whether the media really helps in shaping the political landscape of a country. This study therefore rests on the following questions: 1. How do China and India depict each other through their respective news media? 
2. How much, and in what ways, do they influence the policy-making process of each country? How do they shape public opinion in both countries? To address these enquiries, the study employs both primary and secondary sources. In generating data and other statistical information, primary sources such as reports, government documents, cartography, and agreements between the governments have been used. Secondary sources such as books, articles, and other writings collected from various sources, as well as opinion from visual media sources such as news clippings and videos on this topic, are also included as sources of on-the-ground information, as this study is not based on fieldwork. The findings suggest that, in the case of China and India, the media has certainly affected people’s knowledge of political and diplomatic issues and, at the same time, has affected the foreign policy making of both countries. The media have a considerable impact on foreign policy formulation, and we can say that some mediatization of foreign policy issues is happening in both countries.

Keywords: China, foreign policy, India, media, public opinion

Procedia PDF Downloads 138
255 Blended Learning Instructional Approach to Teach Pharmaceutical Calculations

Authors: Sini George

Abstract:

Active learning pedagogies are valued for their success in increasing 21st-century learners’ engagement, developing transferable skills like critical thinking and quantitative reasoning, and creating deeper and more lasting educational gains. 'Blended learning' is an active learning pedagogical approach in which direct instruction moves from the group learning space to the individual learning space, and the resulting group space is transformed into a dynamic, interactive learning environment where the educator guides students as they apply concepts and engage creatively in the subject matter. This project aimed to develop a blended learning instructional approach to teaching concepts around pharmaceutical calculations to year 1 pharmacy students. The wrong dose, strength, or frequency of a medication accounts for almost a third of medication errors in the NHS; therefore, progression to year 2 requires a 70% pass in this calculation test, in addition to the standard progression requirements. Many students struggled to achieve this requirement in the past. It was also challenging to teach these concepts to a large class (> 130 students) with mixed mathematical abilities, especially within a traditional didactic lecture format. Therefore, short screencasts with a voice-over by the lecturer were provided in advance of a total of four teaching sessions (two hours per session), incorporating the core content of each session and talking through how the lecturer approached the calculations, to model metacognition. Links to the screencasts were posted on the learning management system. Viewership counts were used to confirm that the students were indeed accessing and watching the screencasts on schedule. In the classroom, students had to apply the knowledge learned beforehand to a series of increasingly difficult questions. 
Students were then asked to create a question in group settings (two students per group) and to discuss the questions created by their peers in their groups, to promote deep conceptual learning. Students were also given a question-and-answer period to seek clarification on the concepts covered. Student responses to this instructional approach and their test grades were collected. After collecting and organizing the data, statistical analysis was carried out to calculate binomial statistics for the two data sets, the test grades of students who received blended learning instruction and those of students who received instruction in a standard lecture format, to compare the effectiveness of each type of instruction. Student responses and performance data on the assessment indicate that learning the content through the blended learning instructional approach led to higher levels of student engagement and satisfaction and more substantial learning gains. The blended learning approach enabled each student to learn how to do the calculations at their own pace, freeing class time for interactive application of this knowledge. Although time-consuming for an instructor to implement, the findings of this research demonstrate that the blended learning instructional approach improves student academic outcomes and represents a valuable method for incorporating active learning methodologies while still maintaining broad content coverage. Satisfaction with this approach was high, and more pharmacy content is currently being developed for delivery in this format.
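One way to compare binomial pass rates between the two cohorts, as described above, is a two-proportion z-test. The sketch below uses only the Python standard library; the pass counts are hypothetical, not the study's data.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(pass_a, n_a, pass_b, n_b):
    """Two-sided two-proportion z-test for a difference in pass rates."""
    p1, p2 = pass_a / n_a, pass_b / n_b
    p = (pass_a + pass_b) / (n_a + n_b)            # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))   # pooled standard error
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# hypothetical cohorts: 110/130 pass with blended learning vs. 88/130 with lectures
z, p = two_proportion_z(110, 130, 88, 130)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value here would support the claim that the blended-learning cohort's pass rate differs from the lecture cohort's.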

Keywords: active learning, blended learning, deep conceptual learning, instructional approach, metacognition, pharmaceutical calculations

Procedia PDF Downloads 153
254 Advantages of Matrix Solid Phase Dispersive (MSPD) Extraction Associated to MIPS versus MAE Liquid Extraction for the Simultaneous Analysis of PAHs, PCBs and Some Hydroxylated PAHs in Sediments

Authors: F. Portet-Koltalo, Y. Tian, I. Berger, C. Boulanger-Lecomte, A. Benamar, N. Machour

Abstract:

Sediments are complex environments which can accumulate a great variety of persistent toxic contaminants such as polychlorinated biphenyls (PCBs), polycyclic aromatic hydrocarbons (PAHs), and some of their more toxic degradation metabolites such as hydroxylated PAHs (OH-PAHs). Owing to their composition, fine clayey sediments can be more difficult to extract than soils using conventional solvent extraction processes. This study therefore aimed to compare the potential of matrix solid phase dispersive (MSPD) extraction for PCBs, PAHs, and OH-PAHs with that of microwave-assisted extraction (MAE). Methodologies: MAE with various solvent mixtures was used to extract PCBs, PAHs, and OH-PAHs from sediments in two runs, followed by two GC-MS analyses. MSPD consisted of crushing the dried sediment with dispersive agents, introducing the mixture into cartridges, and eluting the target compounds with an appropriate volume of selected solvents. MSPD combined with cartridges containing molecularly imprinted polymers (MIPs) designed for OH-PAHs was thus used to extract the three families of target compounds in only one run, followed by parallel analyses in GC-MS for PAHs/PCBs and HPLC-FLD for OH-PAHs. Results: MAE was optimized to extract PAHs/PCBs on one hand and OH-PAHs on the other from clayey sediments in two runs. Indeed, the best extraction conditions (mixtures of extracting solvents, temperature) differed according to the polarity and thermodegradability of the different families of target contaminants: PAHs/PCBs were better extracted using an acetone/toluene 50/50 mixture at 130°C, whereas OH-PAHs were better extracted using an acetonitrile/toluene 90/10 mixture at 100°C. Moreover, the two consecutive GC-MS analyses doubled the total analysis time. 
A matrix solid phase dispersive (MSPD) extraction procedure was also optimized, with the first objective of increasing the extraction recovery yields of PAHs and PCBs from fine-grained sediment. The crushing time (2-10 min), the nature of the dispersing agents added to purify and increase the extraction yields (Florisil, octadecylsilane, 3-chloropropyl, 4-benzylchloride), and the nature and volume of the eluting solvents (methylene chloride, hexane, hexane/acetone…) were studied. It appeared that under the best conditions, MSPD was a better extraction method than MAE for PAHs and PCBs, with mean increases of 8.2% and 71%, respectively. This method was also faster, easier, and less expensive. The other advantage of MSPD was that it allowed the easy introduction, just after the first elution of PAHs/PCBs, of a step permitting the selective recovery of OH-PAHs. A cartridge containing MIPs designed for phenols was coupled to the cartridge containing the dispersed sediment, and various eluting solvents, different from those used for PAHs and PCBs, were tested to selectively concentrate and extract the OH-PAHs. Thereafter, the OH-PAHs could be analyzed at the same time as the PAHs and PCBs: the OH-PAH extract was analyzed with HPLC-FLD, whereas the PAHs/PCBs extract was analyzed with GC-MS, adding only a few minutes to the total duration of the analytical process. Conclusion: MSPD associated with MIPs appeared to be an easy, fast, and inexpensive method, able to extract in one run a complex mixture of apolar and more polar toxic contaminants present in clayey fine-grained sediments, an environmental matrix which is generally difficult to analyze.

Keywords: contaminated fine-grained sediments, matrix solid phase dispersive extraction, microwave assisted extraction, molecularly imprinted polymers, multi-pollutant analysis

Procedia PDF Downloads 327
253 A Feature Clustering-Based Sequential Selection Approach for Color Texture Classification

Authors: Mohamed Alimoussa, Alice Porebski, Nicolas Vandenbroucke, Rachid Oulad Haj Thami, Sana El Fkihi

Abstract:

Color and texture are highly discriminant visual cues that provide essential information in many types of images. Color texture representation and classification is therefore one of the most challenging problems in computer vision and image processing applications. Color textures can be represented in different color spaces by using multiple image descriptors, which generate a high-dimensional set of texture features. In order to reduce the dimensionality of the feature set, feature selection techniques can be used. The goal of feature selection is to find a relevant subset of an original feature space that can improve the accuracy and efficiency of a classification algorithm. Traditionally, feature selection has focused on removing irrelevant features, neglecting the possible redundancy between relevant ones. This is why some feature selection approaches prefer to use feature clustering analysis to aid and guide the search. These techniques can be divided into two categories. i) Feature clustering-based ranking algorithms use feature clustering as an analysis step that comes before feature ranking: after dividing the feature set into groups, these approaches perform a feature ranking in order to select the most discriminant feature of each group. ii) Feature clustering-based subset search algorithms can use feature clustering following one of three strategies: as an initial step that comes before the search, coupled and combined with the search, or as an alternative to and replacement for the search. In this paper, we propose a new feature clustering-based sequential selection approach for the purpose of color texture representation and classification. Our approach is a three-step algorithm. First, irrelevant features are removed from the feature set thanks to a class-correlation measure. Then, introducing a new automatic feature clustering algorithm, the feature set is divided into several feature clusters. 
Finally, a sequential search algorithm, based on a filter model and a separability measure, builds a relevant and non-redundant feature subset: at each step, a feature is selected, and the features of the same cluster are removed and thus not considered thereafter. This significantly speeds up the selection process, since a large number of redundant features is eliminated at each step. The proposed algorithm uses feature clustering coupled and combined with the search. Experiments using a combination of two well-known texture descriptors, namely Haralick features extracted from Reduced Size Chromatic Co-occurrence Matrices (RSCCMs) and features extracted from Local Binary Pattern (LBP) image histograms, on five color texture data sets (Outex, NewBarkTex, Parquet, STex, and USPtex) demonstrate the efficiency of our method compared to seven state-of-the-art methods in terms of accuracy and computation time.
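The three steps described above can be sketched on synthetic data. This is a simplified stand-in, not the paper's algorithm: a plain class-correlation filter replaces the separability measure, a greedy correlation threshold replaces the automatic clustering algorithm, and the thresholds (0.1 for relevance, 0.8 for redundancy) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 300, 12
X = rng.normal(size=(n, d))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n)  # feature 1 redundant with 0
X[:, 5] = X[:, 4] + 0.05 * rng.normal(size=n)  # feature 5 redundant with 4
y = (X[:, 0] + X[:, 4] > 0).astype(int)        # class depends on features 0 and 4

# step 1: relevance filter -- absolute correlation of each feature with the class
relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(d)])
candidates = [j for j in range(d) if relevance[j] > 0.1]  # drop irrelevant features

# step 2: greedy correlation clustering of the remaining features
corr = np.abs(np.corrcoef(X.T))
clusters = {}
for j in candidates:
    for rep in clusters:
        if corr[j, rep] > 0.8:  # assumed redundancy threshold
            clusters[rep].append(j)
            break
    else:
        clusters[j] = [j]

# step 3: sequential selection -- keep the most relevant feature of each cluster,
# discarding its cluster mates from further consideration
selected = [max(members, key=lambda j: relevance[j]) for members in clusters.values()]
print("selected features:", sorted(selected))
```

Each redundant pair (0/1 and 4/5) contributes exactly one representative to the final subset, which is the effect the cluster-pruning step is designed to achieve.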

Keywords: feature selection, color texture classification, feature clustering, color LBP, chromatic co-occurrence matrix

Procedia PDF Downloads 111
252 Deep Learning for Image Correction in Sparse-View Computed Tomography

Authors: Shubham Gogri, Lucia Florescu

Abstract:

Medical diagnosis and radiotherapy treatment planning using Computed Tomography (CT) rely on the quantitative accuracy and quality of the CT images. At the same time, requirements for CT imaging include reducing the radiation dose to patients and minimizing scanning time. One solution is the sparse-view CT technique, based on a reduced number of projection views. This, however, introduces a new problem: the incomplete projection data result in lower-quality reconstructed images. To tackle this issue, deep learning methods have been applied to enhance the quality of sparse-view CT images. A first approach employed Mir-Net, a deep neural network dedicated to image enhancement. This showed promise, utilizing an intricate architecture comprising encoder and decoder networks along with the Charbonnier loss, but it was computationally demanding. Subsequently, a specialized Generative Adversarial Network (GAN) architecture, rooted in the Pix2Pix framework, was implemented. This GAN framework involves a U-Net-based generator and a discriminator based on convolutional neural networks. To bolster the GAN's performance, both Charbonnier and Wasserstein loss functions were introduced, collectively focusing on capturing minute details while ensuring training stability. The integration of a perceptual loss, calculated from feature vectors extracted by a VGG16 network pretrained on the ImageNet dataset, further enhanced the network's ability to synthesize relevant images. A series of comprehensive experiments with clinical CT data were conducted, exploring various GAN loss functions, including the Wasserstein, Charbonnier, and perceptual losses. The outcomes demonstrated significant image quality improvements, confirmed through pertinent metrics such as the Peak Signal-to-Noise Ratio (PSNR) and the Structural Similarity Index (SSIM) between the corrected images and the ground truth. 
Furthermore, learning curves and qualitative comparisons provided further evidence of the enhanced image quality and the network's increased stability while preserving pixel intensity values. The experiments underscored the potential of deep learning frameworks for enhancing the visual interpretation of CT scans, achieving outcomes with SSIM values close to one and PSNR values reaching up to 76.
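The Charbonnier loss and PSNR metric referred to above have standard closed forms, sketched below with NumPy on synthetic arrays standing in for CT slices (the image data and noise level are illustrative assumptions, not the clinical data).

```python
import numpy as np

def charbonnier_loss(pred, target, eps=1e-3):
    """Charbonnier loss: a smooth, differentiable relative of L1,
    favored in image restoration for preserving fine detail."""
    return np.mean(np.sqrt((pred - target) ** 2 + eps ** 2))

def psnr(pred, target, data_range=1.0):
    """Peak Signal-to-Noise Ratio in dB."""
    mse = np.mean((pred - target) ** 2)
    return 10 * np.log10(data_range ** 2 / mse)

rng = np.random.default_rng(3)
target = rng.random((64, 64))                      # stand-in for a full-view CT slice
pred = target + 0.01 * rng.normal(size=(64, 64))   # stand-in for a corrected sparse-view image

print(f"Charbonnier loss: {charbonnier_loss(pred, target):.4f}")
print(f"PSNR: {psnr(pred, target):.1f} dB")
```

In training, the Charbonnier term would be one component of the composite GAN objective alongside the Wasserstein and perceptual terms; PSNR is evaluated only at validation time.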

Keywords: generative adversarial networks, sparse view computed tomography, CT image correction, Mir-Net

Procedia PDF Downloads 127
251 Water Monitoring Sentinel Cloud Platform: Water Monitoring Platform Based on Satellite Imagery and Modeling Data

Authors: Alberto Azevedo, Ricardo Martins, André B. Fortunato, Anabela Oliveira

Abstract:

Water is under severe threat today because of the rising population, increased agricultural and industrial needs, and the intensifying effects of climate change. Due to sea-level rise, erosion, and demographic pressure, coastal regions are of significant concern to the scientific community. The Water Monitoring Sentinel Cloud platform (WORSICA) service is focused on providing new tools for monitoring water in coastal and inland areas, taking advantage of remote sensing, in situ, and tidal modeling data. WORSICA is a service that can be used to determine the coastline, coastal inundation areas, and the limits of inland water bodies using remote sensing (satellites and Unmanned Aerial Vehicles, UAVs) and in situ data from field surveys. It applies to various purposes, from determining flooded areas (from rainfall, storms, hurricanes, or tsunamis) to detecting large water leaks in major water distribution networks. The service was built on components developed in national and European projects, integrated to provide a one-stop-shop service for remote sensing information, integrating data from the Copernicus satellites and drones/unmanned aerial vehicles, validated by existing online in situ data. Since WORSICA operates on the European Open Science Cloud (EOSC) computational infrastructures, the service can be accessed via a web browser and is freely available to all European public research groups without additional costs. In addition, the private sector will be able to use the service, although some usage costs may apply, depending on the type of computational resources needed by each application/user. 
The service has three main sub-services: i) coastline detection, ii) inland water detection, and iii) water leak detection in irrigation networks. In the present study, an application of the service to the Óbidos lagoon in Portugal is shown, where the user can monitor the evolution of the lagoon inlet and estimate the topography of the intertidal areas without any additional costs. The service has several distinct methodologies implemented, based on the computation of water indexes (e.g., NDWI, MNDWI, AWEI, and AWEIsh) retrieved from satellite image processing. In conjunction with tidal data obtained from the FES model, the system can estimate a coastline at the corresponding water level, or even the topography of the intertidal areas, based on the Flood2Topo methodology. The outcomes of the WORSICA service can be helpful in several intervention areas: i) emergency, by providing fast access to inundated areas to support rescue operations; ii) support of management decisions on the operation of hydraulic infrastructures to minimize damage downstream; iii) climate change mitigation, by minimizing water losses and reducing water mains operation costs; and iv) early detection of water leakages in difficult-to-access irrigation networks, promoting their fast repair.
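Of the water indexes listed above, NDWI is the simplest: a normalized difference of the green and near-infrared bands, with water pixels tending toward positive values. The sketch below shows the computation on a toy reflectance patch; the band values and the zero threshold are illustrative assumptions (operational thresholds vary by scene and sensor), and this is not WORSICA's implementation.

```python
import numpy as np

def ndwi(green, nir):
    """Normalized Difference Water Index: (Green - NIR) / (Green + NIR)."""
    green = green.astype(float)
    nir = nir.astype(float)
    return (green - nir) / (green + nir + 1e-12)  # epsilon avoids division by zero

# toy 2x2 reflectance patch: left column water-like, right column land-like
green = np.array([[0.30, 0.10],
                  [0.28, 0.12]])
nir = np.array([[0.05, 0.40],
                [0.06, 0.35]])

water_mask = ndwi(green, nir) > 0.0  # simple threshold on the index
print(water_mask)
```

In a pipeline like the one described, such a per-pixel mask would then be combined with tide-model levels to place the detected water edge at a known elevation.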

Keywords: remote sensing, coastline detection, water detection, satellite data, sentinel, Copernicus, EOSC

Procedia PDF Downloads 106