Search results for: merging closure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 318


78 Using Rainfall Simulators to Design and Assess the Post-Mining Erosional Stability

Authors: Ashraf M. Khalifa, Hwat Bing So, Greg Maddocks

Abstract:

Changes to the mining environmental approvals process in Queensland have been rolled out under the MERFP Act (2018). This includes requirements for a Progressive Rehabilitation and Closure Plan (PRC Plan). Key considerations of the landform design report within the PRC Plan must include: (i) identification of materials available for landform rehabilitation, including their ability to achieve the required landform design outcomes; (ii) erosion assessments to determine landform heights, gradients, profiles, and material placement; (iii) slope profile design considering the interactions between soil erodibility, rainfall erosivity, landform height, gradient, and vegetation cover to identify acceptable erosion rates over a long-term average; and (iv) an analysis of future stability based on the factors described above, e.g., erosion and/or landform evolution modelling. ACARP funded an extensive erosion assessment program using rainfall simulators from 1998 to 2010. The ACARP program included laboratory assessment of 35 soil and spoil samples from 16 coal mines and samples from a gold mine in Queensland using a 3 × 0.8 m laboratory rainfall simulator. The reliability of the laboratory rainfall simulator was verified through field measurements using larger flumes (20 × 5 m) and catchment-scale measurements at three sites (three different catchments, average area of 2.5 ha each). Soil cover systems are a primary component of a constructed mine landform. The primary functions of a soil cover system are to sustain vegetation and to limit the infiltration of water and oxygen into underlying reactive mine waste. If the external surface of the landform erodes, the functions of the cover system cannot be maintained, and the cover system will most likely fail. Assessing a constructed landform’s potential ‘long-term’ erosion stability requires defensible erosion rate thresholds below which rehabilitation landform designs are considered acceptably erosion-resistant or ‘stable’.
This paper explains the process used to quantify erosion rates with rainfall simulators (flumes) that measure rill and inter-rill erosion on bulk samples under laboratory conditions or on in-situ material under field conditions.
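The interacting factors listed above (erodibility, erosivity, slope length and gradient, cover) are the inputs of RUSLE-style soil-loss relations often used to set such erosion-rate thresholds. A minimal sketch of that kind of check is shown below; all parameter values and the acceptability threshold are illustrative placeholders, not results from the ACARP program.

```python
# Illustrative RUSLE-style soil-loss estimate for a rehabilitated slope:
# A = R * K * LS * C * P (t/ha/yr). All values below are hypothetical
# placeholders, not data from the ACARP program described above.
def annual_soil_loss(R, K, LS, C, P):
    """Return average annual soil loss in t/ha/yr."""
    return R * K * LS * C * P

def is_stable(loss, threshold=10.0):
    """Compare predicted loss against an assumed acceptability threshold."""
    return loss <= threshold

loss = annual_soil_loss(R=3000, K=0.03, LS=2.5, C=0.05, P=1.0)
print(round(loss, 2), is_stable(loss))
```

In practice, the factor values would come from the laboratory and field flume measurements the program describes, and the threshold from a defensible, site-specific stability criterion.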

Keywords: open-cut, mining, erosion, rainfall simulator

Procedia PDF Downloads 70
77 Sweepline Algorithm for Voronoi Diagram of Polygonal Sites

Authors: Dmitry A. Koptelov, Leonid M. Mestetskiy

Abstract:

The Voronoi Diagram (VD) of a finite set of disjoint simple polygons, called sites, is a partition of the plane into loci (one locus per site) – regions consisting of points that are closer to a given site than to all others. A set of polygons is a universal model for many applications in engineering, geoinformatics, design, computer vision, and graphics. The VD of polygons is usually constructed by reduction to the task of constructing the VD of segments, for which effective O(n log n) algorithms exist for n segments. The reduction also includes preprocessing – constructing segments from the polygons’ sides – and postprocessing – constructing each polygon’s locus by merging the loci of its sides. This approach does not take into account two specific properties of the resulting segment sites. Firstly, all these segments are connected in pairs at the vertices of the polygons. Secondly, on one side of each segment lies the interior of the polygon, which is obviously included in its locus. Using these properties in the VD construction algorithm is a resource for reducing computation. This article proposes an algorithm for the direct construction of the VD of polygonal sites. The algorithm is based on the sweepline paradigm, which allows these properties to be exploited effectively. The solution is again performed via reduction. Preprocessing constructs a set of sites from the vertices and edges of the polygons; each site is oriented so that the interior of the polygon lies to its left. The proposed algorithm constructs the VD for this set of oriented sites with the sweepline paradigm. Postprocessing selects the edges of this VD formed by the centers of empty circles touching different polygons. The improved efficiency of the proposed sweepline algorithm over the general Fortune algorithm is achieved through the following fundamental solutions: 1. The algorithm constructs only those VD edges that lie outside the polygons;
the concept of oriented sites avoids constructing VD edges located inside the polygons. 2. The event list in the sweepline algorithm has a special property: the majority of events are connected with “medium” polygon vertices, where one incident polygon side lies behind the sweepline and the other in front of it. The proposed algorithm processes such events in constant rather than logarithmic time, unlike the general Fortune algorithm. The proposed algorithm is fully implemented and tested on a large number of examples. Its high reliability and efficiency are also confirmed by computational experiments with complex sets of several thousand polygons. It should be noted that, despite the considerable time that has passed since the publication of Fortune's algorithm in 1986, a full-scale implementation of this algorithm for an arbitrary set of segment sites has not been made. The proposed algorithm fills this gap for an important special case – a set of sites formed by polygons.
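The defining property of a locus – the set of points closer to one segment site than to all others – can be illustrated without the sweepline machinery by a brute-force nearest-segment classification. The sketch below is only an illustration of the locus definition, not the proposed algorithm; the two example segments are arbitrary.

```python
# Brute-force illustration of segment-site Voronoi loci (NOT the sweepline
# algorithm itself): classify points by their nearest segment site.
import math

def dist_point_segment(p, a, b):
    """Euclidean distance from point p to segment ab."""
    px, py = p; ax, ay = a; bx, by = b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Clamp the projection parameter to [0, 1] to stay on the segment.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def locus_of(p, sites):
    """Index of the site whose locus contains p (the nearest site)."""
    return min(range(len(sites)), key=lambda i: dist_point_segment(p, *sites[i]))

sites = [((0, 0), (0, 4)), ((6, 0), (6, 4))]  # two vertical segments
print(locus_of((1, 2), sites), locus_of((5, 2), sites))
```

This O(n) per-query classification is exactly what the O(n log n) sweepline construction avoids having to repeat point by point.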

Keywords: Voronoi diagram, sweepline, polygon sites, Fortune's algorithm, segment sites

Procedia PDF Downloads 143
76 Complimentary Allusions: Shawl Scenes in Rossellini, Lean, Fellini, Kubrick, and Bertolucci Films

Authors: Misha Nedeljkovich

Abstract:

In the film’s famous scene (Roma città aperta, 1945), Pina (Anna Magnani) collapses in the street when machine-gunned by a German soldier. Her son Marcello (Vito Annichiarico) tries to revive her. Her death signals not closure but the cycle of life; Marcello saves Francesco with the shawl taken from his mother’s corpse. One pivotal scene in Brief Encounter (1945) occurs in the apartment of Alec’s (Trevor Howard) friend Stephen (Valentine Dyall), when Stephen returns to catch Alec and Laura (Celia Johnson) together alone. David Lean directs this scene using her shawl as a sign of in flagrante delicto. In La Strada (1954), Gelsomina (Giulietta Masina) was waving goodbye when her mother, sensing impending doom, changed her mind and desperately tried to stop her, waving her shawl: Don’t go, my daughter! Your shawl! Your shawl! Gelsomina refuses to return, waving back: It’s time to go! Stanley Kubrick’s tale of a boxer who crosses a mobster to win the heart of a lady, Killer’s Kiss (1955), reminds us that Times Square used to contain sweaty boxing gyms and dance halls. The film’s longest Times Square interlude is its oddest: the boxer Davey Gordon, played by Jamie Smith, has his shawl stolen by two playful men in Shriners’ hats who are silent except for one who blows a harmonica, faintly heard over honking cabs and overheard conversations. This long sequence appears to join the directors’ shawl conversation with Kubrick’s own twist. The principal characters will never know why all this happened to them that evening. Love, death, happiness, and everlasting misery: all of it is caused by Davey’s shawl. Finally, the decade of cinematic shawl conversations concludes in Bertolucci’s Before the Revolution (Prima della rivoluzione, 1964). One of his characters lifts up a shawl, asking if it was Rossellini’s shawl.
I argue that exploring complimentary allusions in a film, where directors acknowledge their own great debt to another film or filmmaker, will further our knowledge of film history, adding both depth and resonance to the great works of cinema.

Keywords: allusions, Bertolucci, Fellini, homage, Kubrick, Lean, Rossellini

Procedia PDF Downloads 365
75 Web and Smart Phone-based Platform Combining Artificial Intelligence and Satellite Remote Sensing Data to Geoenable Villages for Crop Health Monitoring

Authors: Siddhartha Khare, Nitish Kr Boro, Omm Animesh Mishra

Abstract:

Recent food price hikes may signal the end of an era of predictable global grain crop plenty, due to climate change, population expansion, and dietary changes. Food consumption will treble in 20 years, requiring enormous production expenditures. Rainfall and seasonal cycles have shifted over the past decade, changing climate and atmospheric conditions. India's tropical agriculture relies on evapotranspiration and monsoons. In places with limited resources, global environmental change affects agricultural productivity and farmers' capacity to adjust to changing moisture patterns. Motivated by these difficulties, satellite remote sensing might be combined with near-surface imaging data (smartphones, UAVs, and PhenoCams) to enable phenological monitoring and fast evaluation of the field-level consequences of extreme weather events on smallholder agricultural output. To accomplish this, we must digitally map the agricultural field boundaries and crop types of every community. With the improvement of satellite remote sensing technologies, a geo-referenced database can be created for rural Indian agricultural fields. Using AI, we can design digital agricultural solutions for individual farms. The main objective is to geo-enable each farm, along with its seasonal crop information, by combining artificial intelligence (AI) with satellite and near-surface data, and then to support long-term crop monitoring through in-depth field analysis and scanning of fields with satellite-derived vegetation indices. We developed an AI-based algorithm to understand the time-lapse-based growth of vegetation using PhenoCam or smartphone images. We developed an Android application through which users can collect images of their fields; these images are sent to our local server, where further AI-based processing is done.
We are creating digital boundaries of individual farms and connecting these farms with our smartphone application to collect information about farmers and their crops in each season. We extract satellite-based information for each farm through Google Earth Engine APIs, merge it with the tested-crop data collected by our app according to each farm's location, and build a database that reports crop quality by location.
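The satellite-derived vegetation indices mentioned above typically include NDVI, computed from the red and near-infrared bands. A minimal sketch follows; the band reflectances and the health-label thresholds are illustrative assumptions, not values from the platform described here.

```python
# NDVI = (NIR - Red) / (NIR + Red): a standard satellite vegetation index.
# Reflectance values and classification thresholds below are illustrative.
def ndvi(nir, red):
    """Normalized difference vegetation index in [-1, 1]."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def crop_health(value):
    """Rough, assumed thresholds for a per-farm health label."""
    if value > 0.6:
        return "healthy"
    if value > 0.3:
        return "moderate"
    return "stressed"

v = ndvi(nir=0.55, red=0.08)
print(round(v, 3), crop_health(v))
```

In a deployment, per-farm band statistics would be fetched for each digitized field boundary and the resulting index time series stored in the crop-quality database.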

Keywords: artificial intelligence, satellite remote sensing, crop monitoring, android and web application

Procedia PDF Downloads 63
74 A Mixed Method Approach for Modeling Entry Capacity at Rotary Intersections

Authors: Antonio Pratelli, Lorenzo Brocchini, Reginald Roy Souleyrette

Abstract:

A rotary is a traffic circle intersection where vehicles entering from branches give priority to circulating flow. Vehicles entering the intersection from converging roads move around the central island and weave out of the circle into their desired exiting branch. This creates merging and diverging conflicts between any entry and its successive exit, i.e., a section. Therefore, rotary capacity models are usually based on the weaving of the different movements in each section of the circle, and the maximum rate of flow is then related to each weaving section of the rotary. Nevertheless, the single-section capacity value does not yield the typical performance characteristics of the intersection, such as the entry average delay, which is directly linked to its level of service. From another point of view, modern roundabout capacity models are based on the limitation of the flow entering from a single entrance by the amount of flow circulating in front of that entrance, and they generally also lead to a performance evaluation. This paper aims to incorporate a modern roundabout capacity model into an old rotary capacity method in order to obtain from the latter the single-entry capacity and ultimately the related performance indicators. Put simply, the main objective is to calculate the average delay of each roundabout entrance so as to apply the most common Highway Capacity Manual (HCM) criteria. The paper is organized as follows: first, the rotary and roundabout capacity models are sketched, and a brief introduction to the model combination technique is given with some practical instances. The next section summarizes the old TRRL rotary capacity model and the most recent HCM 7th Edition modern roundabout capacity model.
Then, the two models are combined through an iteration-based algorithm, specially set up and linked to the concept of roundabout total capacity, i.e., the value reached under a traffic flow pattern leading to the simultaneous congestion of all roundabout entrances. The solution is the average delay for each entrance of the rotary, from which its respective level of service is estimated. In view of further experimental applications, at this research stage the collection of existing rotary intersections operating under the priority-to-circle rule has already begun, both in the US and in Italy. The rotaries have been selected by direct inspection of aerial photos through a map viewer, namely Google Earth. Each instance has been recorded by location, urban or rural setting, and its main geometric patterns. Finally, concluding remarks are drawn, and some further research developments are discussed.
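The entry-capacity and delay side of such a combination can be sketched with an HCM-style exponential capacity model and the HCM control-delay expression. The coefficients below are the commonly cited single-lane values, used here only as an illustration and not calibrated for a rotary; the demand and circulating flows are invented.

```python
import math

def entry_capacity(conflicting_flow, A=1380.0, B=1.02e-3):
    """HCM-style exponential entry capacity c = A*exp(-B*v_c) in veh/h.
    A and B are illustrative single-lane coefficients (an assumption)."""
    return A * math.exp(-B * conflicting_flow)

def control_delay(v, c, T=0.25):
    """HCM-style average control delay (s/veh) for entry demand v and
    capacity c over analysis period T hours."""
    x = v / c  # degree of saturation
    return (3600.0 / c
            + 900.0 * T * ((x - 1) + math.sqrt((x - 1) ** 2
                                               + (3600.0 / c) * x / (450.0 * T)))
            + 5.0 * min(x, 1.0))

c = entry_capacity(500)      # circulating flow in front of the entry (veh/h)
d = control_delay(400, c)    # entry demand of 400 veh/h
print(round(c), round(d, 1))
```

In the combined method, the circulating flow fed to each entry would itself come from the rotary weaving model, iterated until the entry capacities and section flows are mutually consistent.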

Keywords: mixed methods, old rotary and modern roundabout capacity models, total capacity algorithm, level of service estimation

Procedia PDF Downloads 50
73 The Impact of COVID-19 on Italian Tourism: the Current Scenario, Opportunity and Future Tourism Organizational Strategies

Authors: Marco Camilli

Abstract:

This article examines the impact of the COVID-19 pandemic outbreak on the tourism sector in Italy, analyzing the current scenario, government decisions, and private companies' reactions for the 2020 summer season. The data analyzed show how massive the impact of the pandemic outbreak on tourism revenue is, and the weaknesses of the measures proposed. The current COVID-19 scenario is a shocking one for the tourism and transportation sectors, which could be the most affected by the coronavirus in Italy. According to forecasts, depending on the duration of the epidemic outbreak and the lockdown strategy applied by the Government, businesses in the supply chain could lose between 24 and 66 billion euros in turnover in the period 2020-21, with huge, diversified impacts at the national and regional level. Many tourist companies are on the verge of survival, and without massive government measures they risk closure. Considering the two-year period 2020-21, companies operating in the travel and tourism sector (tour operators, travel agencies, hotels, guides, bus companies, etc.) could suffer revenue losses of 24 to 64 billion euros, especially in sectors such as travel agencies, hotels, and rentals. The Statista Research Department estimated in April 2020 that the coronavirus (COVID-19) pandemic will have a significant impact on the revenues of the tourism industry in Italy. Revenues are expected to decrease by over 40 billion euros in the first semester of 2020, compared to the same period of the previous year. According to the study, hotel and non-hotel accommodations will experience the highest loss.
Revenues of this sector are expected to decrease by 13 billion euros compared to the first semester of 2019, when accommodations registered revenues of about 17 billion euros. According to Statista.com, in 2020 Italy is expected to register a decrease of roughly 28.5 million tourist arrivals due to the impact of coronavirus (COVID-19) on the country's tourism sector. According to the estimate, the region of Veneto will record the highest drop, with a decrease of roughly 4.61 million arrivals. Similarly, Lombardy is expected to register a decrease of about 3.87 million arrivals in 2020.

Keywords: travel and tourism, sustainability, COVID-19, businesses, transportation

Procedia PDF Downloads 161
72 Placement Characteristics of Major Stream Vehicular Traffic at Median Openings

Authors: Tathagatha Khan, Smruti Sourava Mohapatra

Abstract:

Median openings are provided in the raised medians of multilane roads to facilitate U-turn movement. The U-turn is a highly complex and risky maneuver because the U-turning vehicle (minor stream) makes a 180° turn at the median opening and merges with the approaching through traffic (major stream). A U-turning vehicle requires a suitable gap in the major stream to merge, and during this process the possibility of a merging conflict develops. Therefore, median openings are potential hot spots of conflict and pose safety concerns. Traffic at median openings can be managed efficiently, and with enhanced safety, when the capacity of the facility has been estimated correctly. The capacity of U-turns at median openings is estimated by Harder's formula, which requires three basic parameters, namely critical gap, follow-up time, and conflicting flow rate. The estimation of the conflicting flow rate under mixed traffic conditions is complicated by the absence of lane discipline and by discourteous driver behavior. Understanding the placement of major stream vehicles at a median opening is therefore important for estimating the conflicting traffic faced by the U-turning movement. Placement data of major stream vehicles were collected at different sections of 4-lane and 6-lane divided multilane roads. All the test sections were free from the effects of intersections, bus stops, parked vehicles, curvature, pedestrian movements, or any other side friction. For the purpose of analysis, vehicles were divided into six categories: motorized two-wheelers, autorickshaws (three-wheelers), small cars, big cars, light commercial vehicles, and heavy vehicles. For the collection of placement data, the entire road width was divided into strips of 25 cm each, numbered seriatim from the pavement edge (curbside) to the end of the road.
The placement of major stream vehicles crossing the reference line was recorded by videographic technique on various weekdays. The collected data for each category of vehicle at every test section were converted into a frequency table with a class interval of 25 cm, and a placement frequency curve was plotted. Separate distribution fittings were tried for 4-lane and 6-lane divided roads. The effect of major stream traffic volume on the placement characteristics of major stream vehicles was also explored. The findings of this study will help determine the conflicting volume at median openings. The present work therefore holds significance for traffic planning, operation, and design to alleviate bottlenecks, collision risk, and delay at median openings in general, and at median openings in developing countries in particular.
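The binning step described above, assigning each observed lateral placement to a 25 cm strip measured from the curbside edge, can be sketched as follows; the placement values are invented for illustration.

```python
# Build a frequency table of lateral vehicle placements using 25 cm class
# intervals measured from the curbside pavement edge (values illustrative).
from collections import Counter

def bin_placements(placements_cm, width=25):
    """Map each lateral placement (cm) to its 1-based 25 cm strip number
    and count occurrences per strip."""
    return Counter(int(p // width) + 1 for p in placements_cm)

placements = [30, 55, 60, 110, 115, 120, 260]  # cm from pavement edge
table = bin_placements(placements)
print(sorted(table.items()))
```

Per-category tables built this way feed directly into the distribution fitting and the placement frequency curves described in the study.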

Keywords: median opening, U-turn, conflicting traffic, placement, mixed traffic

Procedia PDF Downloads 110
71 Congruency of English Teachers’ Assessments Vis-à-Vis 21st Century Skills Assessment Standards

Authors: Mary Jane Suarez

Abstract:

A massive educational overhaul has taken place at the onset of the 21st century, addressing the mismatch between employability skills and the scholastic skills taught in schools. For a community to thrive in an ever-developing economy, every educational institution should teach the skills necessary for job competency. However, in harnessing 21st-century skills among learners, teachers often lack familiarity with and thorough insight into the emerging 21st-century skills, and are constrained by the need to comprehend the characteristics of 21st-century skills learning and the requirement to implement the tenets of 21st-century skills teaching. To espouse 21st-century skills learning and teaching, a United States-based national coalition, the Partnership for 21st Century Skills (P21), has identified the four most important skills in 21st-century learning: critical thinking, communication, collaboration, and creativity and innovation, with an established framework for 21st-century skills standards. Assessment of skills is the lifeblood of every teaching and learning encounter. It is correspondingly crucial to look at the 21st-century standards and the assessment guides recognized by P21 to ensure that learners are 21st-century ready. This mixed-method study sought to discover and describe the classroom assessments used by English teachers in a public secondary school in the Philippines with course offerings in science, technology, engineering, and mathematics (STEM). The research evaluated the assessment tools implemented by English teachers and how these tools were congruent with the 21st-century assessment standards of P21. A convergent parallel design was used to analyze assessment tools and practices in four phases.
In the data-gathering phase, survey questionnaires, document reviews, interviews, and classroom observations were used to gather quantitative and qualitative data simultaneously on how assessment tools and practices were consistent with the P21 framework, with the four Cs as its foci. In the analysis phase, the data were treated using mean, frequency, and percentage. In the merging and interpretation phases, a side-by-side comparison was used to identify convergent and divergent aspects of the results. In conclusion, the results revealed assessment tools and practices that were inconsistently used by teachers, if used at all. Findings showed inconsistencies in implementing authentic assessments, a scarcity of rubrics for critically assessing 21st-century skills in both language and literature subjects, incongruencies in using portfolio and self-reflective assessments, the exclusion of intercultural aspects in assessing the four Cs, and a lack of integration of collaboration into formative and summative assessments. As a recommendation, a harmonized assessment scheme for P21 skills was fashioned for teachers to plan, implement, and monitor classroom assessments of 21st-century skills, ensuring the alignment of such assessments with P21 standards and furthering the institution's thrust to integrate 21st-century skills assessment standards into its curricula.

Keywords: 21st-century skills, 21st-century skills assessments, assessment standards, congruency, four Cs

Procedia PDF Downloads 161
70 Resilience of the American Agriculture Sector

Authors: Dipak Subedi, Anil Giri, Christine Whitt, Tia McDonald

Abstract:

This study aims to understand the impact of the pandemic on the overall economic well-being of the agricultural sector of the United States. The two key metrics used to examine economic well-being are the bankruptcy rate of U.S. farm operations and the operating profit margin. One of the primary reasons for U.S. farm operations to file for bankruptcy is continuous negative profit or a significant decrease in profit. The pandemic caused significant supply and demand shocks in the domestic market. Furthermore, ongoing trade disruptions, especially with China, also affected the prices of agricultural commodities. The significantly reduced demand for ethanol and the closure of meat processing plants affected both livestock and crop producers. This study uses court data to examine the bankruptcy rate of U.S. farm operations over time. Preliminary results suggest there was no increase in farm operations filing for bankruptcy in 2020, most likely because of record-high government payments to producers: the Federal Government made direct payments of more than $45 billion in 2020. One commonly used economic metric of farm profitability is the operating profit margin (OPM), which measures profit as a share of the total value of production and government payments. The Economic Research Service of the United States Department of Agriculture defines a farm operation to be in (a) a high-risk zone if the OPM is less than 10 percent and (b) a low-risk zone if the OPM is higher than 25 percent. For this study, OPM was calculated for small, medium, and large-scale farm operations using data from the Agricultural Resource Management Survey (ARMS). Results show that, except for small family farms, the share of farms in the high-risk zone decreased in 2020 compared to the most recent non-pandemic year, 2019, most likely due to higher commodity prices at the end of 2020 and record-high government payments.
Further investigation suggests that smaller farm operations received lower average government payments, resulting in a large share of them (over 70 percent) remaining in the high-risk zone. This study should be of interest to multiple stakeholders, including policymakers across the globe, as it shows the resilience of the U.S. agricultural system as well as (some of) the impact of government payments.
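The OPM calculation and the ERS risk-zone thresholds described above reduce to a simple computation; the income figures in the example are invented for illustration.

```python
# Operating profit margin as a share of gross income (value of production
# plus government payments), with the ERS risk-zone thresholds cited above.
def operating_profit_margin(net_income, gross_income):
    """OPM in percent."""
    return 100.0 * net_income / gross_income

def risk_zone(opm):
    """ERS thresholds: <10% high risk, >25% low risk, otherwise medium."""
    if opm < 10:
        return "high-risk"
    if opm > 25:
        return "low-risk"
    return "medium-risk"

opm = operating_profit_margin(net_income=30_000, gross_income=400_000)
print(round(opm, 1), risk_zone(opm))
```

Adding a government payment to both the numerator and denominator of such a calculation is what drove many operations out of the high-risk zone in 2020.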

Keywords: U.S. farm sector, COVID-19, operating profit margin, farm bankruptcy, ag finance, government payments to the farm sector

Procedia PDF Downloads 64
69 Covid-19 Lockdown Experience of Elderly Female as Reflected in Their Artwork

Authors: Liat Shamri-Zeevi, Neta Ram-Vlasov

Abstract:

Today the world as a whole is attempting to cope with COVID-19, which has affected all facets of personal and social life, from country-wide confinement to maintaining social distance and taking protective hygiene measures. One of the populations facing the most severe restrictions is seniors. Various studies have shown that creativity plays a crucial role in dealing with crisis events. Painting, regardless of media, allows emotional and cognitive processing of such situations and enables the expression of experiences in a tangible, creative way that conveys and endows meaning to the artwork. The current study was conducted in Israel immediately after a six-week lockdown. It was designed to examine the impact of the COVID-19 pandemic on the quality of life of elderly women as reflected in their artworks. The sample was composed of 21 Israeli women aged 60-90 in good mental health (without diagnosed dementia or Alzheimer's disease), all of whom were Hebrew-speaking and retired with an extended family, and who indicated that they painted and had engaged in artwork on an ongoing basis throughout the lockdown (March 12 to May 30, 2020). The participants' artworks were collected, and a semi-structured in-depth interview lasting one to two hours was conducted; the participants were asked about their feelings during the pandemic and the artworks they produced during this time, and completed a questionnaire on well-being and mental health. The initial analysis of the interviews and artworks revealed themes related to the specific role of each piece of artwork. The first theme included notions that the artwork was an activity and a framework for doing, which supported positive emotions and provided a sense of vitality during the closure. Most of the participants painted images of nature and growth, which were ascribed concrete and symbolic meaning.
The second theme was that the artwork enabled the processing of difficult and/or conflicting emotions related to the situation, including anxiety about death and loneliness, expressed symbolically in the artworks through images such as the coronavirus and ventilators. The third theme suggested that the time and space prompted by the lockdown gave the participants an opportunity for gathering the self and freed up time for creative activities. Many participants stated that they painted more, and more frequently, during the lockdown. Additional themes and findings will be presented at the conference.

Keywords: coronavirus, artwork, quality of life of the elderly

Procedia PDF Downloads 116
68 Developing Curricula for Signaling and Communication Course at Malaysia Railway Academy (MyRA) through Industrial Collaboration Program

Authors: Mohd Fairus Humar, Ibrahim Sulaiman, Pedro Cruz, Hasry Harun

Abstract:

This paper presents the proposed knowledge transfer program on railway signaling and communication by the Original Equipment Manufacturer (OEM) Thales Portugal. The fundamental issue is that no rail-related courses are offered by local universities and colleges in Malaysia that could serve students as a career path option. Currently, dedicated training related to rail technology is provided by in-house training academies established by the respective rail operators, such as the Malaysia Railway Academy (MyRA) and the Rapid Rail Training Centre. The content of this training and its facilities need to be strengthened to keep up to date with the dynamic evolution of rail technology, because rail products have become more sophisticated and embedded with high-technology components: they no longer exist in mechanical form alone but combine electronics, information technology, and more. This demands a workforce imbued with the knowledge, multiple skills, and competency to deal with specialized technical areas, and such talent is needed to support sustainability in Southeast Asia. With these factors in mind, an Industrial Collaboration Program (ICP) was carried out to transfer knowledge of railway signaling and communication curricula to selected railway operators and a tertiary educational institution in Malaysia. To achieve this aim, a two-year partnership was formed between the Technical Depository Agency (TDA), Thales Portugal, and MyRA, with three main stages of program implementation: (i) one month of training on basic railway signaling and communication with Thales in Malaysia; (ii) four months of training on advanced railway signaling and communication with Thales in Portugal; and (iii) a series of workshops.
Two workshops were convened to develop and harmonize the curricula of the railway signaling and communication course, followed by one training session on the installation of railway signaling equipment and the Controlled Train Centre (CTC) system from Thales Portugal. With active involvement from the TDA, railway operators, universities, and colleges in planning, executing, monitoring, controlling, and closing the program, the module for the railway signaling and communication course was developed, together with a laboratory of railway signaling field equipment and a CTC simulator. Through this program, contributions from various parties helped to build committed communities engaged with important issues in railway signaling and communication, towards creating a sustainable future.

Keywords: knowledge transfer program, railway signaling and communication, curricula, module and teaching aid simulator

Procedia PDF Downloads 161
67 Revolutionizing Healthcare Communication: The Transformative Role of Natural Language Processing and Artificial Intelligence

Authors: Halimat M. Ajose-Adeogun, Zaynab A. Bello

Abstract:

Artificial Intelligence (AI) and Natural Language Processing (NLP) have transformed computer language comprehension, allowing computers to comprehend spoken and written language with human-like cognition. NLP, a multidisciplinary area that combines rule-based linguistics, machine learning, and deep learning, enables computers to analyze and comprehend human language. NLP applications in medicine range from tackling issues in electronic health records (EHR) and psychiatry to improving diagnostic precision in orthopedic surgery and optimizing clinical procedures with novel technologies like chatbots. The technology shows promise in a variety of medical sectors, including quicker access to medical records, faster decision-making for healthcare personnel, diagnosing dysplasia in Barrett's esophagus, boosting radiology report quality, and so on. However, successful adoption requires training for healthcare workers, fostering a deep understanding of NLP components, and highlighting the significance of validation before actual application. Despite prevailing challenges, continuous multidisciplinary research and collaboration are critical for overcoming restrictions and paving the way for the revolutionary integration of NLP into medical practice. This integration has the potential to improve patient care, research outcomes, and administrative efficiency. The research methodology includes using NLP techniques for Sentiment Analysis and Emotion Recognition, such as evaluating text or audio data to determine the sentiment and emotional nuances communicated by users, which is essential for designing a responsive and sympathetic chatbot. Furthermore, the project includes the adoption of a Personalized Intervention strategy, in which chatbots are designed to personalize responses by merging NLP algorithms with specific user profiles, treatment history, and emotional states. 
The synergy between NLP and personalized medicine principles is critical for tailoring chatbot interactions to each user's demands and conditions, thereby increasing the efficacy of mental health care. A detailed survey corroborated this synergy, revealing a 20% increase in patient satisfaction levels and a 30% reduction in workload for healthcare practitioners. The survey, which focused on health outcomes and was administered to both patients and healthcare professionals, highlights the improved efficiency and the favorable influence on the broader healthcare ecosystem.
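The abstract does not specify how its sentiment analysis is implemented; a minimal rule-based sketch of the kind of scoring step it describes might look as follows. The tiny lexicon, word scores, and thresholds are illustrative assumptions, not the authors' actual model.

```python
# Minimal sentiment-scoring sketch for chatbot input. The lexicon and
# thresholds are hypothetical stand-ins for a trained NLP model.

# Hypothetical polarity lexicon: word -> score in [-1, 1].
LEXICON = {
    "good": 0.7, "great": 0.9, "calm": 0.5, "hopeful": 0.6,
    "bad": -0.7, "anxious": -0.8, "sad": -0.6, "hopeless": -0.9,
}

def sentiment(text: str) -> str:
    """Average the polarity of known words and map to a coarse label."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    scores = [LEXICON[w] for w in words if w in LEXICON]
    if not scores:
        return "neutral"
    mean = sum(scores) / len(scores)
    if mean > 0.2:
        return "positive"
    if mean < -0.2:
        return "negative"
    return "neutral"

print(sentiment("I feel anxious and hopeless today"))  # negative
print(sentiment("Therapy went great, I am hopeful"))   # positive
```

A production chatbot would replace the lexicon lookup with a trained classifier, but the control flow, scoring text and branching the response on the detected emotional state, is the same.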

Keywords: natural language processing, artificial intelligence, healthcare communication, electronic health records, patient care

Procedia PDF Downloads 41
66 Achieving Product Robustness through Variation Simulation: An Industrial Case Study

Authors: Narendra Akhadkar, Philippe Delcambre

Abstract:

In power protection and control products, assembly process variations due to individual parts manufactured from single- or multi-cavity tooling are a major problem. The dimensional and geometrical variations on the individual parts, in the form of manufacturing tolerances and assembly tolerances, are sources of clearance in the kinematic joints, polarization effects in the joints, and tolerance stack-up. All these variations adversely affect product quality, functionality, cost, and time-to-market. Variation simulation analysis may be used in the early product design stage to predict such uncertainties. Usually, variations exist in both manufacturing processes and materials. In tolerance analysis, the effect of the dimensional and geometrical variations of the individual parts on the functional characteristics (conditions) of the final assembled product is studied. A functional characteristic of the product may be affected by a set of interrelated dimensions (functional parameters) that usually form a geometrical closure in a 3D chain. In power protection and control products, the prerequisite is that when a fault occurs in the electrical network, the product must respond quickly to break the circuit and clear the fault; the response time is typically in milliseconds. Any failure to clear the fault may result in severe damage to the equipment or network, and human safety is at stake. In this article, we investigate two important functional characteristics associated with the robust performance of the product. Experimental data obtained at the Schneider Electric laboratory demonstrate the very good prediction capability of the variation simulation performed using CETOL (tolerance analysis software) in an industrial context. In particular, this study allows design engineers to better understand the critical parts in the product that need to be manufactured with capable tolerances.
On the contrary, some parts are not critical for the functional characteristics (conditions) of the product, and relaxing their tolerances may reduce manufacturing cost while still ensuring robust performance. Capable tolerancing is one of the most important aspects of product and manufacturing process design. In the case of a miniature circuit breaker (MCB), the product's quality and robustness are mainly impacted by two aspects: (1) the allocation of design tolerances between the components of a mechanical assembly and (2) the manufacturing tolerances in the intermediate machining steps of component fabrication.
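A common way to study a tolerance stack-up of the kind described above is a Monte Carlo simulation: sample each part dimension from its tolerance band and check how often the assembly-level functional condition is violated. The sketch below uses invented dimensions and limits, not the MCB case-study values, and treats each ± tolerance as a 3-sigma band.

```python
# Monte Carlo tolerance stack-up sketch: estimate how often an assembly gap
# falls outside its functional limits when part dimensions vary.
# All numbers below are illustrative assumptions.
import random
import statistics

random.seed(42)

# Nominal dimension and +/- tolerance (interpreted as 3-sigma) per part, mm.
PARTS = [(10.0, 0.05), (25.0, 0.10), (8.0, 0.04)]
HOUSING = 43.30            # hypothetical housing length the parts stack into
GAP_LIMITS = (0.05, 0.45)  # functional condition: gap must stay in range

def sample_gap() -> float:
    """One random realisation of the assembly gap."""
    stack = sum(random.gauss(nom, tol / 3.0) for nom, tol in PARTS)
    return HOUSING - stack

gaps = [sample_gap() for _ in range(100_000)]
out_of_spec = sum(g < GAP_LIMITS[0] or g > GAP_LIMITS[1] for g in gaps)
print(f"mean gap = {statistics.mean(gaps):.4f} mm")
print(f"out-of-spec rate = {out_of_spec / len(gaps):.4%}")
```

Dedicated tools such as CETOL additionally propagate geometrical (not just dimensional) variation through the 3D chain, but the statistical principle is the one sketched here.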

Keywords: geometrical variation, product robustness, tolerance analysis, variation simulation

Procedia PDF Downloads 139
65 The Use of Social Stories and Digital Technology as Interventions for Autistic Children: A State-of-the-Art Review and Qualitative Data Analysis

Authors: S. Hussain, C. Grieco, M. Brosnan

Abstract:

Background and Aims: Autism is a complex neurobehavioural disorder characterised by impairments in the development of language and communication skills. The study involved a state-of-the-art systematic review, in addition to qualitative data analysis, to establish the evidence for social stories as an intervention strategy for autistic children. An up-to-date review of the use of digital technologies in the delivery of interventions to autistic children was also carried out, to assess whether digital technologies and social stories can improve intervention outcomes for autistic children. Methods: Two student researchers reviewed a range of randomised controlled trials and observational studies. The aim of the review was to establish whether there was adequate evidence to justify recommending social stories to autistic patients. The students devised their own search strategies across a range of search engines, including Ovid MEDLINE, Google Scholar and PubMed, and then critically appraised the generated literature. Additionally, qualitative data obtained from a comprehensive online questionnaire on social stories were thematically analysed. The thematic analysis was carried out independently by each researcher using a 'bottom-up' approach: each contributor read the responses to a given question, derived semantic themes from them, and placed each response into a semantic theme or sub-theme. The students then met to discuss merging their theme headings. Inter-rater reliability (IRR) was calculated before and after theme headings were merged, giving IRR pre- and post-discussion. Lastly, the thematic analysis was assessed by a third researcher, a professor of psychology and director of the Centre for Applied Autism Research at the University of Bath.
Results: The review of the literature and the thematic analysis of qualitative data both found evidence supporting social story use. The thematic analysis uncovered interesting themes in the questionnaire responses relating to the reasons why social stories were used and the factors influencing their effectiveness in each case. Overall, however, the evidence for digital technology interventions was limited, and the literature could not establish a causal link between the use of technologies and better intervention outcomes for autistic children, although it did offer plausible theories for the suitability of digital technologies for autistic children. Conclusions: Overall, the review concluded that there was adequate evidence to justify advising the use of social stories with autistic children. The role of digital technologies is a fast-emerging field and appears to be a promising method of intervention for autistic children; however, it should not yet be considered an evidence-based approach. Using this research, the students developed ideas for social story interventions that aim to help autistic children.
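The abstract reports IRR for the two coders' theme assignments without naming the statistic; a standard choice for two raters over nominal categories is Cohen's kappa, sketched below. The theme labels and codings are made up for illustration.

```python
# Inter-rater reliability sketch: Cohen's kappa for two researchers assigning
# questionnaire responses to theme headings (chance-corrected agreement).
# The labels and codings below are hypothetical.
from collections import Counter

def cohen_kappa(coder_a, coder_b):
    """Agreement between two coders, corrected for chance agreement."""
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    expected = sum(freq_a[l] / n * freq_b[l] / n for l in labels)
    return (observed - expected) / (1 - expected)

pre  = ["benefit", "barrier", "benefit", "context", "barrier", "benefit"]
post = ["benefit", "barrier", "benefit", "barrier", "barrier", "benefit"]
print(f"kappa = {cohen_kappa(pre, post):.2f}")  # kappa = 0.71
```

Computing kappa before and after the merge discussion, as the study does, shows how much of the coders' disagreement the discussion resolved.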

Keywords: autistic children, digital technologies, intervention, social stories

Procedia PDF Downloads 93
64 Students’ Speech Anxiety in Blended Learning

Authors: Mary Jane B. Suarez

Abstract:

Public speaking anxiety (PSA), also known as speech anxiety, is persistently common in traditional communication classes, especially for students who learn English as a second language. This anxiety intensifies when communication skills assessments move to an online or remote mode of learning, as they did during the COVID-19 pandemic. Both teachers and students have faced vast ambiguity about how to teach and learn speaking skills effectively amidst the pandemic. Communication skills assessments like public speaking, oral presentations, and student reporting have taken on new meaning through Google Meet, Zoom, and other online platforms. Though such technologies have paved the way for more creative ways for students to acquire and develop communication skills, their effectiveness as assessment tools stands in question. This mixed-methods study aimed to determine the factors that affected the public speaking skills of students in a communication class; to probe the assessment gaps in evaluating the speaking skills of students attending online classes vis-à-vis the implementation of remote and blended modalities of learning; and to recommend ways to address students' public speaking anxieties when performing a speaking task online and to bridge the assessment gaps identified by the study, in order to achieve a smooth segue from online to on-ground instruction toward a better post-pandemic academic milieu. Using a convergent parallel design, quantitative and qualitative data were reconciled by probing the public speaking anxiety of students and the potential assessment gaps encountered in an online English communication class under remote and blended learning. There were four phases in applying the convergent parallel design.
The first phase was data collection, where both quantitative and qualitative data were gathered using document reviews and focus group discussions. The second phase was data analysis: quantitative data were treated statistically, particularly with frequency, percentage, and mean, using Microsoft Excel and IBM SPSS version 19, and qualitative data were examined using thematic analysis. The third phase merged the analysis results to compare the desired learning competencies against the actual learning competencies of students. Finally, the fourth phase interpreted the merged data, leading to the finding that a significantly high percentage of students experienced public speaking anxiety whenever they delivered speaking tasks online. Assessment gaps were also identified by comparing the desired learning competencies of the formative and alternative assessments implemented against the actual speaking performances of students, showing that students' public speaking anxiety was not properly identified and addressed.
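The quantitative treatment named above (frequency, percentage, mean) is simple enough to sketch directly. The Likert-style anxiety scores below are hypothetical, not the study's data.

```python
# Descriptive-statistics sketch for Likert-style speech-anxiety scores:
# frequency, percentage, and mean, as in the study's second phase.
# The responses are invented for illustration.
from collections import Counter

responses = [5, 4, 5, 3, 5, 4, 2, 5, 4, 5]  # 1 = low anxiety, 5 = high

freq = Counter(responses)
for score in sorted(freq):
    pct = 100 * freq[score] / len(responses)
    print(f"score {score}: n={freq[score]} ({pct:.0f}%)")
print(f"mean = {sum(responses) / len(responses):.2f}")
```

The same tallies can of course be produced in Excel or SPSS; the point is only to make the reported treatment concrete.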

Keywords: blended learning, communication skills assessment, public speaking anxiety, speech anxiety

Procedia PDF Downloads 70
63 A Q-Methodology Approach for the Evaluation of Land Administration Mergers

Authors: Tsitsi Nyukurayi Muparari, Walter Timo De Vries, Jaap Zevenbergen

Abstract:

The nature of land administration accommodates diversity in terms of both spatial data handling activities and the expertise involved, which supposedly aims to satisfy the unpredictable demands for land data and the diverse demands of customers arising from the land. However, strategic restructuring decisions are in most cases repelled in favour of complex structures that strive to accommodate professional diversity and diverse roles in the field of land administration. Yet despite this widely accepted knowledge, there is scant theoretical knowledge concerning psychological methodologies that can extract the deeper perceptions of diverse spatial experts in order to explain the invisible control arm behind the polarised reception of ideas of change. This paper evaluates Q methodology in the context of a cadastre and land registry merger (under one agency), using the Swedish cadastral system as a case study. Precisely, the aim of this paper is to evaluate the effectiveness of Q methodology in modelling the diverse psychological perceptions of spatial professionals involved in the widely contested decision of merging the cadastre and land registry components of land administration. The empirical approach prescribed by Q methodology starts with concourse development, followed by the design of statements and the Q-sort instrument, selection of the participants, the Q-sorting exercise, factor extraction with PQMethod, and finally narrative development by the logic of abduction. The paper uses 36 statements developed from the dominant competing values theory, which stands out for its reliability and validity; purposively selects 19 participants for the Q-sorting exercise; proceeds with factor extraction using the varimax and judgemental rotation provided by PQMethod; and effects the narrative construction using the logic of abduction.
The diverse perceptions of cadastral professionals regarding the merger of the land registry and cadastre components in Sweden's mapping agency (Lantmäteriet) show that the focus inclines toward perfecting the relationship between legal expertise and technical spatial expertise. There is much emphasis on tradition, loyalty and communication, attributes that concern the organisation's internal environment, rather than on innovation and market attributes that reveal customer behaviour and the needs arising from the changing humankind-land relationship. It can be concluded that Q methodology offers effective tools for a psychological approach to evaluating and grading decisions of strategic change by extracting the local perceptions of spatial experts.
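The factor-extraction step that PQMethod automates can be sketched in a few lines: correlate the participants' Q-sorts, extract principal factors from the correlation matrix, and varimax-rotate the loadings. The four random Q-sorts over twelve statements below stand in for the study's 19 sorts over 36 statements; this is an illustrative sketch, not PQMethod's exact algorithm.

```python
# Q-methodology factor-extraction sketch: correlate Q-sorts, extract two
# principal factors, varimax-rotate the participant loadings.
# Data are random stand-ins for real Q-sorts.
import numpy as np

rng = np.random.default_rng(0)
sorts = rng.standard_normal((12, 4))        # statements x participants
corr = np.corrcoef(sorts, rowvar=False)     # participant correlation matrix

eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1][:2]       # keep the two largest factors
loadings = eigvecs[:, order] * np.sqrt(eigvals[order])

def varimax(L, n_iter=100, tol=1e-6):
    """Orthogonally rotate loadings to maximise variance of squared loadings."""
    p, k = L.shape
    R = np.eye(k)
    var = 0.0
    for _ in range(n_iter):
        rotated = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (rotated**3 - rotated * (rotated**2).sum(axis=0) / p))
        R = u @ vt
        if s.sum() - var < tol:
            break
        var = s.sum()
    return L @ R

rotated = varimax(loadings)
print(np.round(rotated, 2))  # each participant's loading on the two factors
```

Participants loading highly on the same rotated factor share a viewpoint; the narrative-by-abduction step then interprets each factor from its defining sorts.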

Keywords: cadastre, factor extraction, land administration merger, land registry, q-methodology, rotation

Procedia PDF Downloads 161
62 DeepNIC: A Method to Transform Each Tabular Variable into an Independent Image Analyzable by Basic CNNs

Authors: Nguyen J. M., Lucas G., Ruan S., Digonnet H., Antonioli D.

Abstract:

Introduction: Deep learning (DL) is a very powerful tool for analyzing image data, but for tabular data it cannot compete with machine learning methods like XGBoost. The research question becomes: can tabular data be transformed into images that can be analyzed by simple CNNs (convolutional neural networks)? Will DL become the universal tool for data classification? Current solutions consist in repositioning the variables in a 2D matrix according to their correlation proximity, obtaining a single image whose pixels are the variables. We implement a technology, DeepNIC, that instead produces an image for each variable, which can be analyzed by simple CNNs. Material and method: The ROP (Regression OPtimized) model is a binary, atypical decision tree whose nodes are managed by a new artificial neuron, the Neurop. By positioning an artificial neuron in each node of the decision tree, it is possible to make an adjustment on a theoretically infinite number of variables at each node. From this new decision tree whose nodes are artificial neurons, we created the concept of a 'Random Forest of Perfect Trees' (RFPT), which departs from Breiman's concepts by assembling very large numbers of small trees with no classification errors. From the results of the RFPT, we developed a family of 10 statistical information criteria, the Nguyen Information Criteria (NICs), which evaluate the predictive quality of a variable in three dimensions: performance, complexity, and multiplicity of solutions. A NIC is a probability that can be transformed into a grey level. The value of a NIC depends essentially on two hyper-parameters used in the Neurops. By varying these two hyper-parameters, we obtain a matrix of probabilities for each NIC. We can combine the 10 NICs with the functions AND, OR, and XOR, giving more than 100,000 combinations in total. We thus obtain for each variable an image of at least 1166x1167 pixels.
The intensity of the pixels is proportional to the probability of the associated NIC, and the colour depends on the NIC. This image contains considerable information about the ability of the variable to predict Y, depending on the presence or absence of other variables. A basic CNN model was trained for supervised classification. Results: The first results are impressive. Using the public GSE22513 data (an omic data set of markers of taxane sensitivity in breast cancer), DeepNIC outperformed other statistical methods, including XGBoost. We still need to generalize the comparison over several databases. Conclusion: The ability to transform any tabular variable into an image offers the possibility of merging image and tabular information in the same format, which opens up great perspectives in the analysis of metadata.
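The core rendering idea, sweep two hyper-parameters, obtain a probability at each setting, and map the grid to grey levels, can be sketched independently of the authors' method. The scoring function below is a stand-in; the real probabilities come from the NICs computed on the Random Forest of Perfect Trees.

```python
# Sketch of turning a hyper-parameter sweep of probabilities into a
# grayscale image a CNN could consume. fake_nic is a hypothetical
# stand-in for a real NIC score.
import numpy as np

def fake_nic(alpha: float, beta: float) -> float:
    """Hypothetical NIC: some score in [0, 1] for a hyper-parameter pair."""
    return 0.5 + 0.5 * np.sin(alpha) * np.cos(beta)

alphas = np.linspace(0.0, np.pi, 64)
betas = np.linspace(0.0, np.pi, 64)
probs = np.array([[fake_nic(a, b) for b in betas] for a in alphas])

image = np.round(probs * 255).astype(np.uint8)  # probability -> grey level
print(image.shape, image.min(), image.max())
```

Stacking such grids for each of the 10 NICs (and their AND/OR/XOR combinations) is what grows the per-variable image to the dimensions the abstract reports.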

Keywords: tabular data, CNNs, NICs, DeepNICs, random forest of perfect trees, classification

Procedia PDF Downloads 72
61 Visual Aid and Imagery Ramification on Decision Making: An Exploratory Study Applicable in Emergency Situations

Authors: Priyanka Bharti

Abstract:

Decades ago, designs were based on common sense and tradition, but advances in visualization technology and research now allow us to understand the cognitive processes involved in decoding visual information. Many aspects of visuals, however, still need intensive research before such events can be explained efficiently. Visuals are a mode of information representation through images, symbols and graphics. They play an impactful role in decision making by facilitating quick recognition, comprehension, and analysis of a situation, and they enhance problem-solving capabilities by enabling the processing of more data without overloading the decision maker. Research suggests that visuals offer an improved learning environment by a factor of 400 compared to textual information; visual information engages learners at a cognitive level and triggers the imagination, enabling users to process the information faster (visuals are reportedly processed 60,000 times faster in the brain than text). Appropriate information visualization and presentation are known to aid and intensify the decision-making process. However, most literature discusses the role of visual aids in comprehension and decision making during normal conditions alone. Unlike in emergencies, in a normal situation (e.g., day-to-day life) users are neither exposed to stringent time constraints nor face the anxiety of survival, and they have sufficient time to evaluate various alternatives before making any decision. An emergency is an unexpected, possibly fatal real-life situation which may inflict serious harm on both human life and material possessions unless corrective measures are taken instantly. The situation demands that the exposed user negotiate a dynamic and unstable scenario with little or no preparation, yet still take swift and appropriate decisions to save lives or possessions.
The resulting stress and anxiety restrict cue sampling, decrease vigilance, reduce the capacity of working memory, cause premature closure in evaluating alternative options, and result in task shedding. Limited time, uncertainty, high stakes and vague goals negatively affect the cognitive abilities needed to take appropriate decisions. Moreover, naturalistic decision making has been understood with far more depth for experts than for ordinary users. Therefore, in this study, the author aims to understand the role of visual aids in supporting rapid comprehension so that appropriate decisions can be taken during an emergency situation.

Keywords: cognition, visual, decision making, graphics, recognition

Procedia PDF Downloads 240
60 A Survey of Digital Health Companies: Opportunities and Business Model Challenges

Authors: Iris Xiaohong Quan

Abstract:

The global digital health market reached 175 billion U.S. dollars in 2019 and is expected to grow at about 25% CAGR to over 650 billion U.S. dollars by 2025. Different terms such as digital health, e-health, mHealth, and telehealth have been used in the field, which can sometimes cause confusion. The term digital health was originally introduced to refer specifically to the use of interactive media, tools, platforms, applications, and solutions that are connected to the Internet to address the health concerns of providers as well as consumers. While mHealth emphasizes the use of mobile phones in healthcare, telehealth means using technology to remotely deliver clinical health services to patients. According to the FDA, "the broad scope of digital health includes categories such as mobile health (mHealth), health information technology (IT), wearable devices, telehealth and telemedicine, and personalized medicine." Some researchers believe that digital health is nothing other than the cultural transformation healthcare has been undergoing in the 21st century because of digital health technologies that provide data to both patients and medical professionals. As digital health is burgeoning but research in the area is still inadequate, our paper aims to clear up the definitional confusion and provide an overall picture of digital health companies. We further investigate how business models are designed and differentiated in the emerging digital health sector. Both quantitative and qualitative methods are adopted in the research. For the quantitative analysis, our research data came from two databases, Crunchbase and CBInsights, which are well-recognized information sources for researchers, entrepreneurs, managers, and investors. We searched a few keywords in the Crunchbase database based on companies' self-descriptions: digital health, e-health, and telehealth. A search of "digital health" returned 941 unique results, "e-health" returned 167 companies, and "telehealth" 427.
We also searched the CBInsights database for similar information. After merging the results, removing duplicates, and cleaning the database, we arrived at a list of 1,464 digital health companies. A qualitative method complements the quantitative analysis: an in-depth case analysis of three successful unicorn digital health companies to understand how business models evolve and to discuss the challenges faced in this sector. Our research returned some interesting findings. For instance, we found that 86% of the digital health startups were founded since 2010, 75% of the companies have fewer than 50 employees, and almost 50% have fewer than 10 employees. This shows that digital health companies are relatively young and small in scale. On the business model side, while traditional healthcare businesses emphasize the so-called "3P" of patient, physician, and payer, digital health companies extend this to "5P" by adding patents, a result of technology requirements (such as the development of artificial intelligence models), and platform, an effective value-creation approach that brings the stakeholders together. Our case analysis details the 5P framework and contributes to the extant knowledge on business models in the healthcare industry.
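The merge-and-deduplicate step described above is a routine data-preparation task; a minimal pandas sketch follows. The company names are invented; the real lists came from the Crunchbase and CBInsights keyword searches.

```python
# Sketch of merging company lists from two databases and dropping
# duplicates by company name. All rows below are hypothetical.
import pandas as pd

crunchbase = pd.DataFrame({
    "company": ["TeleDoc AI", "HealthLoop", "MedSense"],
    "keyword": ["telehealth", "digital health", "e-health"],
})
cbinsights = pd.DataFrame({
    "company": ["HealthLoop", "CareBridge"],
    "keyword": ["digital health", "telehealth"],
})

merged = (
    pd.concat([crunchbase, cbinsights], ignore_index=True)
    .drop_duplicates(subset="company")   # keep first occurrence of each name
    .reset_index(drop=True)
)
print(len(merged), "unique companies")  # 4 unique companies
```

In practice the subset key would need normalisation (case, legal suffixes, renames) before deduplication, which is presumably part of the "cleaning" the study mentions.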

Keywords: digital health, business models, entrepreneurship opportunities, healthcare

Procedia PDF Downloads 152
59 Hybrid Living: Emerging Out of the Crises and Divisions

Authors: Yiorgos Hadjichristou

Abstract:

The paper focuses on the hybrid living typologies brought about by the global crisis. Mixing generations and groups of people, mingling the functions of living with working and socializing, and merging the act of living with the urban realm and its constituent elements are the springboard for proposing an essential sustainable housing approach and the respective urban development. The thematic is based on methodologies developed in the academic and educational environment, including students' research, and on the practical side of architecture, including case studies executed by the author on the island of Cyprus. Both paths of the research deal with an explorative understanding of hybrid ways of living, testing the limits of their autonomy. The evolution of living typologies into substantial hybrid entities deals with understanding new ways of living which include, among others: the re-introduction of natural phenomena, the accommodation of work and services in the living realm, the interchange of public and private, and injections of communal events into individual living territories. The issues and binary questions raised by what is natural and artificial, what is private and what is public, what is ephemeral and what is permanent, and all the in-between conditions are eloquently traced in everyday life on the island. Additionally, given the situation of Cyprus, with the eminent scar of the dividing 'Green Line' and the 'ghost city' of Famagusta waiting to be resurrected, the conventional understanding of the limits and definitions of property is irreversibly shaken. The situation is further aggravated by the unprecedented crisis on the island.
All these observations set the premises for re-examining urban development and the respective sustainable housing in a synergy where their characteristics exchange positions, merge into each other, simultaneously emerge and vanish, and change from permanent to ephemeral. This fluidity of conditions attempts to render a future for the built and unbuilt realms in which the main focus is redirected to the human and the social. Weather and social ritual scenographies, together with 'spontaneous urban landscapes' of 'momentary relationships', suggest a recipe for emerging urban environments and sustainable living. Thus, the paper aims to open a discourse on the future of sustainable living merged with sustainable urban development in relation to the imminent solution of the division of the island, where the issue of property became the main obstacle to be overcome. At the same time, it attempts to link this approach to the global need for a sustainable evolution of the urban and living realms.

Keywords: social ritual scenographies, spontaneous urban landscapes, substantial hybrid entities, re-introduction of natural phenomena

Procedia PDF Downloads 238
58 Cultural Innovation in Urueña: A Path against Depopulation

Authors: S. Sansone-Casaburi

Abstract:

The pandemic the world is going through is causing important changes in the daily life of all cities, which can translate into opportunities to address pending issues, among others the town-city relationship and sustainability. On the one hand, the city continues to be the center of attention, and the countryside is assumed to be the supplier of food. However, the temporary closure of cities highlighted the importance of the rural environment, and many people are reassessing this context as an alternative for life. Furthermore, the countryside is not simply the home and center of activity of the people who inhabit it; rather, it constitutes an asset of all citizens, both rural and urban. On the other hand, the pandemic is an opportunity to meet the sustainable development goals. Sustainable development is understood as the capital to be transferred to future generations, made up of three types of wealth: natural capital (the environment), human capital (people, relationships, culture), and artificial or built capital, made up of buildings and infrastructure, or cities and towns. The 'new normal' can mean going back to the countryside, but not to a merely agricultural place: to a sustainable, affordable, and healthy place which, with appropriate infrastructure, allows remote work, a new post-COVID-19 modality. The research contributes to the recovery of traditional villages from the perspective of populations that have managed to maintain their vitality with innovative solutions. It is assumed that innovation is a path for the recovery of traditional villages, so we ask: what conditions are necessary for innovation to be successful and sustainable?
The research identified several variables, among them culture, so the objective of this article is to understand Urueña, a town in the province of Valladolid which, with only 182 inhabitants, houses five museums and twelve bookstores that make up the first Villa del Libro in Spain. The methodology used is mixed, inductive and deductive, and the results were distilled into a formula for villages that innovate through culture: PIc = Pt + C[E(Aec) + S(pp) + A(T + s + t + enc)], where culturally innovative villages (PIc) are the result of traditional villages (Pt) that, through a cultural innovation (C), integrate, in the economic sphere (E), economic and cultural activities (Aec); in the social sphere (S), the public and private actors (pp); and in the environmental sphere (A), territory (T), services (s), technology (t), and natural and built spaces (enc). The analysis focuses on determining what makes the structure of culturally innovative villages sustainable and on understanding the variables that make up that structure, in order to verify whether they can be applied in other contexts to repower abandoned places and provide a solution for people who migrate to the countryside; that is, to learn from what has been done in order to replicate it in similar cases.

Keywords: culture as innovation, depopulation, sustainability, traditional villages

Procedia PDF Downloads 68
57 Foreseen the Future: Human Factors Integration in European Horizon Projects

Authors: José Manuel Palma, Paula Pereira, Margarida Tomás

Abstract:

The development of new technologies such as artificial intelligence, smart sensing, robotics, cobotics and intelligent machinery must integrate human factors to optimize systems and processes, thereby contributing to the creation of a safe and accident-free work environment. Human Factors Integration (HFI) consistently poses a challenge for organizations when applied to daily operations. The AGILEHAND and FORTIS projects are grounded in the development of cutting-edge Industry 4.0 and 5.0 technology. AGILEHAND aims to create advanced technologies to autonomously sort, handle, and package soft and deformable products, whereas FORTIS focuses on developing a comprehensive Human-Robot Interaction (HRI) solution. The two projects employ different approaches to HFI. AGILEHAND is mainly empirical, involving a comparison between current and future working conditions, coupled with an understanding of best practices and the enhancement of safety aspects, primarily through management. FORTIS applies HFI throughout the project, developing a human-centric approach that includes understanding human behavior, perceiving activities, and facilitating contextual human-robot information exchange. Its intervention is holistic, merging technology with the physical and social contexts, based on a total safety culture model. In AGILEHAND, we will identify emergent safety risks and challenges, their causes, and how to overcome them by resorting to interviews, questionnaires, literature review and case studies. Findings and results will be presented in the handbook "Strategies for Workers' Skills Development, Health and Safety, Communication and Engagement".
The FORTIS project will implement continuous monitoring and guidance of activities, with a critical focus on early detection and elimination (or mitigation) of risks associated with the new technology, as well as guidance on complying with European Union safety and privacy regulations, ensuring HFI and thereby contributing to an optimized, safe work environment. To achieve this, we will embed safety by design, apply questionnaires, perform site visits, provide risk assessments, and closely track progress while recommending best practices. The outcomes of these measures will be compiled in the project deliverable "Human Safety and Privacy Measures". These projects received funding from the European Union's Horizon 2020/Horizon Europe research and innovation programme under grant agreements No 101092043 (AGILEHAND) and No 101135707 (FORTIS).

Keywords: human factors integration, automation, digitalization, human robot interaction, industry 4.0 and 5.0

Procedia PDF Downloads 20
56 Geological Characteristics and Hydrocarbon Potential of M’Rar Formation Within NC-210, Atshan Saddle Ghadamis-Murzuq Basins, Libya

Authors: Sadeg M. Ghnia, Mahmud Alghattawi

Abstract:

The NC-210 study area is located in the Atshan Saddle between the Ghadamis and Murzuq basins, west Libya. The preserved Palaeozoic successions are predominantly clastics, reaching a thickness of more than 20,000 ft in the northern Ghadamis Basin depocenter. The Carboniferous series consists of interbedded sandstone, siltstone, shale, claystone, and minor limestone deposited in a fluctuating shallow marine to brackish lacustrine/fluviatile environment; it attains a maximum thickness of over 5,000 ft in the Atshan Saddle area, and 3,500 ft is recorded in outcrops on the Murzuq Basin flanks. The Carboniferous strata were uplifted and eroded during Late Paleozoic and early Mesozoic time in the northern Ghadamis Basin and the Atshan Saddle. The M'rar Formation is Tournaisian to Late Serpukhovian in age based on palynological markers and contains about 12 cycles of sandstone and shale deposited in shallow to outer neritic deltaic settings. The hydrocarbons in the M'rar reservoirs were possibly sourced from the Lower Silurian and Frasnian radioactive hot shales. The lateral, vertical, and thickness distribution of the M'rar Formation is possibly influenced by the reactivation of the Tumarline strike-slip fault and its conjugate faults. Pronounced structural paleohighs and paleolows, trending SE-NW through the Gargaf Saddle, are possibly indicative of the presence of two sub-basins in the Atshan Saddle area. A number of seismic reflectors identified from the existing 2D seismic covering the Atshan Saddle reflect the 12 M’rar deltaic sandstone cycles. M’rar7, M’rar9, M’rar10, and M’rar12 are characterized by high-amplitude reflectors, while M’rar2 and M’rar6 are characterized by medium-amplitude reflectors. These horizons are productive reservoirs in the study area. The available seismic data in the study area contributed significantly to the identification of M’rar potential traps, which are predominantly 3-way dip closures against fault zones. 
The seismic data also indicate the presence of a significant strike-slip component with the development of flower structures. The M'rar Formation hydrocarbon discoveries are concentrated mainly in the Atshan Saddle, located in the southern Ghadamis Basin, Libya, and in the Illizi Basin in southeastern Algeria. Significant additional hydrocarbons may be present in areas adjacent to the Gargaf Uplift, along structural highs, and fringing the Hoggar Uplift, where suitable migration pathways exist.

Keywords: hydrocarbon potential, stratigraphy, Ghadamis basin, seismic, well data integration

Procedia PDF Downloads 45
55 The Influence of Microsilica on the Cluster Cracks' Geometry of Cement Paste

Authors: Maciej Szeląg

Abstract:

The changing nature of the environmental impacts under which cement composites operate causes a number of phenomena in the structure of the material that result in volume deformation of the composite. These strains can cause cracking of the composite. Cracks merge by propagation or intersect to form a characteristic structure of cracks known as cluster cracks. This characteristic crack mesh is crucial to almost all building materials working under service load conditions. Particularly dangerous for a cement matrix is a sudden load of elevated temperature, i.e., thermal shock. A large temperature gradient between the outer surface and the material’s interior, arising in a relatively short period of time, can result in crack formation on the surface and in the volume of the material. In this paper, image analysis tools were used to analyze the geometry of the cluster cracks of cement pastes. Four series of specimens made of two different Portland cements were tested; in addition, two series included microsilica as a substitute for 10% of the cement. Within each series, specimens were prepared at three w/b (water/binder) ratios: 0.4, 0.5, and 0.6. The cluster cracks were created by suddenly loading the samples with an elevated temperature of 250°C. Images of the cracked surfaces were obtained via scanning at 2400 DPI. Digital processing and measurements were performed using ImageJ v. 1.46r software. To describe the structure of the cluster cracks, three stereological parameters were proposed: the average cluster area Ā, the average length of the cluster perimeter L̄, and the average opening width of a crack between clusters Ī. The aim of the study was to identify and evaluate the relationships between the measured stereological parameters and the compressive strength and bulk density of the modified cement pastes. The tests of the mechanical and physical features were carried out in accordance with EN standards. 
The curves describing the relationships were developed using the least squares method, and the quality of the fit of each curve to the empirical data was evaluated using three diagnostic statistics: the coefficient of determination R², the standard error of estimation Se, and the coefficient of random variation W. The use of image analysis allowed a quantitative description of the cluster cracks’ geometry. Based on the obtained results, a strong correlation was found between Ā and L̄, reflecting the fractal nature of the cluster crack formation process. It was noted that the compressive strength and the bulk density of the cement pastes decrease with an increase in the values of the stereological parameters. It was also found that the main factors impacting the cluster cracks’ geometry are the cement particle size and the overall binder content in a volume of the material. The microsilica reduced the Ā, L̄, and Ī values compared to those obtained for the plain cement paste samples, which is attributed to the pozzolanic properties of the microsilica.
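The least-squares fitting and the three diagnostic statistics named above can be sketched in a few lines for the simplest (linear) case. This is a minimal illustration of the definitions, not the authors' actual fitting code, and the function name and any sample values are invented:

```python
import math

def fit_line(x, y):
    """Ordinary least-squares fit y = a + b*x with the three diagnostics
    used in the study: R-squared, standard error of estimation Se, and
    coefficient of random variation W (in percent)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                       # slope
    a = my - b * mx                     # intercept
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    sse = sum(r * r for r in resid)     # sum of squared errors
    sst = sum((yi - my) ** 2 for yi in y)
    r2 = 1 - sse / sst                  # coefficient of determination
    se = math.sqrt(sse / (n - 2))       # standard error of estimation
    w = 100 * se / my                   # coefficient of random variation, %
    return a, b, r2, se, w
```

The same diagnostics generalize to the nonlinear curve shapes the study may have fitted; only the residual computation changes.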

Keywords: cement paste, cluster cracks, elevated temperature, image analysis, microsilica, stereological parameters

Procedia PDF Downloads 224
54 Application of Geosynthetics for the Recovery of Located Road on Geological Failure

Authors: Rideci Farias, Haroldo Paranhos

Abstract:

The present work deals with the use of a drainage geocomposite as a deep drainage element and a geogrid to reinforce the base of the embankment supporting the road pavement over geological faults in a stretch of the TO-342 Highway, between the cities of Miracema and Miranorte, in the State of Tocantins, Brazil, which for many years was the main link between TO-010 and BR-153 beyond the city of Palmas, also in Tocantins. For this application, geotechnical and geological studies were carried out by means of SPT percussion drilling and rotary drilling to understand the problem, identifying the type of faults, the filling material, and the position of the water table. According to these studies, the route passes through a fault zone running longitudinally to the roadway, with strong breaking/fracturing, the presence of voids, intense alteration, and advanced argilization of the rock, with parts of the faults filled by organic and compressible soils leached from other horizons. Geotechnically, this geology presents a medium with a high hydraulic load and very low penetration resistance. For more than 20 years, the region presented constant excessive deformations in the upper layers of the pavement which, after routine regularization, reconformation, and re-compaction of the layers and application of the asphalt coating, quickly propagated back to the surface of the asphalt pavement. This generated a longitudinal shear forming steps (unevenness) of close to 40 cm, causing numerous accidents and discomfort to drivers, since the alignment lay on a horizontal curve. Several projects were presented to the region's highway department to solve the problem. 
Given the need for only partial closure of the roadway and the short time available for execution, the use of geosynthetics was proposed; the solution adopted took into account the movement of the existing geological faults and the position of the water level in relation to the several layers of pavement and fault material. To prevent any flow of water into the body of the embankment and into the filling material of the faults, a drainage curtain was installed at 4.0 meters depth using a drainage geocomposite, and, as a reinforcement element inhibiting possible movements, a geogrid with a tensile strength of 200 kN/m was inserted at the base of the reconstituted embankment. Recent evaluations, 13 years after application of the solution, show the efficiency of the technique used, supported by the geotechnical studies carried out in the area.

Keywords: geosynthetics, geocomposite, geogrid, road, recovery, geological failure

Procedia PDF Downloads 143
53 Upper Jurassic Foraminiferal Assemblages and Palaeoceanographical Changes in the Central Part of the East European Platform

Authors: Clementine Colpaert, Boris L. Nikitenko

Abstract:

The Upper Jurassic foraminiferal assemblages of the East European Platform were investigated extensively through the 20th century, mainly for biostratigraphical and, to a lesser degree, palaeoecological and palaeobiogeographical purposes. During the Late Jurassic, the platform was a shallow epicontinental sea that extended from Tethys to the Arctic through the Pechora Sea and further toward the northeast into the West Siberian Sea. Foraminiferal assemblages of the Russian Sea were strongly affected by sea-level changes and were controlled by alternating Boreal to Peritethyan influences. The central part of the East European Platform displays very rich and diverse foraminiferal assemblages. Two sections have been analyzed: the Makar'yev Section in the Moscow Depression and the Gorodishi Section in the Yl'yanovsk Depression. Based on the evolution of the foraminiferal assemblages, the palaeoenvironment has been reconstructed and sea-level changes have been refined. The aim of this study is to understand palaeoceanographical changes throughout the Oxfordian–Kimmeridgian of the central part of the Russian Sea. The Oxfordian was characterized by a general transgressive event interrupted by small regressive phases. The platform was connected toward the south with Tethys and Peritethys. During the Middle Oxfordian, the opening of a pathway of warmer water from the North Tethys region to the Boreal Realm favoured the migration of planktonic foraminifera and the appearance of new benthic taxa, associated with increased temperature and primary production. During the Late Oxfordian, colder-water inputs associated with the microbenthic community crisis may have been a response to the closure of this warm-water corridor and the disappearance of planktonic foraminifera. 
The microbenthic community crisis is probably due to the increased sedimentation rate in the transition from the maximum flooding surface to a second-order regressive event, increasing productivity and inputs of organic matter along with a sharp decrease of oxygen into the sediment. It was followed during the Early Kimmeridgian by a replacement of the foraminiferal assemblages. Almost all of the Kimmeridgian is characterized by an abundance of taxa in common with the Boreal and Subboreal Realms. Connections toward the south became dominant again after a small regressive event recorded during the Late Kimmeridgian, associated with an abundance of taxa in common with the Subboreal Realm and Peritethys, such as Crimea and Caucasus taxa. Foraminiferal assemblages of the East European Platform are strongly affected by palaeoecological changes and may provide a very good model for biofacies typification under Boreal and Subboreal environments. The East European Platform appears to be a key area for understanding large-scale Upper Jurassic palaeoceanographical changes, being connected with both Boreal and Peritethyan basins.

Keywords: foraminifera, palaeoceanography, palaeoecology, upper jurassic

Procedia PDF Downloads 217
52 Effect of Reminiscence Therapy on the Sleep Quality of the Elderly Living in Nursing Homes

Authors: Güler Duru Aşiret

Abstract:

Introduction: Poor sleep quality is a common problem among older people living in nursing homes. Our study aimed to assess the effect of individual reminiscence therapy on the sleep quality of the elderly living in nursing homes. Methods: The study had 22 people in the intervention group and 24 people in the control group. The intervention group received reminiscence therapy once a week for 12 weeks, in individual sessions of 25-30 minutes. We first determined the dates suitable for the intervention group and the researcher, and planned the date and time of the individual reminiscence sessions over the 12 weeks. While preparing this schedule, we considered the subjects’ regular visits to health facilities and the arrival of their visitors. At this stage, the researcher informed the participants that their regular attendance at sessions would affect the intervention outcome. One topic was discussed every week. Weekly topics included: introduction in the first week, followed by childhood and family life; school days; starting work and work life (a day at home for housewives); a fun day out of the home; marriage (friendship for the singles); plants and animals they loved; babies and children; food and cooking; holidays and travelling; special days and celebrations; and assessment and closure. The control group had no intervention. Study data were collected using an introductory information form and the Pittsburgh Sleep Quality Index (PSQI). Results: Participants’ average age was 76.02 ± 7.31 years; 58.7% were male and 84.8% were single. All of them had at least one chronic disease, and 76.1% did not need help performing their daily life activities. The length of stay in the institution was 6.32 ± 3.85 years. The groups did not differ in their descriptive characteristics. 
While there was no statistically significant difference between the two groups' pretest PSQI median scores (p > 0.05), the intervention group's PSQI median score decreased significantly after 12 weeks of reminiscence therapy (p < 0.05). There was no statistically significant change in the median scores of the subcomponents of sleep latency, sleep duration, sleep efficiency, sleep disturbance, and use of sleep medication before and after reminiscence therapy; however, there was a statistically significant change in the median score of the subjective sleep quality subcomponent (p < 0.05). Conclusion: Our study found that reminiscence therapy increased the sleep quality of the elderly living in nursing homes. Acknowledgment: This study (project no. 2017-037) was supported by the Scientific Research Projects Coordination Unit of Aksaray University. We thank the elderly subjects for their kind participation.
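The paired pre/post comparison reported above can be illustrated with a minimal sketch. The scores below are invented for illustration (they are not the study's data), and a real analysis would pair this with a nonparametric test such as the Wilcoxon signed-rank test rather than medians alone:

```python
from statistics import median

def median_change(pre, post):
    """Median PSQI before and after therapy, plus the median paired shift.
    Lower PSQI indicates better sleep quality, so a negative shift
    means improvement."""
    diffs = [b - a for a, b in zip(pre, post)]
    return median(pre), median(post), median(diffs)

# Invented example scores for five residents in an intervention group
pre = [12, 9, 11, 8, 10]
post = [7, 8, 6, 7, 6]
```

Here the negative median shift would indicate improved sleep quality, mirroring the direction of the study's finding.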

Keywords: nursing, older people, reminiscence therapy, sleep

Procedia PDF Downloads 103
51 Staphylococcus aureus Septic Arthritis and Necrotizing Fasciitis in a Patient with Undiagnosed Diabetes Mellitus

Authors: Pedro Batista, André Vinha, Filipe Castelo, Bárbara Costa, Ricardo Sousa, Raquel Ricardo, André Pinto

Abstract:

Background: Septic arthritis is a diagnosis that must be considered in any patient presenting with acute joint swelling and fever. Among the several risk factors for septic arthritis, such as age, rheumatoid arthritis, recent surgery, or skin infection, diabetes mellitus can sometimes be the main one. Staphylococcus aureus is the most common pathogen isolated in septic arthritis; however, it is uncommon in monomicrobial necrotizing fasciitis. Objectives: To report a case of concomitant septic arthritis and necrotizing fasciitis in a patient with diabetes undiagnosed until presentation. Study Design & Methods: We report the case of a 58-year-old previously healthy Portuguese man who presented to the emergency department with fever and two days of left knee swelling and pain. Blood work revealed ketonemia of 6.7 mmol/L and glycemia of 496 mg/dL. The vital signs were significant for a temperature of 38.5 ºC and a heart rate of 123 bpm. The left knee showed edema and inflammatory signs. Computed tomography of the left knee showed diffuse edema of the subcutaneous tissue and soft-tissue air bubbles. A diagnosis of septic arthritis and necrotizing fasciitis was made, and he was taken to the operating room for surgical debridement. The samples collected intraoperatively were sent for microbiological analysis, revealing infection by multi-sensitive Staphylococcus aureus. Given this result, the empiric flucloxacillin (500 mg IV) and clindamycin (1000 mg IV) were maintained for 3 weeks. On the seventh day of hospitalization, there was significant improvement in the subcutaneous and musculoskeletal tissues. After two weeks of hospitalization, there was no purulent content and partial closure of the wounds was possible. After 3 weeks, he was switched to oral antibiotics (flucloxacillin 500 mg). A week later, a urinary infection by Pseudomonas aeruginosa was diagnosed and ciprofloxacin 500 mg was administered for 7 days without complications. 
After 30 days of hospital admission, the patient was discharged home and recovered. Results: The final diagnosis of concomitant septic arthritis and necrotizing fasciitis was based on the imaging findings, surgical exploration, and microbiological test results. Conclusions: Early antibiotic administration and surgical debridement are key in the management of septic arthritis and necrotizing fasciitis. Furthermore, control of risk factors (euglycemic blood glucose levels) must always be taken into account, given its crucial role in the patient's recovery.

Keywords: septic arthritis, necrotizing fasciitis, diabetes, Staphylococcus aureus

Procedia PDF Downloads 276
50 Modeling Visual Memorability Assessment with Autoencoders Reveals Characteristics of Memorable Images

Authors: Elham Bagheri, Yalda Mohsenzadeh

Abstract:

Image memorability refers to the phenomenon whereby certain images are more likely to be remembered by humans than others; it is a quantifiable, intrinsic attribute of an image. Understanding how visual perception and memory interact is important in both cognitive science and artificial intelligence: it reveals the complex processes that support human cognition and helps improve machine learning algorithms by mimicking the brain's efficient data processing and storage mechanisms. To explore the computational underpinnings of image memorability, this study examines the relationship between an image's reconstruction error, its distinctiveness in latent space, and its memorability score. A trained autoencoder is used to replicate a human-like memorability assessment, inspired by the visual memory game employed in memorability estimation. The study leverages a VGG-based autoencoder pre-trained on the vast ImageNet dataset, enabling it to recognize patterns and features common to a wide and diverse range of images. An empirical analysis is conducted using the MemCat dataset, which includes 10,000 images from five broad categories (animals, sports, food, landscapes, and vehicles) along with their corresponding memorability scores; the score assigned to each image represents the probability of that image being remembered by participants after a single exposure. The autoencoder is fine-tuned for one epoch with a batch size of one, creating a scenario similar to human memorability experiments, where memorability is quantified by the likelihood of an image being remembered after being seen only once. The reconstruction error, quantified as the difference between the original and reconstructed images, serves as a measure of how well the autoencoder has learned to represent the data. 
The reconstruction error of each image, the error reduction, and the image's distinctiveness in latent space are calculated and correlated with the memorability score. Distinctiveness is measured as the Euclidean distance between each image's latent representation and that of its nearest neighbor within the autoencoder's latent space. Different structural and perceptual loss functions are considered to quantify the reconstruction error. The results indicate a strong correlation between the reconstruction error and distinctiveness of images and their memorability scores, suggesting that images with more unique, distinct features that challenge the autoencoder's compressive capacities are inherently more memorable. Memorability also correlates negatively with the reduction in reconstruction error relative to the autoencoder pre-trained on ImageNet, which suggests that highly memorable images are harder to reconstruct, probably because they have features that are more difficult for the autoencoder to learn. These insights suggest a new pathway for evaluating image memorability, which could potentially impact industries reliant on visual content and marks a step forward in merging the fields of artificial intelligence and cognitive science. The current research opens avenues for utilizing neural representations as instruments for understanding and predicting visual memory.
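The two per-image measures described above can be sketched in plain Python over flattened pixel vectors and latent vectors. This is a minimal illustration of the definitions, not the study's actual pipeline; the function names are invented and the tiny vectors in the usage stand in for real image data:

```python
import math

def reconstruction_error(original, reconstructed):
    """Mean squared difference between an image and its reconstruction
    (one simple loss choice; the study also considers structural and
    perceptual losses)."""
    return sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / len(original)

def distinctiveness(latents, i):
    """Euclidean distance from latent vector i to its nearest neighbor
    among all the other images' latent vectors."""
    zi = latents[i]
    return min(math.dist(zi, zj) for j, zj in enumerate(latents) if j != i)
```

Per the abstract, both quantities would then be correlated (e.g. with a Pearson coefficient) against the MemCat memorability scores across all 10,000 images.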

Keywords: autoencoder, computational vision, image memorability, image reconstruction, memory retention, reconstruction error, visual perception

Procedia PDF Downloads 39
49 Test Rig Development for Up-to-Date Experimental Study of Multi-Stage Flash Distillation Process

Authors: Marek Vondra, Petr Bobák

Abstract:

Vacuum evaporation is a reliable and well-proven technology with a wide application range, frequently used in the food, chemical, and pharmaceutical industries. Recently, numerous remarkable studies have investigated the utilization of this technology in wastewater treatment. One of the most successful applications of the vacuum evaporation principle is seawater desalination. Since the 1950s, multi-stage flash distillation (MSF) has been the leading technology in this field, and it is still irreplaceable in many respects despite the rapid increase in cheaper reverse-osmosis-based installations in recent decades. MSF plants are conveniently operated in countries with fluctuating seawater quality and at locations where a sufficient amount of waste heat is available. Nowadays, most MSF research concerns the utilization of alternative heat sources and hybridization, i.e., the merging of different types of desalination technologies. Some studies address the basic principles of the static flash phenomenon, but only a few scientists have lately focused on the fundamentals of continuous multi-stage evaporation; limited measurement possibilities at operating plants and insufficiently equipped experimental facilities may be the reasons. The aim of the presented study was to design, construct, and test an up-to-date test rig with an advanced measurement system providing real-time monitoring of all the important operational parameters under various conditions. The whole system consists of a conventionally designed MSF unit with 8 evaporation chambers, a versatile heating circuit for different kinds of feed water (e.g. 
seawater, wastewater), a sophisticated system for the acquisition and real-time visualization of all related quantities (temperature, pressure, flow rate, weight, conductivity, pH, water level, power input), access to a wide spectrum of operational media (salt, fresh, and softened water, steam, natural gas, compressed air, electrical energy), and integrated transparent features that enable direct visual inspection of selected physical mechanisms (water evaporation in the chambers, the water level just before the brine and distillate pumps). Thanks to the adjustable process parameters, it is possible to operate the test unit at the desired operational conditions, allowing researchers to carry out statistical design and analysis of experiments. Valuable results obtained in this manner could be further employed in simulations and process modeling. The first experimental tests confirm the correctness of the presented approach and promise interesting outputs in the future. The presented experimental apparatus enables flexible and efficient research of the whole MSF process.
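The statistical design of experiments mentioned above typically starts from a factor/level grid over the rig's adjustable parameters. The sketch below builds a full-factorial run list; the parameter names and levels are hypothetical placeholders, not the rig's actual controllable ranges:

```python
from itertools import product

# Hypothetical adjustable parameters and levels for a full-factorial design;
# the real rig's controllable variables and ranges would replace these.
factors = {
    "feed_temp_C": [70, 85, 100],
    "feed_flow_lph": [200, 400],
    "first_stage_pressure_kPa": [20, 35],
}

def full_factorial(factors):
    """Enumerate every combination of factor levels as one run dictionary."""
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

runs = full_factorial(factors)  # 3 levels x 2 levels x 2 levels = 12 runs
```

Each run dictionary would then be executed on the rig while the acquisition system logs the monitored quantities, giving a balanced dataset for regression or ANOVA.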

Keywords: design of experiment, multi-stage flash distillation, test rig, vacuum evaporation

Procedia PDF Downloads 358