Search results for: objective evaluation
1945 Israeli Households Caring for Children and Adults with Intellectual and Developmental Disabilities: An Explorative Study
Authors: Ayelet Gur
Abstract:
Background: In recent years we have been witnessing a welcome trend in which more children/persons with disabilities are living at home with their families and within their communities. This trend is related to various policy innovations, such as the UN Convention on the Rights of People with Disabilities, that reflect a shift from the medical-institutional model to a human rights approach. We also witness the emergence of family-centered approaches that perceive the family, and not just the individual with the disability, as a worthy target of policy planning, implementation and evaluation efforts. The current investigation aims to explore economic, psychological and social factors among households of families of children or adults with intellectual disabilities in Israel and to present policy recommendations. Methods: A national sample of 301 households was recruited through the education and employment settings of persons with intellectual disability. The main caregiver of the person with the disability (a parent) was interviewed. Measurements included the income and expense surveys; the assets and debts questionnaire; the questionnaire on resources and stress; the social involvement questionnaire; and the Personal Wellbeing Index. Results: Findings indicate significant gaps in financial circumstances between households of families of children with intellectual disabilities and households of the general Israeli society. Households of families of children with intellectual disabilities report lower income and higher expenditures and loans than the general society. They experience difficulties in saving and in coping with unexpected expenses. Caregivers (the parents) experience high stress, low social participation, low financial support from family, friends and non-governmental organizations, and decreased well-being. They are highly dependent on social security allowances, which constituted 40% of the household's income. Conclusions: Households' dependency on social security allowances may seem contradictory to the encouragement of persons with intellectual disabilities to pursue independent living in light of the human rights approach to disability. New policy should aim at reducing caregivers' stress and enhancing their social participation and support, with special emphasis on families of lower socio-economic status. Finally, there is a need to continue monitoring the economic and psycho-social needs of households of families of children with intellectual disabilities and other developmental disabilities.
Keywords: disability policy, family policy, intellectual and developmental disabilities, Israel, households study, parents of children with disabilities
Procedia PDF Downloads 154
1944 Predicting Daily Patient Hospital Visits Using Machine Learning
Authors: Shreya Goyal
Abstract:
The study aims to build user-friendly software to understand patient arrival patterns and compute the number of potential patients who will visit a particular health facility for a given period by using a machine learning algorithm. The underlying machine learning algorithm used in this study is the Support Vector Machine (SVM). Accurate prediction of patient arrivals allows hospitals to operate more effectively, providing timely and efficient care while optimizing resources and improving the patient experience. It allows for better allocation of staff, equipment, and other resources. If there is a projected surge in patients, additional staff or resources can be allocated to handle the influx, preventing bottlenecks or delays in care. Understanding patient arrival patterns can also help streamline processes to minimize waiting times and ensure timely access to care for patients in need. Another major advantage of using this software is adherence to strict data protection regulations such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States, as the hospital will not have to share the data with any third party or upload it to the cloud, because the software can read data locally from the machine. The data needs to be arranged in a particular format, and the software will be able to read the data and provide meaningful output. Using software that operates locally can facilitate compliance with these regulations by minimizing data exposure. Keeping patient data within the hospital's local systems reduces the risk of unauthorized access or breaches associated with transmitting data over networks or storing it in external servers. This can help maintain the confidentiality and integrity of sensitive patient information. Historical patient data is used in this study. The input variables used to train the model include patient age, time of day, day of the week, seasonal variations, and local events. The algorithm uses a supervised learning method to optimize the objective function and find the global minimum. The algorithm stores the values of the local minima after each iteration and at the end compares all the local minima to find the global minimum. The strength of this study is the transfer function used to calculate the number of patients. The model has an output accuracy of >95%. The method proposed in this study could be used for better management planning of personnel and medical resources.
Keywords: machine learning, SVM, HIPAA, data
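The abstract describes an SVM trained on features such as patient age, time of day, day of week, season, and local events, but publishes no code. The sketch below is a minimal, hypothetical illustration of that kind of daily-visit regression using scikit-learn's SVR on synthetic data; the feature set, kernel, and hyperparameters are assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in for locally stored hospital records: one row per day/hour slot.
n = 1000
X = np.column_stack([
    rng.integers(0, 24, n),   # hour of day
    rng.integers(0, 7, n),    # day of week
    rng.integers(1, 13, n),   # month, as a crude season proxy
    rng.integers(0, 2, n),    # local-event flag (0/1)
])
# Made-up arrival counts with a daily cycle and an event effect, plus noise.
y = 20 + 3 * np.sin(2 * np.pi * X[:, 0] / 24) + 2 * X[:, 3] + rng.normal(0, 1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(X_tr, y_tr)
print("R^2 on held-out slots:", round(model.score(X_te, y_te), 3))
```

In practice the feature encoding (e.g., cyclic encoding of hour and month) and hyperparameters would be tuned on the hospital's own historical data.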
Procedia PDF Downloads 65
1943 The Analgesic Impact of Adding Intrathecal Ketamine to Spinal Anaesthesia for Hip or Knee Arthroplasty: A Clinical Audit
Authors: Carl Ashworth, Matthys Campher
Abstract:
Spinal anaesthesia has been identified as the “gold standard” for primary elective total hip and knee arthroplasty, which is most commonly performed using longer-acting local anaesthetics, such as hyperbaric bupivacaine, to prolong the duration of anaesthesia and analgesia suitable for these procedures. Ketamine is known to have local anaesthetic effects with potent analgesic properties and has been evaluated as a sole anaesthetic agent via intrathecal administration; however, the use of intrathecal ketamine as an adjunct to intrathecal hyperbaric bupivacaine, morphine, and fentanyl has not been extensively studied. The objective of this study was to identify the potential analgesic effects of the addition of intrathecal ketamine to spinal anaesthesia and to compare the efficacy and safety of adding intrathecal ketamine to spinal anaesthesia for hip- or knee arthroplasty with spinal anaesthesia for hip- or knee arthroplasty without intrathecal ketamine. The medical records of patients who underwent elective hip- or knee arthroplasty under spinal anaesthesia performed by an individual anaesthetist with either intrathecal hyperbaric bupivacaine, morphine and fentanyl or intrathecal hyperbaric bupivacaine, morphine, fentanyl and ketamine between June 4, 2020, and June 4, 2022, were retrospectively reviewed. These encounters were reviewed and analyzed from a perioperative pain perspective, with the primary outcome measure as the oral morphine equivalent (OME) usage in the 48 hours post-spinal anaesthesia, and secondary outcome measures including time to breakthrough analgesia, self-reported pain scores at rest and during movement at 24 and 48 hours after surgery, adverse effects of analgesia, complications, and length of stay. There were 26 patients identified who underwent TKR between June 4, 2020, and June 4, 2022, and 25 patients who underwent THR with the same conditions. It was identified that patients who underwent traditional spinal anaesthesia with the addition of ketamine for elective hip- or knee arthroplasty had a lower mean total OME in the 48 hours immediately post-spinal anaesthesia yet had a shorter time to breakthrough analgesia administration. The proposed mechanism of action for intrathecal ketamine as an additive to traditional spinal anaesthesia for elective hip- or knee arthroplasty is that it may prolong and attenuate the analgesic effect of traditional spinal anaesthesia. There were no significant differences identified in comparing the efficacy and safety of adding intrathecal ketamine to spinal anaesthesia for hip- or knee arthroplasty with spinal anaesthesia for hip- or knee arthroplasty without intrathecal ketamine.Keywords: anaesthesia, spinal, intra-thecal, ketamine, spinal-morphine, bupivacaine
Procedia PDF Downloads 52
1942 Uncertainty Quantification of Fuel Compositions on Premixed Bio-Syngas Combustion at High-Pressure
Abstract:
Effect of fuel variabilities on premixed combustion of bio-syngas mixtures is of great importance in bio-syngas utilisation. The uncertainties of concentrations of fuel constituents such as H2, CO and CH4 may lead to unpredictable combustion performances, combustion instabilities and hot spots which may deteriorate and damage the combustion hardware. Numerical modelling and simulations can assist in understanding the behaviour of bio-syngas combustion with pre-defined species concentrations, while the evaluation of variabilities of concentrations is expensive. To be more specific, questions such as ‘what is the burning velocity of bio-syngas at specific equivalence ratio?’ have been answered either experimentally or numerically, while questions such as ‘what is the likelihood of burning velocity when precise concentrations of bio-syngas compositions are unknown, but the concentration ranges are pre-described?’ have not yet been answered. Uncertainty quantification (UQ) methods can be used to tackle such questions and assess the effects of fuel compositions. An efficient probabilistic UQ method based on Polynomial Chaos Expansion (PCE) techniques is employed in this study. The method relies on representing random variables (combustion performances) with orthogonal polynomials such as Legendre or Gaussian polynomials. The constructed PCE via Galerkin Projection provides easy access to global sensitivities such as main, joint and total Sobol indices. In this study, impacts of fuel compositions on combustion (adiabatic flame temperature and laminar flame speed) of bio-syngas fuel mixtures are presented invoking this PCE technique at several equivalence ratios. High-pressure effects on bio-syngas combustion instability are obtained using detailed chemical mechanism - the San Diego Mechanism. Guidance on reducing combustion instability from upstream biomass gasification process is provided by quantifying the significant contributions of composition variations to variance of physicochemical properties of bio-syngas combustion. It was found that flame speed is very sensitive to hydrogen variability in bio-syngas, and reducing hydrogen uncertainty from upstream biomass gasification processes can greatly reduce bio-syngas combustion instability. Variation of methane concentration, although thought to be important, has limited impacts on laminar flame instabilities especially for lean combustion. Further studies on the UQ of percentage concentration of hydrogen in bio-syngas can be conducted to guide the safer use of bio-syngas.Keywords: bio-syngas combustion, clean energy utilisation, fuel variability, PCE, targeted uncertainty reduction, uncertainty quantification
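The study constructs the PCE via Galerkin projection and evaluates flame properties with the San Diego mechanism; none of that tooling is reproduced here. The sketch below only illustrates the general idea on a toy problem: a small Legendre PCE fitted by least squares to a made-up two-variable stand-in for laminar flame speed, with main and total Sobol indices read off the coefficients. The response function, variable names, sample size, and polynomial order are assumptions for illustration.

```python
import numpy as np
from itertools import product
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

# Toy stand-in for a flame-speed model: inputs are H2 and CH4 composition
# perturbations mapped to [-1, 1]; the real study would call a chemistry solver.
def toy_flame_speed(xi):
    h2, ch4 = xi[..., 0], xi[..., 1]
    return 0.35 + 0.10 * h2 + 0.02 * ch4 + 0.03 * h2**2 + 0.005 * h2 * ch4

def legendre_1d(n, x):
    """Legendre polynomial of degree n, orthonormal w.r.t. the uniform density on [-1, 1]."""
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0
    return np.sqrt(2 * n + 1) * legendre.legval(x, coeffs)

degree, dim = 3, 2
multi_indices = [m for m in product(range(degree + 1), repeat=dim) if sum(m) <= degree]

# Least-squares fit of the PCE coefficients on random samples.
X = rng.uniform(-1.0, 1.0, size=(400, dim))
Y = toy_flame_speed(X)
Psi = np.column_stack([
    np.prod([legendre_1d(mi, X[:, d]) for d, mi in enumerate(m)], axis=0)
    for m in multi_indices
])
coef, *_ = np.linalg.lstsq(Psi, Y, rcond=None)

# Sobol indices follow directly from the PCE coefficients.
variance = np.sum(coef[1:] ** 2)
for d, name in enumerate(["H2", "CH4"]):
    main = sum(c**2 for c, m in zip(coef, multi_indices)
               if m[d] > 0 and all(mj == 0 for j, mj in enumerate(m) if j != d))
    total = sum(c**2 for c, m in zip(coef, multi_indices) if m[d] > 0)
    print(f"{name}: main Sobol ~ {main / variance:.3f}, total ~ {total / variance:.3f}")
```

The toy response is deliberately hydrogen-dominated, so the printed indices mimic the abstract's qualitative finding that flame speed is far more sensitive to H2 variability than to CH4.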
Procedia PDF Downloads 276
1941 Trial Version of a Systematic Material Selection Tool in Building Element Design
Authors: Mine Koyaz, M. Cem Altun
Abstract:
Selection of the materials satisfying the expected performances is significantly important for any design. Today, with the constantly evolving and developing technologies, the material options are so wide that the necessity of the use of some support tools in the selection process is arising. Therefore, as a sub process of building element design, a systematic material selection tool is developed, that defines four main steps of the material selection; definition, research, comparison and decision. The main purpose of the tool is being an educational instrument that would show a methodic way of material selection in architectural detailing for the use of architecture students. The tool predefines the possible uses of various material databases and other sources of information on material properties. Hence, it is to be used as a guidance for designers, especially with a limited material knowledge and experience. The material selection tool not only embraces technical properties of materials related with building elements’ functional requirements, but also its sensual properties related with the identity of design and its environmental impacts with respect to the sustainability of the design. The method followed in the development of the tool has two main sections; first the examination and application of the existing methods and second the development of trial versions and their applications. Within the scope of the existing methods; design support tools, methodic approaches for the building element design and material selection process, material properties, material databases, methodic approaches for the decision making process are examined. The existing methods are applied by architecture students and newly graduate architects through different design problems. With respect to the results of these applications, strong and weak sides of the existing material selection tools are presented. A main flow chart of the material selection tool has been developed with the objective to apply the strong aspects of the existing methods and develop their weak sides. Through different stages, a different aspect of the material selection process is investigated and the tool took its final form. Systematic material selection tool, within the building element design process, guides the users with a minimum background information, to practically and accurately determine the ideal material that is to be chosen, satisfying the needs of their design. The tool has a flexible structure that answers different needs of different designs and designers. The trial version issued in this paper shows one of the paths that could be followed and illustrates its application over a design problem.Keywords: architectural education, building element design, material selection tool, systematic approach
Procedia PDF Downloads 352
1940 The Influence of Modernity and Globalization upon Language: The Korean Language between Confucianism and Americanization
Authors: Raluca-Ioana Antonescu
Abstract:
The field research of the paper stands at the intersection between Linguistics and Sociology, while the problem of the research is the importance of language in the modernization process and in a globalized society. The research objective is to prove that language is a stimulant for modernity, while it defines the tradition and the culture of a specific society. In order to examine the linguistic change of the Korean language due to the modernity and globalization, the paper tries to answer one main question, What are the changes the Korean language underwent from a traditional version of Korean, towards one influenced by modernity?, and two secondary questions, How are explored in specialized literature the relations between globalization (and modernity) and culture (focusing on language)? and What influences the Korean language? For the purpose of answering the research questions, the paper has the main premise that due to modernity and globalization, the Korean language changed its discourse construction, and two secondary hypothesis, first is that in literature there are not much explored the relations between culture and modernity focusing on the language discourse construction, but more about identity issue and commodification problems, and the second hypothesis is that the Korean language is influenced by traditional values (like Confucianism) while receiving influence also of globalization process (especially from English language). In terms of methodology, the paper will analyze the two main influences upon the Korean language, referring to traditionalism (being defined as the influence of Confucianism) and modernism (as the influence of other countries’ language and culture), and how the Korean language it was constructed and modified due to these two elements. The paper will analyze at what level (grammatical, lexical, etc.) the traditionalism help at the construction of the Korean language, and what are the changes at each level that modernism brought along. As for the results of this research, the influence of modernism changed both lexically and grammatically the Korean language. In 60 years the increase of English influence is astonishing, and this paper shows the main changes the Korean language underwent, like the loanwords (Konglish), but also the reduction of the speech levels and the ease of the register variation use. Therefore the grammatical influence of modernity and globalization could be seen at the reduction of the speech level and register variation, while the lexical change comes with the influence of English language especially, where about 10% of the Korean vocabulary is considered to be loanwords. Also the paper presents the interrelation between traditionalism and modernity, with the example of Konglish, but not only (we can consider also the Korean greetings which are translated by Koreans when they speak in other languages, bringing their cultural characteristics in English discourse construction), which makes the Koreans global, since they speak in an international language, but still local since they cannot get rid completely of their culture.Keywords: Confucianism, globalization, language and linguistic change, modernism, traditionalism
Procedia PDF Downloads 203
1939 Phytochemical Screening and Anti-Hypothyroidism Activity of Lepidium sativum Ethanolic Extract
Authors: Reham Hajomer, Ikram Elsiddig, Amna Hamad
Abstract:
Lepidium sativum (garden cress), belonging to the Brassicaceae family, is an annual herb locally known as El-rshad. In Ayurveda it is an important medicinal plant, traditionally used for the treatment of jaundice, liver problems, spleen diseases, gastrointestinal disorders, menstrual problems, fractures, arthritis, inflammatory conditions and hypothyroidism. Hypothyroidism is a condition in which the thyroid gland does not produce enough thyroid hormones (triiodothyronine, T3, and thyroxine, T4), most commonly caused by iodine deficiency. It is divided into primary and secondary hypothyroidism: the primary form is caused by failure of thyroid function, and the secondary form by failure of adequate thyroid-stimulating hormone (TSH) secretion from the pituitary gland or thyrotropin-releasing hormone (TRH) from the hypothalamus. The disease is most common in women over age 60. The objective of this study was to determine whether Lepidium sativum affects the level of thyroid hormones. The extract was prepared with 96% ethanol using a Soxhlet apparatus. The anti-hypothyroidism activity was tested using thirty male Wistar rats weighing 100-140 g. They were divided into five groups. Group 1 (normal group) was administered only distilled water. Then 10 mg/kg propylthiouracil was added to the drinking water of all other groups to induce hypothyroidism. Group 2 was the negative control without any treatment; Group 3 (test group) was treated with oral administration of 500 mg/kg of the extract; Group 4 was treated with oral administration of 250 mg/kg of the extract; Group 5 (standard group, positive control) was treated with intraperitoneal levothyroxine. All rats were kept for 20 days in the animal house at room temperature with proper ventilation and provided with a standard diet. The results show that the Lepidium sativum extract increased T3 and T4 in the propylthiouracil-induced rats, with values of 0.29 ng/dl T3 and 0.57 U T4 for the 500 mg/kg dose and 0.27 ng/dl T3 and 0.517 U T4 for the 250 mg/kg dose, in comparison with the standard (0.241 ng/dl T3 and 0.516 U T4), so Lepidium sativum can be stimulatory to thyroid function and possesses a significant anti-hypothyroidism effect, with p-values ranging from 0.000006* to 0.893472. In conclusion, from the results obtained, the Lepidium sativum extract was found to possess anti-hypothyroidism effects, acting as an agent that stimulates thyroid hormone secretion.
Keywords: anti-hypothyroidism, extract, lepidium, sativum
Procedia PDF Downloads 205
1938 Handy EKG: Low-Cost ECG For Primary Care Screening In Developing Countries
Authors: Jhiamluka Zservando Solano Velasquez, Raul Palma, Alejandro Calderon, Servio Paguada, Erick Marin, Kellyn Funes, Hana Sandoval, Oscar Hernandez
Abstract:
Background: Screening cardiac conditions in primary care in developing countries can be challenging, and Honduras is not the exception. One of the main limitations is the underfunding of the Healthcare System in general, causing conventional ECG acquisition to become a secondary priority. Objective: Development of a low-cost ECG to improve screening of arrhythmias in primary care and communication with a specialist in secondary and tertiary care. Methods: Design a portable, pocket-size low-cost 3 lead ECG (Handy EKG). The device is autonomous and has Wi-Fi/Bluetooth connectivity options. A mobile app was designed which can access online servers with machine learning, a subset of artificial intelligence to learn from the data and aid clinicians in their interpretation of readings. Additionally, the device would use the online servers to transfer patient’s data and readings to a specialist in secondary and tertiary care. 50 randomized patients volunteer to participate to test the device. The patients had no previous cardiac-related conditions, and readings were taken. One reading was performed with the conventional ECG and 3 readings with the Handy EKG using different lead positions. This project was possible thanks to the funding provided by the National Autonomous University of Honduras. Results: Preliminary results show that the Handy EKG performs readings of the cardiac activity similar to those of a conventional electrocardiograph in lead I, II, and III depending on the position of the leads at a lower cost. The wave and segment duration, amplitude, and morphology of the readings were similar to the conventional ECG, and interpretation was possible to conclude whether there was an arrhythmia or not. Two cases of prolonged PR segment were found in both ECG device readings. Conclusion: Using a Frugal innovation approach can allow lower income countries to develop innovative medical devices such as the Handy EKG to fulfill unmet needs at lower prices without compromising effectiveness, safety, and quality. The Handy EKG provides a solution for primary care screening at a much lower cost and allows for convenient storage of the readings in online servers where clinical data of patients can then be accessed remotely by Cardiology specialists.Keywords: low-cost hardware, portable electrocardiograph, prototype, remote healthcare
Procedia PDF Downloads 180
1937 Peer-to-Peer Mentoring Program for University Students with Disabilities: Self-Report Measures and Academic Outcomes for Program Participants
Authors: Ashleigh Hillier, Jody Goldstein, Lauren Tornatore, Emily Byrne
Abstract:
As individuals with disabilities attend higher education in greater numbers, universities are seeking ways to support the retention and success of these students, beyond the academically based accommodations. Although mentoring programs for this population are being implemented more frequently, there is a lack of empirically validated outcomes which could promote program replication. The research objective of this exploratory study was to examine outcomes for students with disabilities participating in a peer-to-peer mentoring program. Mentees (students with disabilities) met with their mentor (trained upperclassman) once a week for an hour for one semester (14-weeks). Mentors followed a curriculum structured by monthly and weekly goals to guide the sessions. Curriculum topics included socializing on campus, peer pressure, time management, communicating with peers and professors, classroom etiquette, study skills, and seeking help and campus resources. Data was collected over a period of seven semesters resulting in seven separate cohorts (n=46). The impact of the program was measured using quantitative self-report measures as well as qualitative content analysis of focus groups. Academic outcomes (retention, credits earned, and GPA) were compared between those in the mentoring program and a matched group of students registered with Disability Services who did not receive mentoring. In addition, a one-year follow up was conducted to examine the longer term impact of participation. Findings indicated that mentoring had the most impact in knowing how things work at the university, knowing how and where to find opportunities to meet people on campus, and knowing how to access supports. Mentors also provided a supportive relationship to the mentees and helped with social skills. There were no significant differences in academic outcomes between those who were mentored and those in the comparison group. Most mentees reported continuing to benefit from the program one year on, providing support for the retention of knowledge gained and maintenance of positive outcomes over time. In conclusion, while a range of positive outcomes were evidenced, the model was limited in its impact more broadly, particularly with regards to academic success and impacting more complex challenges.Keywords: mentor, outcomes, students with disabilities, university
Procedia PDF Downloads 144
1936 Cartography through Picasso’s Eyes
Authors: Desiree Di Marco
Abstract:
The aim of this work is to show through the lens of art first which kind of reality was the one represented through fascist maps, and second to study the impact of the fascist regime’s cartography (FRC) on observers eye’s. In this study, it is assumed that the FRC’s representation of reality was simplified, timeless, and even a-spatial because it underrates the concept of territoriality. Cubism and Picasso’s paintings will be used as counter-examples to mystify fascist cartography’s ideological assumptions. The difference between the gaze of an observer looking at the surface of a fascist map and the gaze of someone observing a Picasso painting is impressive. Because there is always something dark, hidden, behind and inside a map, the world of fascist maps was a world built starting from the observation of a “window” that distorted reality and trapped the eyes of the observers. Moving across the map, they seem as if they were hypnotized. Cartohypnosis is the state in which the observer finds himself enslaved by the attractive force of the map, which uses a sort of “magic” geography, a geography that, by means of symbolic language, never has as its primary objective the attempt to show us reality in its complexity, but that of performing for its audience. Magical geography and hypnotic cartography in fascism blended together, creating an almost mystical, magical relationship that demystified reality to reduce the world to a conquerable space. This reduction offered the observer the possibility of conceiving new dimensions: of the limit, of the boundary, elements with which the subject felt fully involved and in which the aesthetic force of the images demonstrated all its strength. But in the early 20th century, the combination of art and cartography gave rise to new possibilities. Cubism which, more than all the other artistic currents showed us how much the observation of reality from a single point of view falls within dangerous logic, is an example. Cubism was an artistic movement that brought about a profound transformation in pictorial culture. It was not only a revolution of pictorial space, but it was a revolution of our conception of pictorial space. Up until that time, men and women were more inclined to believe in the power of images and their representations. Cubist painters rebelled against this blindness by claiming that art must always offer an alternative. Indeed the contribution of this work is precisely to show how art can be able to provide alternatives to even the most horrible regimes and the most atrocious human misfortunes. It also enriches the field of cartography because it "reassures" it by showing how much good it can be for cartography if also for other disciplines come close. Only in this way researcher can increase the chances for the cartography of a greater diffusion at the academic level.Keywords: cartography, Picasso, fascism, culture
Procedia PDF Downloads 64
1935 The Safety Related Functions of The Engineered Barriers of the IAEA Borehole Disposal System: The Ghana Pilot Project
Authors: Paul Essel, Eric T. Glover, Gustav Gbeddy, Yaw Adjei-Kyereme, Abdallah M. A. Dawood, Evans M. Ameho, Emmanuel A. Aberikae
Abstract:
Radioactive materials mainly in the form of Sealed Radioactive Sources are being used in various sectors (medicine, agriculture, industry, research, and teaching) for the socio-economic development of Ghana. The use of these beneficial radioactive materials has resulted in an inventory of Disused Sealed Radioactive Sources (DSRS) in storage. Most of the DSRS are legacy/historic sources which cannot be returned to their manufacturer or country of origin. Though small in volume, DSRS can be intensively radioactive and create a significant safety and security liability. They need to be managed in a safe and secure manner in accordance with the fundamental safety objective. The Radioactive Waste Management Center (RWMC) of the Ghana Atomic Energy Commission (GAEC) is currently storing a significant volume of DSRS. The initial activities of the DSRS range from 7.4E+5 Bq to 6.85E+14 Bq. If not managed properly, such DSRS can represent a potential hazard to human health and the environment. Storage is an important interim step, especially for DSRS containing very short-lived radionuclides, which can decay to exemption levels within a few years. Long-term storage, however, is considered an unsustainable option for DSRS with long half-lives hence the need for a disposal facility. The GAEC intends to use the International Atomic Energy Agency’s (IAEA’s) Borehole Disposal System (BDS) to provide a safe, secure, and cost-effective disposal option to dispose of its DSRS in storage. The proposed site for implementation of the BDS is on the GAEC premises at Kwabenya. The site has been characterized to gain a general understanding in terms of its regional setting, its past evolution and likely future natural evolution over the assessment time frame. Due to the long half-lives of some of the radionuclides to be disposed of (Ra-226 with half-life of 1600 years), the engineered barriers of the system must be robust to contain these radionuclides for this long period before they decay to harmless levels. There is the need to assess the safety related functions of the engineered barriers of this disposal system.Keywords: radionuclides, disposal, radioactive waste, engineered barrier
Procedia PDF Downloads 82
1934 An Exploratory Study to Appraise the Current Challenges and Limitations Faced in Applying and Integrating the Historic Building Information Modelling Concept for the Management of Historic Buildings
Authors: Oluwatosin Adewale
Abstract:
The sustainability of built heritage has become a relevant issue in recent years due to the social and economic values associated with these buildings. Heritage buildings provide a means for human perception of culture and represent a legacy of long-existing history; they define the local character of the social world and provide a vital connection to the past with their associated aesthetical and communal benefits. The identified values of heritage buildings have increased the importance of conservation and the lifecycle management of these buildings. The recent developments of digital design technology in engineering and the built environment have led to the adoption of Building Information Modelling (BIM) by the Architecture, Engineering, Construction, and Operations (AECO) industry. BIM provides a platform for the lifecycle management of a construction project through effective collaboration among stakeholders and the analysis of a digital information model. This growth in digital design technology has also made its way into the field of architectural heritage management in the form of Historic Building Information Modelling (HBIM). A reverse engineering process for digital documentation of heritage assets that draws upon similar information management processes as the BIM process. However, despite the several scientific and technical contributions made to the development of the HBIM process, it doesn't remain easy to integrate at the most practical level of heritage asset management. The main objective identified under the scope of the study is to review the limitations and challenges faced by heritage management professionals in adopting an HBIM-based asset management procedure for historic building projects. This paper uses an exploratory study in the form of semi-structured interviews to investigate the research problem. A purposive sample of heritage industry experts and professionals were selected to take part in a semi-structured interview to appraise some of the limitations and challenges they have faced with the integration of HBIM into their project workflows. The findings from this study will present the challenges and limitations faced in applying and integrating the HBIM concept for the management of historic buildings.Keywords: building information modelling, built heritage, heritage asset management, historic building information modelling, lifecycle management
Procedia PDF Downloads 98
1933 Sedimentary, Diagenesis and Evaluation of High Quality Reservoir of Coarse Clastic Rocks in Nearshore Deep Waters in the Dongying Sag; Bohai Bay Basin
Authors: Kouassi Louis Kra
Abstract:
The nearshore deep-water gravity flow deposits in the Northern steep slope of Dongying depression, Bohai Bay basin, have been acknowledged as important reservoirs in the rift lacustrine basin. These deep strata term as coarse clastic sediment, deposit at the root of the slope have complex depositional processes and involve wide diagenetic events which made high-quality reservoir prediction to be complex. Based on the integrated study of seismic interpretation, sedimentary analysis, petrography, cores samples, wireline logging data, 3D seismic and lithological data, the reservoir formation mechanism deciphered. The Geoframe software was used to analyze 3-D seismic data to interpret the stratigraphy and build a sequence stratigraphic framework. Thin section identification, point counts were performed to assess the reservoir characteristics. The software PetroMod 1D of Schlumberger was utilized for the simulation of burial history. CL and SEM analysis were performed to reveal diagenesis sequences. Backscattered electron (BSE) images were recorded for definition of the textural relationships between diagenetic phases. The result showed that the nearshore steep slope deposits mainly consist of conglomerate, gravel sandstone, pebbly sandstone and fine sandstone interbedded with mudstone. The reservoir is characterized by low-porosity and ultra-low permeability. The diagenesis reactions include compaction, precipitation of calcite, dolomite, kaolinite, quartz cement and dissolution of feldspars and rock fragment. The main types of reservoir space are primary intergranular pores, residual intergranular pores, intergranular dissolved pores, intergranular dissolved pores, and fractures. There are three obvious anomalous high-porosity zones in the reservoir. Overpressure and early hydrocarbon filling are the main reason for abnormal secondary pores development. Sedimentary facies control the formation of high-quality reservoir, oil and gas filling preserves secondary pores from late carbonate cementation.Keywords: Bohai Bay, Dongying Sag, deep strata, formation mechanism, high-quality reservoir
Procedia PDF Downloads 135
1932 Evaluation of the Effect of Magnetic Field on Fibroblast Attachment in Contact with PHB/Iron Oxide Nanocomposite
Authors: Shokooh Moghadam, Mohammad Taghi Khorasani, Sajjad Seifi Mofarah, M. Daliri
Abstract:
Over the recent two decades, the use of magnetic materials with the aim of separating target cells and, eventually, treating cancer has increased considerably. Numerous factors can alter the efficacy of this method. In this project, the effect of a magnetic field on the adhesion of PDL and L929 cells to iron oxide/PHB nanocomposites with different iron oxide contents (1%, 2.5%, 5%) has been studied. The nanocomposite consists of a polymeric film of polyhydroxybutyrate (PHB) with γ-Fe2O3 particles of 25 nm average size dispersed in it; during this process, poly(vinyl alcohol) (98% hydrolyzed, molecular weight 78,000) was used as an emulsifier to achieve uniform distribution. In order to obtain a homogeneous film, the solution of PHB and iron oxide nanoparticles was placed in a dry freezer and in liquid nitrogen, which resulted in a uniform porous scaffold, and a press at 100 °C was used to remove the porosity. After the synthesis of the desired nanocomposite film, several tests were performed. First, the particle size and distribution in the film were evaluated by transmission electron microscopy (TEM), and FTIR analysis and DMTA tests were run in order to verify the chemical bonds and the mechanical properties of the nanocomposites, respectively. By comparing the curves of the case and control samples, it was established that adding nanoparticles caused an increase in the crystallization temperature, and a higher γ-Fe2O3 content led to a higher Tg (glass transition temperature). Furthermore, the dispersion range and the damping properties of the samples increased. Moreover, the toxicity, morphological changes and adhesion of fibroblast and cancer cells were evaluated by a variety of tests. All samples, at different densities, were kept in contact with cells for 24 and 48 hours within a magnetic field of 2×10^-3 Tesla. After 48 hours, the samples were imaged with optical and scanning electron microscopes, and no sign of toxicity was traced. The number of cancer cells in the case group was somewhat higher than in the control group. However, there are still many gaps and unclear aspects concerning the use of magnetic fields and their effects in the treatment of cancer and other diseases; nevertheless, prominent steps have been taken in this direction in recent years, and we hope this project can be at least a small step forward on this issue.
Keywords: nanocomposite, cell attachment, magnetic field, cytotoxicity
Procedia PDF Downloads 259
1931 Modeling the Effects of Leachate-Impacted Groundwater on the Water Quality of a Large Tidal River
Authors: Emery Coppola Jr., Marwan Sadat, Il Kim, Diane Trube, Richard Kurisko
Abstract:
Contamination sites like landfills often pose significant risks to receptors like surface water bodies. Surface water bodies are often a source of recreation, including fishing and swimming, which not only enhances their value but also serves as a direct exposure pathway to humans, increasing their need for protection from water quality degradation. In this paper, a case study presents the potential effects of leachate-impacted groundwater from a large closed sanitary landfill on the surface water quality of the nearby Raritan River, situated in New Jersey. The study, performed over a two year period, included in-depth field evaluation of both the groundwater and surface water systems, and was supplemented by computer modeling. The analysis required delineation of a representative average daily groundwater discharge from the Landfill shoreline into the large, highly tidal Raritan River, with a corresponding estimate of daily mass loading of potential contaminants of concern. The average daily groundwater discharge into the river was estimated from a high-resolution water level study and a 24-hour constant-rate aquifer pumping test. The significant tidal effects induced on groundwater levels during the aquifer pumping test were filtered out using an advanced algorithm, from which aquifer parameter values were estimated using conventional curve match techniques. The estimated hydraulic conductivity values obtained from individual observation wells closely agree with tidally-derived values for the same wells. Numerous models were developed and used to simulate groundwater contaminant transport and surface water quality impacts. MODFLOW with MT3DMS was used to simulate the transport of potential contaminants of concern from the down-gradient edge of the Landfill to the Raritan River shoreline. A surface water dispersion model based upon a bathymetric and flow study of the river was used to simulate the contaminant concentrations over space within the river. The modeling results helped demonstrate that because of natural attenuation, the Landfill does not have a measurable impact on the river, which was confirmed by an extensive surface water quality study.Keywords: groundwater flow and contaminant transport modeling, groundwater/surface water interaction, landfill leachate, surface water quality modeling
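The abstract reports estimating aquifer parameters from a 24-hour constant-rate pumping test by conventional curve matching after filtering out tidal effects. As a hedged illustration only, the sketch below evaluates the Theis drawdown solution that such curve matching typically fits, using scipy; the discharge, transmissivity, storativity and radius values are placeholders, not the site's, and a real fit would wrap this forward model in a least-squares routine.

```python
import numpy as np
from scipy.special import exp1   # exponential integral E1, i.e. the Theis well function W(u)

def theis_drawdown(r, t, Q, T, S):
    """Drawdown s(r, t) for a constant-rate pumping test (Theis solution)."""
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

# Illustrative parameters only: Q in m^3/d, T in m^2/d, S dimensionless, r in m.
Q, T, S, r = 500.0, 250.0, 2e-4, 30.0
times_d = np.logspace(-3, 0, 8)            # 0.001 to 1 day
for t, s in zip(times_d, theis_drawdown(r, times_d, Q, T, S)):
    print(f"t = {t:8.4f} d   s = {s:.3f} m")
```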
Procedia PDF Downloads 261
1930 Spatial Analysis as a Tool to Assess Risk Management in Peru
Authors: Josué Alfredo Tomas Machaca Fajardo, Jhon Elvis Chahua Janampa, Pedro Rau Lavado
Abstract:
A flood vulnerability index was developed for the Piura River watershed in northern Peru using Principal Component Analysis (PCA) to assess flood risk. The official methodology to assess risk from natural hazards in Peru was introduced in 1980 and proved effective for aiding complex decision-making. This method relies in part on decision-makers defining subjective correlations between variables to identify high-risk areas. While risk identification and ensuing response activities benefit from a qualitative understanding of influences, this method does not take advantage of the advent of national and international data collection efforts, which can supplement our understanding of risk. Furthermore, this method does not take advantage of broadly applied statistical methods such as PCA, which highlight central indicators of vulnerability. Nowadays, information processing is much faster and allows for more objective decision-making tools, such as PCA. The approach presented here develops a tool to improve the current flood risk assessment in the Peruvian basin. Hence, the spatial analysis of the census and other datasets provides a better understanding of the current land occupation and a basin-wide distribution of services and human populations, a necessary step toward ultimately reducing flood risk in Peru. PCA allows the simplification of a large number of variables into a few factors regarding social, economic, physical and environmental dimensions of vulnerability. There is a correlation between the location of people and the water availability mainly found in rivers. For this reason, a comprehensive vision of the population location around the river basin is necessary to establish flood prevention policies. The grouping of 5x5 km gridded areas allows the spatial analysis of flood risk rather than assessing political divisions of the territory. The index was applied to the Peruvian region of Piura, where several flood events occurred in recent past years, being one of the most affected regions during the ENSO events in Peru. The analysis evidenced inequalities for the access to basic services, such as water, electricity, internet and sewage, between rural and urban areas.Keywords: assess risk, flood risk, indicators of vulnerability, principal component analysis
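As a rough illustration of how a PCA-based vulnerability index of this kind can be assembled, the sketch below standardizes a handful of hypothetical gridded indicators and takes the first principal component as a composite score; the indicator names, values, and the use of only the first component are assumptions, not the study's actual variable set.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical indicators for five 5x5 km grid cells (names and values invented).
indicators = pd.DataFrame({
    "pop_density":      [120, 450, 80, 1020, 300],
    "pct_no_water":     [0.35, 0.10, 0.60, 0.05, 0.25],
    "pct_no_sewage":    [0.40, 0.15, 0.70, 0.08, 0.30],
    "dist_to_river_km": [0.5, 2.0, 0.2, 5.0, 1.0],
})

Z = StandardScaler().fit_transform(indicators)   # z-score each indicator
pca = PCA(n_components=2).fit(Z)
scores = pca.transform(Z)

# The loadings show how each indicator contributes; the sign of PC1 must be
# checked so that a higher score really means higher vulnerability.
loadings = pd.DataFrame(pca.components_, columns=indicators.columns, index=["PC1", "PC2"])
print(loadings)
print("explained variance ratio:", pca.explained_variance_ratio_)
print("PC1 vulnerability score per cell:", scores[:, 0].round(2))
```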
Procedia PDF Downloads 186
1929 Airborne CO₂ Lidar Measurements for Atmospheric Carbon and Transport: America (ACT-America) Project and Active Sensing of CO₂ Emissions over Nights, Days, and Seasons 2017-2018 Field Campaigns
Authors: Joel F. Campbell, Bing Lin, Michael Obland, Susan Kooi, Tai-Fang Fan, Byron Meadows, Edward Browell, Wayne Erxleben, Doug McGregor, Jeremy Dobler, Sandip Pal, Christopher O'Dell, Ken Davis
Abstract:
The Active Sensing of CO₂ Emissions over Nights, Days, and Seasons (ASCENDS) CarbonHawk Experiment Simulator (ACES) is a NASA Langley Research Center instrument funded by NASA’s Science Mission Directorate that seeks to advance technologies critical to measuring atmospheric column carbon dioxide (CO₂ ) mixing ratios in support of the NASA ASCENDS mission. The ACES instrument, an Intensity-Modulated Continuous-Wave (IM-CW) lidar, was designed for high-altitude aircraft operations and can be directly applied to space instrumentation to meet the ASCENDS mission requirements. The ACES design demonstrates advanced technologies critical for developing an airborne simulator and spaceborne instrument with lower platform consumption of size, mass, and power, and with improved performance. The Atmospheric Carbon and Transport – America (ACT-America) is an Earth Venture Suborbital -2 (EVS-2) mission sponsored by the Earth Science Division of NASA’s Science Mission Directorate. A major objective is to enhance knowledge of the sources/sinks and transport of atmospheric CO₂ through the application of remote and in situ airborne measurements of CO₂ and other atmospheric properties on spatial and temporal scales. ACT-America consists of five campaigns to measure regional carbon and evaluate transport under various meteorological conditions in three regional areas of the Continental United States. Regional CO₂ distributions of the lower atmosphere were observed from the C-130 aircraft by the Harris Corp. Multi-Frequency Fiber Laser Lidar (MFLL) and the ACES lidar. The airborne lidars provide unique data that complement the more traditional in situ sensors. This presentation shows the applications of CO₂ lidars in support of these science needs.Keywords: CO₂ measurement, IMCW, CW lidar, laser spectroscopy
Procedia PDF Downloads 162
1928 Impact of Electric Vehicles on Energy Consumption and Environment
Authors: Amela Ajanovic, Reinhard Haas
Abstract:
Electric vehicles (EVs) are considered an important means to cope with current environmental problems in transport. However, their high capital costs and limited driving ranges pose major barriers to a broader market penetration. The core objective of this paper is to investigate the future market prospects of various types of EVs from an economic and ecological point of view. Our method of approach is based on the calculation of the total cost of ownership of EVs in comparison to conventional cars and a life-cycle approach to assess their environmental benignity. The most crucial parameters in this context are km driven per year, the depreciation time of the car and the interest rate. The analysis of future prospects is based on technological learning regarding the investment costs of batteries. The major results are as follows. The major disadvantages of battery electric vehicles (BEVs) are the high capital costs, mainly due to the battery, and a low driving range in comparison to conventional vehicles. These problems could be reduced with plug-in hybrids (PHEVs) and range extenders (REXs). These technologies have lower CO₂ emissions in the whole energy supply chain than conventional vehicles, but unlike BEVs they are not zero-emission vehicles at the point of use. The number of km driven has a higher impact on total mobility costs than the learning rate. Hence, the use of EVs as taxis and in car-sharing leads to the best economic performance. The most popular EVs are currently full hybrid EVs. They have only slightly higher costs and similar operating ranges to conventional vehicles. But since they are dependent on fossil fuels, they can only be seen as an energy efficiency measure. However, they can serve as a bridging technology, as long as BEVs and fuel cell vehicles do not gain high popularity, and together with PHEVs and REXs contribute to faster technological learning and reductions in battery costs. Regarding the promotion of EVs, the best results could be reached with a combination of monetary and non-monetary incentives, as in Norway, for example. The major conclusion is that to harvest the full environmental benefits of EVs, a very important aspect is the introduction of CO₂-based fuel taxes. This should ensure that the electricity for EVs is generated from renewable energy sources; otherwise, total CO₂ emissions are likely to be higher than those of conventional cars.
Keywords: costs, mobility, policy, sustainability
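The total-cost-of-ownership comparison hinges on annualizing the capital cost with an interest rate and depreciation time and spreading it, together with running costs, over the kilometres driven per year. The sketch below shows that arithmetic in generic form; the vehicle prices, energy uses and energy prices are illustrative placeholders, not the paper's input data.

```python
def annualised_capital_cost(investment, interest_rate, depreciation_years):
    """Investment spread over its lifetime via the capital recovery factor."""
    alpha = (interest_rate * (1 + interest_rate) ** depreciation_years) / (
        (1 + interest_rate) ** depreciation_years - 1
    )
    return alpha * investment

def cost_per_km(investment, interest_rate, years, km_per_year,
                energy_use_per_km, energy_price, other_om_per_year=0.0):
    """Total cost of ownership per km: annualised capital plus running costs."""
    capital = annualised_capital_cost(investment, interest_rate, years)
    running = km_per_year * energy_use_per_km * energy_price + other_om_per_year
    return (capital + running) / km_per_year

# Purely illustrative numbers (EUR), not the paper's data.
bev = cost_per_km(35000, 0.05, 10, 15000, 0.18, 0.20)   # 0.18 kWh/km at 0.20 EUR/kWh
ice = cost_per_km(25000, 0.05, 10, 15000, 0.06, 1.50)   # 0.06 l/km at 1.50 EUR/l
print(f"BEV: {bev:.3f} EUR/km   conventional: {ice:.3f} EUR/km")
```

Raising km_per_year in this formula spreads the fixed capital cost over more kilometres, which is why high-utilization uses such as taxis and car-sharing favour BEVs economically.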
Procedia PDF Downloads 226
1927 Using Corpora in Semantic Studies of English Adjectives
Authors: Oxana Lukoshus
Abstract:
The methods of corpus linguistics, a well-established field of research, are being increasingly applied in cognitive linguistics. Corpora data are especially useful for different quantitative studies of grammatical and other aspects of language. The main objective of this paper is to demonstrate how present-day corpora can be applied in semantic studies in general and in semantic studies of adjectives in particular. Polysemantic adjectives have been the subject of numerous studies. But most of them have been carried out on dictionaries. Undoubtedly, dictionaries are viewed as one of the basic data sources, but only at the initial steps of a research. The author usually starts with the analysis of the lexicographic data after which s/he comes up with a hypothesis. In the research conducted three polysemantic synonyms true, loyal, faithful have been analyzed in terms of differences and similarities in their semantic structure. A corpus-based approach in the study of the above-mentioned adjectives involves the following. After the analysis of the dictionary data there was the reference to the following corpora to study the distributional patterns of the words under study – the British National Corpus (BNC) and the Corpus of Contemporary American English (COCA). These corpora are continually updated and contain thousands of examples of the words under research which make them a useful and convenient data source. For the purpose of this study there were no special needs regarding genre, mode or time of the texts included in the corpora. Out of the range of possibilities offered by corpus-analysis software (e.g. word lists, statistics of word frequencies, etc.), the most useful tool for the semantic analysis was the extracting a list of co-occurrence for the given search words. Searching by lemmas, e.g. true, true to, and grouping the results by lemmas have proved to be the most efficient corpora feature for the adjectives under the study. Following the search process, the corpora provided a list of co-occurrences, which were then to be analyzed and classified. Not every co-occurrence was relevant for the analysis. For example, the phrases like An enormous sense of responsibility to protect the minds and hearts of the faithful from incursions by the state was perceived to be the basic duty of the church leaders or ‘True,’ said Phoebe, ‘but I'd probably get to be a Union Official immediately were left out as in the first example the faithful is a substantivized adjective and in the second example true is used alone with no other parts of speech. The subsequent analysis of the corpora data gave the grounds for the distribution groups of the adjectives under the study which were then investigated with the help of a semantic experiment. To sum it up, the corpora-based approach has proved to be a powerful, reliable and convenient tool to get the data for the further semantic study.Keywords: corpora, corpus-based approach, polysemantic adjectives, semantic studies
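BNC and COCA are queried through their own interfaces, so the sketch below does not reproduce that workflow; it only illustrates the underlying idea of extracting a co-occurrence list for a search word from running text, using a simple fixed-window counter over a toy sentence. The window size, tokenization and sample text are arbitrary choices for illustration.

```python
from collections import Counter
import re

def co_occurrences(text, node, window=4):
    """Count the words appearing within +/- window tokens of the node word."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == node:
            left = max(0, i - window)
            context = tokens[left:i] + tokens[i + 1:i + 1 + window]
            counts.update(context)
    return counts

sample = ("He stayed true to his word, a loyal friend and a faithful companion; "
          "true friends stay loyal.")
for adj in ("true", "loyal", "faithful"):
    print(adj, co_occurrences(sample, adj).most_common(5))
```

On a full corpus export, the resulting lists would still need the kind of manual filtering the abstract describes (e.g., discarding substantivized uses such as "the faithful").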
Procedia PDF Downloads 314
1926 Hepatitis B, Hepatitis C and HIV Infections and Associated Risk Factors among Substance Abusers in Mekelle Substance Users Treatment and Rehabilitation Centers, Tigrai, Northern Ethiopia
Authors: Tadele Araya, Tsehaye Asmelash, Girmatsion Fiseha
Abstract:
Background: Hepatitis B virus (HBV), hepatitis C virus (HCV) and human immunodeficiency virus (HIV) constitute serious healthcare problems worldwide. The blood-borne pathogens HBV, HCV and HIV are commonly associated with infections among substance users or injection drug users (IDUs). The objective of this study was to determine the prevalence of HBV, HCV, and HIV infections among substance users in Mekelle substance users treatment and rehabilitation centers. Methods: A cross-sectional study design was used from December 2020 to September 2021. A total of 600 substance users were included. Data regarding the socio-demographic characteristics, clinical history and sexual behaviors of the substance users were collected using a structured questionnaire. For laboratory analysis, 5-10 ml of venous blood was taken from the substance users. The laboratory analysis was performed by enzyme-linked immunosorbent assay (ELISA) at Mekelle University, Department of Medical Microbiology and Immunology Research Laboratory. The data were analyzed using SPSS and Epi-data. The association of variables with HBV, HCV and HIV infections was determined using multivariate analysis, and a p-value < 0.05 was considered statistically significant. Results: The overall prevalence rates of HBV, HCV and HIV infections were 10%, 6.6%, and 7.5%, respectively. The mean age of the study participants was 28.12 ± 6.9 years. A higher prevalence of HBV infection was seen in participants who injected drugs and in those who were infected with HIV. HCV was comparatively higher in those who had a previous history of unsafe surgical procedures than in their counterparts. Homeless participants were more exposed to HCV and HIV infections than their counterparts. The prevalence of HBV/HIV co-infection was 3.5%. Unprotected sexual practices [p=0.03], injection drug use [p=0.03], having an HBV-infected person in the family [p=0.02] and HIV infection [p=0.025] were statistically associated with HBV infection. HCV was significantly associated with substance use and a previous history of unsafe surgical procedures [p=0.03 and p=0.04, respectively]. HIV was significantly associated with unprotected sexual practices and homelessness [p=0.045 and p=0.05, respectively]. Conclusion: HBV was the most prevalent of the viral infections, and there was a high prevalence of HBV/HIV co-infection. The presence of an HBV-infected person in the family, unprotected sexual practices and sharing of needles for drug injection were the risk factors associated with HBV, HIV, and HCV. Continuous health education and screening for the viral infections, coupled with medical and psychological treatment, are mandatory for the prevention and control of these infections.
Keywords: hepatitis b virus, hepatitis c virus, HIV, substance users
Procedia PDF Downloads 85
1925 Quantum Cum Synaptic-Neuronal Paradigm and Schema for Human Speech Output and Autism
Authors: Gobinathan Devathasan, Kezia Devathasan
Abstract:
Objective: To improve the current modified Broca-Wernicke-Lichtheim-Kussmaul speech schema and provide insight into autism. Methods: We reviewed the pertinent literature. Current findings, involving Brodmann areas 22, 46, 9,44,45,6,4 are based on neuropathology and functional MRI studies. However, in primary autism, there is no lucid explanation and changes described, whether neuropathology or functional MRI, appear consequential. Findings: We forward an enhanced model which may explain the enigma related to autism. Vowel output is subcortical and does need cortical representation whereas consonant speech is cortical in origin. Left lateralization is needed to commence the circuitry spin as our life have evolved with L-amino acids and left spin of electrons. A fundamental species difference is we are capable of three syllable-consonants and bi-syllable expression whereas cetaceans and songbirds are confined to single or dual consonants. The 4 key sites for speech are superior auditory cortex, Broca’s two areas, and the supplementary motor cortex. Using the Argand’s diagram and Reimann’s projection, we theorize that the Euclidean three dimensional synaptic neuronal circuits of speech are quantized to coherent waves, and then decoherence takes place at area 6 (spherical representation). In this quantum state complex, 3-consonant languages are instantaneously integrated and multiple languages can be learned, verbalized and differentiated. Conclusion: We postulate that evolutionary human speech is elevated to quantum interaction unlike cetaceans and birds to achieve the three consonants/bi-syllable speech. In classical primary autism, the sudden speech switches off and on noted in several cases could now be explained not by any anatomical lesion but failure of coherence. Area 6 projects directly into prefrontal saccadic area (8); and this further explains the second primary feature in autism: lack of eye contact. The third feature which is repetitive finger gestures, located adjacent to the speech/motor areas, are actual attempts to communicate with the autistic child akin to sign language for the deaf.Keywords: quantum neuronal paradigm, cetaceans and human speech, autism and rapid magnetic stimulation, coherence and decoherence of speech
Procedia PDF Downloads 195
1924 Influential Factors Impacting the Utilization of Pain Assessment Tools among Hospitalized Elderly Patients in Taiwan
Authors: Huei Jiun Chen, Hui Mei Huan
Abstract:
Introduction: Pain is an unpleasant experience for hospitalized patients that impacts both their physical and mental well-being. Selecting appropriate pain assessment tools is therefore important to ensure effective pain management, and the Verbal Rating Scale (VRS) has been suggested as a suitable tool for this purpose. The Wong-Baker FACES Pain Rating Scale (WBS) is a widely used pain assessment tool in Taiwan that helps individuals communicate the intensity of their pain. However, in clinical practice, even when various assessment tools are available to evaluate pain, the Numeric Rating Scale-11 (NRS-11) is still the one most commonly used to quantify pain intensity. The correlation between the NRS and other pain assessment tools has not been extensively explored in Taiwan, nor has the influence of gender and education level on pain assessment among elderly individuals. The aim of this study is to investigate the correlation between pain assessment scales (NRS-11, VRS, WBS) in assessing pain intensity among elderly inpatients. The secondary objective is to examine how gender and education level influence pain assessment and to explore patients' preferences regarding pain assessment tools. Method: A questionnaire survey with purposive sampling was employed to recruit participants from a medical center in central Taiwan. Participants were asked to rate their pain intensity over the past 24 hours using the NRS-11, VRS, and WBS; the study also investigated their preferences for pain assessment tools. Results: A total of 252 participants were included, with a mean age of 71.1 years (SD = 6.2). Of these, 135 were male (53.6%), and 44.4% had a primary-level education or below. Participants used the NRS-11, VRS, and WBS to rate their current, maximum, and minimum pain intensity over the past 24 hours. The three pain assessment tools were significantly correlated (p < .01). No significant gender differences were observed on any of the three pain assessment scales. For severe pain, self-rated pain scores differed significantly among elderly participants with different education levels (F = 3.08, p < .01; χ² = 17.25, χ² = 17.21, p < .01), but no significant differences were observed for mild pain. Regarding preferences for pain assessment tools, 158 participants (62.7%) favored the VRS, followed by the WBS; gender and education level had no influence on these preferences. Conclusion: Most elderly participants preferred using the VRS to self-report their pain. This preference may be attributed to the verbal nature of the VRS, which is simple and easy to understand, and may also be associated with the participants' education levels. Pain assessed using the VRS correlated significantly with the NRS-11 and WBS, and gender was not found to influence these assessments. Further research is needed to explore the effect of different education levels on self-reported pain intensity among elderly people in Taiwan. Keywords: pain assessment, elderly, gender, education
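Since the study's central result is the significant correlation among the NRS-11, VRS, and WBS, a minimal sketch of such a correlation analysis follows. The data are simulated, and the choice of Spearman's rank correlation is an assumption about a suitable method for ordinal pain ratings, not a claim about the authors' procedure.

```python
# Illustrative only: correlating self-reported pain intensity across the three
# scales named in the abstract (NRS-11, VRS, WBS) on simulated ratings.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 252                                   # sample size reported in the abstract
true_pain = rng.uniform(0, 10, n)         # latent pain intensity
nrs11 = np.clip(np.round(true_pain + rng.normal(0, 1, n)), 0, 10)            # 0-10 numeric scale
vrs = np.digitize(true_pain, [2.5, 5.0, 7.5])                                # 0=none ... 3=severe
wbs = np.clip(np.round((true_pain + rng.normal(0, 1, n)) / 2) * 2, 0, 10)    # 0,2,...,10 faces

for name, (a, b) in [("NRS-11 vs VRS", (nrs11, vrs)),
                     ("NRS-11 vs WBS", (nrs11, wbs)),
                     ("VRS vs WBS", (vrs, wbs))]:
    rho, p = spearmanr(a, b)
    print(f"{name}: rho = {rho:.2f}, p = {p:.3g}")
```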
Procedia PDF Downloads 76
1923 A Comparative Study on the Development of Webquest and Online Treasure Hunt as Instructional Materials in Teaching Motion in One Dimension for Grade VII Students
Authors: Mark Anthony Burdeos, Kara Ella Catoto, Alraine Pauyon, Elesar Malicoban
Abstract:
This study sought to develop, validate, and implement the WebQuest and Online Treasure Hunt as instructional materials in teaching Motion in One Dimension to Grade 7 students and to determine their effects on the students' conceptual learning, performance and attitude towards Physics. The development stage involved several steps, such as planning and developing the WebQuest and Online Treasure Hunt and preparing the lesson plan and achievement test. The content and the ICT (Information and Communications Technology) aspects of the developed instructional materials were evaluated by content and ICT experts using adapted evaluation forms. During implementation, a pretest and posttest were administered to determine students' performance, and pre-attitude and post-attitude tests were used to investigate students' attitudes towards Physics before and after the WebQuest and Online Treasure Hunt activity. The developed WebQuest and Online Treasure Hunt passed the validation of the content and ICT experts. Students acquired more knowledge of Motion in One Dimension and gained a more positive attitude towards Physics after using the WebQuest and Online Treasure Hunt, as evidenced by significantly higher posttest scores compared to pretest scores and higher post-attitude than pre-attitude ratings. The developed WebQuest and Online Treasure Hunt were shown to be of good quality and to be effective materials for teaching Motion in One Dimension and developing a positive attitude towards Physics. However, students performed better on the pretest and posttest and rated higher on the pre-attitude and post-attitude tests with the WebQuest than with the Online Treasure Hunt. This study provides significant learning experiences that help students build knowledge, understand concepts clearly, exercise higher-order thinking skills, and relate Physics topics to real-life situations, thereby enabling in-depth learning about Motion in One Dimension. It also helps teachers enhance their teaching strategies, as the two instructional materials provide interesting, engaging, and innovative teaching-learning experiences that increase learners' motivation and participation in learning Physics. In addition, it offers a reference for using technology in the classroom and for determining which of the two instructional materials, WebQuest and Online Treasure Hunt, is more suitable for the teaching-learning process in Motion in One Dimension. Keywords: ICT integration, motion in one dimension, online treasure hunt, Webquest
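The reported gains rest on pretest-posttest comparisons. The snippet below is a minimal, hypothetical example of such a comparison using a paired t-test on simulated achievement scores; the group size and score distributions are assumptions, not data from the study.

```python
# A minimal sketch, not the authors' analysis: paired t-test on simulated
# pretest and posttest achievement scores for one instructional material.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(7)
pretest = rng.normal(18, 4, 40)             # hypothetical 40 Grade 7 students
posttest = pretest + rng.normal(6, 3, 40)   # simulated gain after the activity

t, p = ttest_rel(posttest, pretest)
print(f"mean gain = {np.mean(posttest - pretest):.1f}, t = {t:.2f}, p = {p:.4f}")
```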
Procedia PDF Downloads 176
1922 A Survey and Analysis on Inflammatory Pain Detection and Standard Protocol Selection Using Medical Infrared Thermography from Image Processing View Point
Authors: Mrinal Kanti Bhowmik, Shawli Bardhan Jr., Debotosh Bhattacharjee
Abstract:
Human skin, having a temperature above absolute zero, emits infrared radiation related to body temperature. Differences in the infrared radiation from the skin surface reflect abnormalities present in the human body. Detecting and forecasting these temperature variations of the skin surface is the main objective of using Medical Infrared Thermography (MIT) as a diagnostic tool for pain detection. MIT is a non-invasive imaging technique that records and monitors the temperature distribution of the body by receiving the infrared radiation emitted from the skin and representing it as a thermogram. The intensity of the thermogram measures the inflammation at the skin surface related to pain in the human body. Analysis of thermograms provides automated anomaly detection associated with suspicious pain regions by following several image processing steps. The paper presents a rigorous survey of the processing and analysis of thermograms based on previous works published in the area of infrared thermal imaging for detecting inflammatory pain diseases such as arthritis, spondylosis, shoulder impingement, etc. The study also explores the performance analysis of thermogram processing, together with thermogram acquisition protocols, thermography camera specifications and the types of pain detected by thermography, in a summarized tabular format. The tabular format provides a clear structural view of past works. The major contribution of the paper is a new thermogram acquisition standard for inflammatory pain detection in the human body to enhance the performance rate. The FLIR T650sc infrared camera, with high sensitivity and resolution, is adopted to increase the accuracy of thermogram acquisition and analysis. The survey of previous research highlights that intensity-distribution-based comparison of comparable and symmetric regions of interest, together with their statistical analysis, yields adequate results for identifying and detecting physiological disorders related to inflammatory diseases. Keywords: acquisition protocol, inflammatory pain detection, medical infrared thermography (MIT), statistical analysis
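The survey's conclusion centers on intensity-distribution comparison of symmetric regions of interest. The following sketch illustrates that idea on a synthetic thermogram: it compares a suspect ROI with its contralateral mirror and flags the asymmetry. The array values, ROI coordinates and the 0.5 °C decision threshold are illustrative assumptions, not parameters from the surveyed works.

```python
# Hedged illustration of ROI-symmetry comparison on a synthetic thermogram.
import numpy as np

rng = np.random.default_rng(3)
thermogram = rng.normal(32.0, 0.3, (240, 320))   # synthetic skin-temperature map (°C)
thermogram[100:140, 200:240] += 1.2              # simulated inflamed (warmer) patch

def roi_stats(img, rows, cols):
    roi = img[rows[0]:rows[1], cols[0]:cols[1]]
    return roi.mean(), roi.std()

left_mean, left_std = roi_stats(thermogram, (100, 140), (80, 120))     # reference side
right_mean, right_std = roi_stats(thermogram, (100, 140), (200, 240))  # suspect side

delta = abs(right_mean - left_mean)
print(f"ΔT between symmetric ROIs = {delta:.2f} °C")
if delta > 0.5:                                  # assumed decision threshold
    print("Asymmetry exceeds threshold: region flagged for clinical review")
```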
Procedia PDF Downloads 343
1921 An Exploratory Study of Vocational High School Students’ Needs in Learning English
Authors: Yi-Hsuan Gloria Lo
Abstract:
The educational objective of vocational high schools (VHSs) is to equip VHS students with practical skills and knowledge that can be applied in the job market. However, with the increasing number of technological universities over the past two decades, the majority of VHS students have chosen to pursue higher education rather than enter the job market. VHS English education has thus been confronting a dilemma: should an English for specific purposes (ESP) approach, which aligns with the educational goal of VHS education, be taken, or should an English for general purposes (EGP) approach, which prepares VHS students for advanced studies in universities, be followed? While ESP theorists have proposed that ESP can be taught to secondary learners, little is known about VHS students' perspective on this ESP-versus-EGP dilemma. Scant research has investigated the different facets of students' needs (necessities, wants, and lacks) for both ESP and EGP in terms of the four language skills and the factors that contribute to any differences. To address this gap in the literature, 100 VHS students responded to statements related to their necessities, wants, and lacks in learning ESP and EGP on a 6-point Likert scale. Six VHS students were interviewed to tap into the reasons behind the different facets of their needs for learning EGP and ESP. The statistical analysis indicates that, at this stage of learning English, the VHS subjects believed EGP to be both more necessary and more desirable than ESP. However, they reported that they were more lacking in ESP than in EGP learning. Regarding EGP, the results show that the VHS subjects rated speaking as their most necessary skill, speaking as their most desirable skill, and writing as their most lacking skill. A significant difference was found between perceived learning necessities and lacks and between perceived wants and lacks; no statistical difference was found between necessities and wants. Regarding ESP, the results indicate that the VHS subjects marked reading as their most necessary skill, speaking as their most desirable skill, and writing as their most lacking skill. A significant difference exists between their perceived necessities and lacks and between their wants and lacks; however, there is no statistically significant difference between their perceived necessities and wants. Despite the lack of a significant difference between learning necessities and wants, the qualitative interview data reveal that the reasons for the perceived necessities and wants were different. The findings of the study confirm previous research demonstrating that 'needs' is a multiple and conflicting construct. What VHS students felt most lacking was not necessarily what they believed they should learn or would like to learn. Although no statistical difference was found, different reasons were attributed to their perceived necessities and wants. Both theoretical and practical implications are drawn and discussed for ESP research in general and for teaching ESP in VHSs in particular. Keywords: vocational high schools (VHSs), English for General Purposes (EGP), English for Specific Purposes (ESP), needs analysis
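The needs analysis contrasts paired facets (necessities, wants, lacks) rated on a 6-point Likert scale. As a loose illustration only, the sketch below runs a Wilcoxon signed-rank test on simulated necessity and lack ratings; the test choice and the data are assumptions rather than the study's method.

```python
# Illustrative paired comparison of two needs facets on simulated Likert data.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(11)
necessities = rng.integers(3, 7, 100)                          # 100 students, ratings 3-6
lacks = np.clip(necessities - rng.integers(0, 3, 100), 1, 6)   # simulated lower "lack" ratings

stat, p = wilcoxon(necessities, lacks)
print(f"Wilcoxon W = {stat:.1f}, p = {p:.4f}")
```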
Procedia PDF Downloads 171
1920 Design Evaluation Tool for Small Wind Turbine Systems Based on the Simple Load Model
Authors: Jihane Bouabid
Abstract:
The urgency of transitioning towards sustainable energy sources has become imperative. Today, in the 21st century, society expects technological advancement and improvement and anticipates expeditious outcomes as an integral component of its pursuit of an elevated standard of living. As part of empowering human development, driving economic growth and meeting social needs, access to energy services has become a necessity. As part of these improvements, we introduce the project "Mywindturbine" - an interactive web user interface for design and analysis in the field of wind energy, with particular adherence to the IEC (International Electrotechnical Commission) standard 61400-2 "Wind turbines – Part 2: Design requirements for small wind turbines". Wind turbines play a pivotal role in Morocco's renewable energy strategy, leveraging the nation's abundant wind resources. The IEC 61400-2 standard ensures the safety and design integrity of small wind turbines deployed in Morocco, providing guidelines for performance and safety protocols. Conformity with this standard ensures turbine reliability, facilitates standards alignment, and accelerates the integration of wind energy into Morocco's energy landscape. The GUI (graphical user interface) is aimed at engineers and professionals in the field of wind energy systems who would like to design a small wind turbine system following the safety requirements of the international standard IEC 61400-2. The interface provides an easy way to analyze the structure of the turbine machine under normal and extreme load conditions based on the specific inputs provided by the user. The platform introduces an overview of sustainability and renewable energy, with a focus on wind turbines. It features a cross-examination of the input parameters provided by the user for the SLM (Simple Load Model) of small wind turbines and produces an analysis according to the IEC 61400-2 standard. The analysis of the simple load model encompasses calculations of fatigue loads on the blades and rotor shaft, yaw error loads on the blades, etc., for small wind turbine performance. Through its structured framework and adherence to the IEC standard, "Mywindturbine" aims to empower professionals, engineers, and intellectuals with the knowledge and tools necessary to contribute towards a sustainable energy future. Keywords: small wind turbine, IEC 61400-2 standard, user interface, simple load model
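To give a feel for what a simple-load-model analysis involves, the sketch below computes rough centrifugal and torque-driven blade load ranges for an assumed small turbine. The expressions are simplified illustrations in the spirit of such calculations and are not the exact equations or safety factors of IEC 61400-2, which should be consulted directly.

```python
# Rough, assumed simple-load-model-style estimates for a hypothetical small turbine.
# Not the IEC 61400-2 equations; all parameters are illustrative.
import math

P_design = 5_000.0   # design power, W (assumed)
n_design = 200.0     # design rotor speed, rpm (assumed)
B = 3                # number of blades
m_blade = 6.5        # blade mass, kg (assumed)
R_cog = 0.9          # blade centre of gravity radius, m (assumed)
eta = 0.85           # assumed drivetrain efficiency

omega = 2.0 * math.pi * n_design / 60.0        # design rotor speed, rad/s
Q_design = P_design / (eta * omega)            # design shaft torque, N·m

delta_F_centrifugal = 2.0 * m_blade * R_cog * omega**2              # blade root axial load range, N
delta_M_edgewise = Q_design / B + 2.0 * m_blade * 9.81 * R_cog      # edgewise moment range, N·m

print(f"Design torque:          {Q_design:8.1f} N·m")
print(f"Centrifugal load range: {delta_F_centrifugal:8.1f} N")
print(f"Edgewise moment range:  {delta_M_edgewise:8.1f} N·m")
```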
Procedia PDF Downloads 63
1919 How Technology Can Help Teachers in Reflective Practice
Authors: Ambika Perisamy, Asyriawati binte Mohd Hamzah
Abstract:
The focus of this presentation is to discuss teacher professional development (TPD) through the use of technology. TPD is necessary to prepare teachers for the future challenges they will face throughout their careers and to develop new skills and good teaching practices. We also discuss current issues in embracing technology in the field of early childhood education and its impact on the professional development of teachers. Participants will also learn to apply teaching and learning practices through the use of technology. One major objective of this presentation is to coherently fuse practical, technological and theoretical content. The process begins by concretizing a set of preconceived ideas, which then need to be joined with theoretical justifications found in the literature. Technology can make observations fairer, more reliable, easier to implement, and preferable to teachers and principals. Technology can also help principals to improve classroom observations of teachers and ultimately improve teachers' continuous professional development. Video technology allows early childhood teachers to record and keep videos for reflection at any time; it also provides opportunities for them to share the recordings with their principals for professional dialogue and continuous professional development planning. A total of 10 early childhood teachers and 4 principals were involved in these efforts, which identified and analyzed the gaps in the quality of classroom observations and their relation to developing teachers as reflective practitioners. The methodology involves active exploration with video recordings, conversations, interviews and authentic teacher-child interactions, which together form the key thrust in improving teaching and learning practice. A qualitative analysis of photographs, videos and transcripts illustrating teachers' reflections, together with classroom observation checklists completed before and after the use of video technology, was adopted. Arguably, although PD support can be strong, if teachers cannot connect with or create meaning from the opportunities made available to them, they may remain passive or uninvolved. Therefore, teachers must see the value of applying new ideas such as technology and new approaches to practice while creating personal meaning out of professional development. The video recordings are transferable and can be shared and edited through social media, email and shared storage between teachers and principals. To conclude, underscoring the importance of reflective practice among early childhood teachers and addressing the concerns raised before and after the use of video technology, teachers and principals shared their views on the feasibility, practicality and relevance of video technology. Keywords: early childhood education, reflective, improve teaching and learning, technology
Procedia PDF Downloads 502
1918 Hidro-IA: An Artificial Intelligent Tool Applied to Optimize the Operation Planning of Hydrothermal Systems with Historical Streamflow
Authors: Thiago Ribeiro de Alencar, Jacyro Gramulia Junior, Patricia Teixeira Leite
Abstract:
The area of the electricity sector that coordinates the use of hydroelectric plants to meet energy needs is called Operation Planning of Hydrothermal Power Systems (OPHPS). Its purpose is to find an operating policy that provides electrical power to the system over a given period, with reliability and at minimal cost. It is therefore necessary to determine an optimal generation schedule for each hydroelectric plant in each interval, so that the system meets demand reliably, avoiding rationing in years of severe drought, and minimizes the expected cost of operation over the planning horizon, defining an appropriate strategy for thermal complementation. Several optimization algorithms specifically applied to this problem have been developed and are in use. Although they provide solutions to various problems encountered, these algorithms have some weaknesses, such as difficulties in convergence, simplification of the original formulation of the problem, or limitations owing to the complexity of the objective function. An alternative to these challenges is the development of more sophisticated and reliable simulation and optimization techniques that can assist operation planning. Thus, this paper presents the development of a computational tool, namely Hydro-IA, for solving the identified optimization problem while providing the user with easy handling. The intelligent optimization technique adopted is the Genetic Algorithm (GA), and the programming language is Java. The chromosomes were modeled first, then the problem's fitness function and the operators involved were implemented, and finally the graphical interfaces for user access were developed. The results obtained with the Genetic Algorithm were compared with the nonlinear programming (NLP) optimization technique. Tests were conducted with seven hydraulically interconnected hydroelectric plants using historical streamflow from 1953 to 1955. The comparison between the GA and NLP techniques shows that the operating cost obtained with the GA becomes increasingly smaller than that obtained with NLP as the number of interconnected hydroelectric plants increases. The program achieved coherent performance in solving the problem without requiring simplification of the calculations, together with ease of manipulating the simulation parameters and visualizing the output results. Keywords: energy, optimization, hydrothermal power systems, artificial intelligence and genetic algorithms
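A compact sketch of the genetic-algorithm idea described above follows: chromosomes encode dispatch fractions for each plant in each period, and the fitness penalizes the thermal complementation needed to meet demand. The plant capacities, demand profile and cost shape are illustrative assumptions, and the sketch is in Python rather than the Java implementation of Hydro-IA.

```python
# Minimal GA sketch for a toy hydrothermal dispatch problem (illustrative only).
import numpy as np

rng = np.random.default_rng(42)
N_PLANTS, N_PERIODS, POP, GENS = 7, 12, 60, 200
capacity = rng.uniform(80, 150, N_PLANTS)        # MW per plant (assumed)
demand = rng.uniform(500, 750, N_PERIODS)        # MW per period (assumed)

def fitness(chrom):
    # chrom: (N_PLANTS, N_PERIODS) dispatch fractions in [0, 1]; lower is better
    hydro = (chrom * capacity[:, None]).sum(axis=0)
    thermal = np.maximum(demand - hydro, 0.0)     # deficit covered by thermal plants
    spill = np.maximum(hydro - demand, 0.0)       # wasted hydro generation
    return (thermal ** 2).sum() + 10.0 * spill.sum()

pop = rng.uniform(0, 1, (POP, N_PLANTS, N_PERIODS))
for _ in range(GENS):
    scores = np.array([fitness(c) for c in pop])
    elite = pop[np.argsort(scores)[: POP // 2]]                  # selection: keep the better half
    parents = elite[rng.integers(0, len(elite), (POP // 2, 2))]
    cuts = rng.integers(1, N_PERIODS, POP // 2)
    children = parents[:, 0].copy()
    for i, c in enumerate(cuts):                                 # one-point crossover over periods
        children[i, :, c:] = parents[i, 1, :, c:]
    children += rng.normal(0, 0.05, children.shape)              # mutation
    pop = np.clip(np.concatenate([elite, children]), 0, 1)

best = pop[np.argmin([fitness(c) for c in pop])]
print("best operating-cost proxy:", round(float(fitness(best)), 1))
```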
Procedia PDF Downloads 420
1917 Impact of Different Rearing Diets on the Performance of Adult Mealworms Tenebrio molitor
Authors: Caroline Provost, Francois Dumont
Abstract:
Production of insects for human and animal consumption is an increasingly important activity in Canada. Protein production through insect rearing is more efficient and less harmful to the environment than traditional livestock, poultry and fish farming. Insects are rich in essential amino acids, essential fatty acids and trace elements. Thus, insect-based products could be used as a food supplement for livestock and domestic animals and may even find their way into the diets of high-performing athletes or fine dining. Nevertheless, several parameters remain to be determined to ensure efficient and profitable production that meets the potential of these sectors. This project proposes to improve the production processes, rearing diets and processing methods for three species with valuable gastronomic and nutritional potential: the common mealworm (Tenebrio molitor), the small mealworm (Alphitobius diaperinus), and the giant mealworm (Zophobas morio). The general objective of the project is to acquire specific knowledge for the mass rearing of insects dedicated to animal and human consumption in order to respond to current market opportunities and meet the growing demand for these products. Mass rearings of the three mealworm species were established to provide the individuals needed for the experiments. Mealworms eat flour from different cereals (e.g. wheat, barley, buckwheat). These cereals vary in their composition (protein, carbohydrates, fiber, vitamins, antioxidants, etc.) but also in their purchase cost. Seven different diets were compared to optimize the yield of the rearing. Diets were composed of cereal flours (e.g. wheat, barley) offered either mixed or alone. Female fecundity, larval mortality and growth curves were observed. Some flour diets had positive effects on female fecundity and larval performance, while each mealworm species was found to have specific diet requirements. Trade-offs between mealworm performance and costs need to be considered. Experiments on the effect of flour composition on several parameters related to performance and to nutritional and gastronomic value led to the identification of a more appropriate diet for each mealworm species. Keywords: mass rearing, mealworm, human consumption, diet
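Comparing female fecundity across diets is a classic one-way ANOVA setting. The sketch below is purely illustrative: the diet groups and egg counts are simulated, and the test choice is an assumption, not a description of the study's statistical analysis.

```python
# Illustrative one-way ANOVA comparing fecundity across simulated diet groups.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(5)
diets = {
    "wheat": rng.normal(160, 20, 30),             # eggs per female (simulated)
    "barley": rng.normal(150, 20, 30),
    "buckwheat": rng.normal(135, 20, 30),
    "wheat+barley mix": rng.normal(170, 20, 30),
}
F, p = f_oneway(*diets.values())
print(f"F = {F:.2f}, p = {p:.4f}")
for name, eggs in diets.items():
    print(f"{name:>18}: mean fecundity {eggs.mean():.0f}")
```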
Procedia PDF Downloads 147
1916 Logistical Optimization of Nuclear Waste Flows during Decommissioning
Authors: G. Dottavio, M. F. Andrade, F. Renard, V. Cheutet, A.-L. Ladier, S. Vercraene, P. Hoang, S. Briet, R. Dachicourt, Y. Baizet
Abstract:
A large amount of technological equipment and many highly skilled workers have to be mobilized over long periods of time during nuclear decommissioning processes. The related operations generate complex waste flows and high inventory levels, associated with information flows of heterogeneous types. With more than 10 decommissioning operations ongoing in France and about 50 expected by 2025, a major challenge must be addressed today. The management of decommissioning and dismantling of nuclear installations represents an important part of the nuclear-based energy lifecycle, since it has an environmental impact as well as an important influence on the cost of electricity and therefore on the price for end-users. Bringing new technologies and new solutions into decommissioning methodologies is thus mandatory to improve the quality, cost and schedule efficiency of these operations. The purpose of our project is to improve decommissioning management efficiency by developing a decision-support framework dedicated to planning nuclear facility decommissioning operations and to optimizing waste evacuation through a logistics approach. The target is to create an easy-to-handle tool capable of i) predicting waste flows and proposing the best decommissioning logistics scenario and ii) managing information during all steps of the process and tracking progress: planning, resources, lead times, authorizations, saturation zones, waste volumes, etc. In this article we present our results from the simulation of nuclear waste flows during the decommissioning process, including discrete-event simulation supported by the FLEXSIM 3-D software. This approach was successfully tested, and our work confirms its ability to improve this type of industrial process by identifying the critical points of the chain and the improvement actions needed to optimize it. This type of simulation, executed on the basis of an initial design before process operations start, allows 'what-if' process evaluation and helps ensure the quality of the process in an uncertain context. Simulating nuclear waste flows before evacuation from the site will help reduce the cost and duration of the decommissioning process by optimizing the planning and use of resources, transitional storage and expensive radioactive waste containers. Additional benefits are expected for the governance of waste evacuation, since the tool will enable shared responsibility for the waste flows. Keywords: nuclear decommissioning, logistical optimization, decision-support framework, waste management
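As a hedged, toy counterpart to the FLEXSIM 3-D model described above, the sketch below uses the SimPy library to simulate waste packages arriving from dismantling, waiting in a buffer, and competing for a limited fleet of transport containers; a growing buffer wait is one way saturation zones show up. All arrival rates, service times and capacities are illustrative assumptions.

```python
# Toy discrete-event model of waste evacuation logistics (illustrative only).
import random
import simpy

RANDOM_SEED, SIM_DAYS, N_CONTAINERS = 42, 365, 2
random.seed(RANDOM_SEED)
waiting_times = []

def waste_package(env, containers):
    arrival = env.now
    with containers.request() as req:            # wait in the buffer for a free container
        yield req
        waiting_times.append(env.now - arrival)
        yield env.timeout(random.uniform(2, 5))  # days to load, ship and return the container

def dismantling(env, containers):
    while True:
        yield env.timeout(random.expovariate(1 / 1.5))   # a package roughly every 1.5 days
        env.process(waste_package(env, containers))

env = simpy.Environment()
containers = simpy.Resource(env, capacity=N_CONTAINERS)
env.process(dismantling(env, containers))
env.run(until=SIM_DAYS)

print(f"packages handled: {len(waiting_times)}")
print(f"mean buffer waiting time: {sum(waiting_times) / len(waiting_times):.1f} days")
```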
Procedia PDF Downloads 323