Search results for: open chromatin regions
1230 Development of Family Quality of Life Scale for a Family Which Has a Person with Disability: Results of a Delphi Study
Authors: Thirakorn Maneerat, Darunee Jongudomkarn, Jiraporn Khiewyoo
Abstract:
Family quality of life of families who have persons with disabilities is a core concern in government services and community health promotion to deal with the multidimensionality of today’s health and societal issues. The number of families who have persons with disabilities in Thailand is gradually increasing. However, facilitation and evaluation of such family quality of life are limited by the lack of feasible tools. As a consequence, services provided for the families are not optimally facilitated and evaluated. This paper is part of a larger project aimed at developing a scale for measuring the family quality of life of families who have persons with developmental disabilities in Thailand, presenting the results of a three-round Delphi method involving 11 experts. The study was conducted from December 2013 to May 2014. The first round consisted of an open-ended questionnaire and content analysis of the answers. The second round comprised a structured 5-point Likert scale questionnaire based on the first-round analysis, which required the experts to identify the most relevant aspects of the studied tool. Their levels of agreement were analyzed statistically using the median, interquartile range, and quartile deviation. The criteria for item acceptance were a median greater than 3.50, an interquartile range less than 1.50, and a quartile deviation of 0.65 or less. Finally, the proposed questionnaire was structured and validated by the experts in the third round. Across all three rounds, the experts achieved 100% agreement on the five factors regarding the quality of life of a family that has a person with disability. These five factors comprised 38 items: 1) 10 items on family interactions; 2) 9 items on child rearing; 3) 7 items on physical and material resources; 4) 5 items on social-emotional status; and 5) 7 items on disability-related services and welfare.
The next step of the study is to examine construct validity using factor analysis methods.
Keywords: tool development, family quality of life scale, person with disability, Delphi study
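The item-acceptance rule stated in this abstract (median greater than 3.50, interquartile range less than 1.50, quartile deviation of 0.65 or less) can be sketched in a few lines. This is an illustrative reading of the criteria, not the authors' code, and the sample ratings are invented:

```python
import statistics

def delphi_item_accepted(ratings):
    """Apply the acceptance criteria described in the abstract to one item's
    5-point Likert ratings: median > 3.50, interquartile range < 1.50,
    and quartile deviation (semi-interquartile range) <= 0.65."""
    q1, median, q3 = statistics.quantiles(ratings, n=4)  # quartile cut points
    iqr = q3 - q1
    qd = iqr / 2  # quartile deviation is half the interquartile range
    return median > 3.50 and iqr < 1.50 and qd <= 0.65

# Example: hypothetical ratings from 11 experts for one candidate item
print(delphi_item_accepted([5, 4, 5, 4, 4, 5, 4, 5, 4, 4, 5]))  # True
```

Since the quartile deviation is simply half the interquartile range, the IQR and QD criteria are closely related; an item passes only when experts cluster tightly around a high rating.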
Procedia PDF Downloads 353
1229 Evaluation of NoSQL in the Energy Marketplace with GraphQL Optimization
Authors: Michael Howard
Abstract:
The growing popularity of electric vehicles in the United States requires an ever-expanding infrastructure of commercial DC fast charging stations. The U.S. Department of Energy estimates 33,355 publicly available DC fast charging stations as of September 2023. In 2017, 115,370 gasoline stations were operating in the United States, making them far more ubiquitous than DC fast chargers. Range anxiety is an important impediment to the adoption of electric vehicles and is even more relevant in underserved regions in the country. The peer-to-peer energy marketplace helps fill the demand by allowing private home and small business owners to rent their 240 Volt, level-2 charging facilities. The existing, publicly accessible outlets are wrapped with a Cloud-connected microcontroller managing security and charging sessions. These microcontrollers act as Edge devices communicating with a Cloud message broker, while both buyer and seller users interact with the framework via a web-based user interface. The database storage used by the marketplace framework is a key component in both the cost of development and the performance that contributes to the user experience. A traditional storage solution is the SQL database. The architecture and query language have been in existence since the 1970s and are well understood and documented. The Structured Query Language supported by the query engine provides fine granularity with user query conditions. However, difficulty in scaling across multiple nodes and the cost of its server-based compute have resulted in a trend over the last 20 years towards NoSQL, serverless approaches. In this study, we evaluate the NoSQL vs. SQL solutions through a comparison of Google Cloud Firestore and Cloud SQL MySQL offerings. The comparison pits Google's serverless, document-model, non-relational NoSQL offering against its server-based, table-model, relational SQL service. The evaluation is based on query latency, flexibility/scalability, and cost criteria.
Through benchmarking and analysis of the architecture, we determine whether Firestore can support the energy marketplace storage needs and whether the introduction of a GraphQL middleware layer can overcome its deficiencies.
Keywords: non-relational, relational, MySQL, mitigate, Firestore, SQL, NoSQL, serverless, database, GraphQL
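The query-latency criterion can be illustrated with a minimal benchmarking harness. The two "backends" below are stand-in Python callables (a keyed lookup versus a full scan), invented here to show the measurement approach only; they are not Firestore or Cloud SQL clients:

```python
import statistics
import time

def benchmark(query_fn, runs=50):
    """Time a storage query function and report its median latency in ms.
    `query_fn` stands in for either a document fetch or a SQL SELECT."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        query_fn()
        samples.append((time.perf_counter() - start) * 1000)
    return statistics.median(samples)

# Stand-in data: 10,000 hypothetical charging sessions
sessions = {i: {"station": i % 7, "kwh": 3.2} for i in range(10_000)}
doc_lookup = lambda: sessions[4242]                              # keyed, document-style access
table_scan = lambda: [s for s in sessions.values() if s["station"] == 3]  # scan-style access

print(benchmark(doc_lookup), benchmark(table_scan))
```

In the actual evaluation, the callables would wrap real Firestore document reads and MySQL SELECTs, and the same harness could also exercise the GraphQL middleware layer to measure its overhead.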
Procedia PDF Downloads 62
1228 New Environmentally Friendly Material for the Purification of the Fresh Water from Oil Pollution
Authors: M. A. Ashour
Abstract:
As is well known, Egypt has one of the oldest sugarcane industries, dating back to the year 710 AD. Cane plantations are the main agricultural product in five governorates in Upper Egypt (El-Menia, Sohag, Qena, Luxor, and Aswan), producing not less than 16 million tons a year. Eight factories (Abou-korkas, Gena, Nagaa-Hamadi, Deshna, Kous, Armant, Edfuo, and Komombo) located in these Upper Egypt governorates generate a huge amount of waste during the manufacturing stage. The so-called bagasse, the fibrous, cellulosic material remaining after the crushing of the sugarcane and the juice extraction, represents about 30% of this waste. The amount of bagasse generated yearly through the manufacturing stage of the above-mentioned eight factories is approximately 2.8 million tons, and disposing of such a huge amount safely presents a serious environmental problem. Storing the material in the open in the hot climate of Upper Egypt may cause its self-ignition, due to its residual sugar content, as air temperatures reach 50 degrees centigrade in summer. At the same time, preparing places to store such an amount safely is very expensive relative to the material's low value. The best way of getting rid of bagasse is therefore to convert it into a value-added, environmentally friendly material, especially since its utilization is so far limited. Since oil pollution became a serious concern, the issue of environmental cleaning has arisen. With its structure containing fiber and a high content of carbon, sugarcane bagasse can act as an adsorbent for oil contamination in water. The present study is an attempt to introduce a new material for the purification of water systems that scores two goals at once: the first is getting rid of that harmful waste safely; the second is converting it into a commercially valuable material for cleaning and purifying water from oil spills and petroleum pollution.
The introduced new material proved to have very good performance and higher efficiency than other similar materials available in the local market, in both closed and open systems. The introduced modified material can absorb 10 times its weight of oil, while absorbing no water.
Keywords: environment, water resources, agricultural wastes, oil pollution control, sugarcane
Procedia PDF Downloads 189
1227 Investigation of FoxM1 Gene Expression in Breast Cancer and Its Relationship with miR-216b-5p Expression Level
Authors: Ramin Mehdiabadi
Abstract:
Background: Breast cancer remains the most prevalent cancer diagnosis and the leading cause of cancer death among women globally, representing 11.7% of new cases and 6.9% of deaths. While the incidence and mortality of major cancers are declining in developed regions like the United States and Western Europe, underdeveloped and developing countries exhibit an increasing trend, attributed to lifestyle factors such as smoking, physical inactivity, and high-calorie diets. Objective: This study explores the intricate relationship between the mammalian transcription factor forkhead box M1 (FoxM1) and the microRNA miR-216b-5p in various subtypes of breast cancer, aiming to deepen the understanding of their roles in tumorigenesis, metastasis, and drug resistance. Methods: Breast cancer subtypes were categorized based on key biomarkers: estrogen receptors, progesterone receptors, and human epidermal growth factor receptor 2. These include luminal A, luminal B, HER2 enriched, triple-negative, and normal-like subtypes. We focused on analyzing the expression levels of FoxM1 and miR-216b-5p, given the known role of FoxM1 in cell proliferation and its implications in cancer pathologies such as lung, gastric, and breast cancers. Concurrently, miR-216b-5p's function as a tumor suppressor was evaluated to ascertain its regulatory effects on FoxM1. Results: Preliminary data indicate a nuanced interplay between FoxM1 and miR-216b-5p, suggesting a potential inverse relationship that varies across breast cancer subtypes. This relationship underscores the dual role of these biomarkers in modulating cancer progression and response to treatments. Conclusion: The findings advocate for the potential of miR-216b-5p to serve as a prognostic biomarker and a therapeutic target, particularly in subtypes where FoxM1 is prominently expressed.
Understanding these molecular interactions provides crucial insights into personalized treatment strategies and could lead to more effective therapeutic interventions in breast cancer management. Implications: The study highlights the importance of molecular profiling in breast cancer treatment and emphasizes the need for targeted therapeutic approaches in managing diverse cancer subtypes, particularly in varying global contexts where lifestyle factors significantly impact cancer dynamics.
Keywords: breast cancer, gene expression, FoxM1, microRNA
Procedia PDF Downloads 55
1226 Experimental Investigation of Nano-Enhanced-PCM-Based Heat Sinks for Passive Thermal Management of Small Satellites
Authors: Billy Moore, Izaiah Smith, Dominic Mckinney, Andrew Cisco, Mehdi Kabir
Abstract:
Phase-change materials (PCMs) are considered one of the most promising substances to be engaged passively in thermal management and storage systems for spacecraft, where it is critical to diminish the overall mass of the onboard thermal storage system while minimizing temperature fluctuations upon drastic changes in the environmental temperature within the orbit stage. This makes the development of effective thermal management systems more challenging, since there is no atmosphere in outer space to take advantage of natural and forced convective heat transfer. A PCM can store or release a tremendous amount of thermal energy within a small volume in the form of latent heat of fusion during the phase-change processes of melting and solidification, during which temperature remains almost constant. However, existing PCMs exhibit very low thermal conductivity, leading to an undesirable increase in total thermal resistance and, consequently, a slow thermal response time. This often turns into a system bottleneck from the thermal performance perspective. To address the above-mentioned drawback, the present study aims to design and develop various heat sinks featuring nano-structured graphitic foams (i.e., carbon foam), expanded graphite (EG), and open-cell copper foam (OCCF) infiltrated with a conventional paraffin wax PCM with a melting temperature of around 35 °C. This study focuses on the use of passive thermal management techniques to develop efficient heat sinks that maintain the electronic circuits' and battery module's temperature within the thermal safety limit for small spacecraft and satellites, such as the Pumpkin and OPTIMUS battery modules designed for CubeSats with a cross-sectional area of approximately 4˝×4˝. Thermal response times for various heat sinks are assessed in a vacuum chamber to simulate space conditions.
Keywords: heat sink, porous foams, phase-change material (PCM), spacecraft thermal management
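The latent-heat advantage the abstract relies on can be shown with a back-of-envelope energy balance. The property values below are typical paraffin assumptions (latent heat ≈ 200 kJ/kg, specific heat ≈ 2.0 kJ/(kg·K)), not measurements from this study:

```python
# Illustrative latent-heat storage calculation with assumed paraffin properties.
def pcm_energy_kj(mass_kg, t_start, t_end, t_melt=35.0, latent=200.0, cp=2.0):
    """Thermal energy (kJ) absorbed while heating a PCM from t_start to t_end,
    adding the latent heat of fusion if the melting point is crossed."""
    sensible = mass_kg * cp * (t_end - t_start)
    latent_part = latent * mass_kg if t_start < t_melt <= t_end else 0.0
    return sensible + latent_part

with_melt = pcm_energy_kj(0.5, 20.0, 40.0)  # crosses 35 C: 0.5*2*20 + 0.5*200 = 120 kJ
no_melt = pcm_energy_kj(0.5, 20.0, 34.0)    # sensible heating only: 0.5*2*14 = 14 kJ
print(with_melt, no_melt)
```

Heating 0.5 kg of wax through its melting point stores 120 kJ versus 14 kJ for the same wax kept just below melting, which is why a small PCM volume can buffer large orbital temperature swings at a nearly constant temperature.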
Procedia PDF Downloads 17
1225 Streamlining .NET Data Access: Leveraging JSON for Data Operations in .NET
Authors: Tyler T. Procko, Steve Collins
Abstract:
New features in .NET (6 and above) permit streamlined access to information residing in JSON-capable relational databases, such as SQL Server (2016 and above). Traditional methods of data access now comparatively involve unnecessary steps which compromise system performance. This work posits that the established ORM (Object Relational Mapping) based methods of data access in applications and APIs result in common issues, e.g., object-relational impedance mismatch. Recent developments in C# and .NET Core combined with a framework of modern SQL Server coding conventions have allowed better technical solutions to the problem. As an amelioration, this work details the language features and coding conventions which enable this streamlined approach, resulting in an open-source .NET library implementation called Codeless Data Access (CODA). Canonical approaches rely on ad-hoc mapping code to perform type conversions between the client and back-end database; with CODA, no mapping code is needed, as JSON is freely mapped to SQL and vice versa. CODA streamlines API data access by improving on three aspects of immediate concern to web developers, database engineers and cybersecurity professionals: Simplicity, Speed and Security. Simplicity is engendered by cutting out the “middleman” steps, effectively making API data access a whitebox, whereas traditional methods are blackbox. Speed is improved because of the fewer translational steps taken, and security is improved as attack surfaces are minimized. An empirical evaluation of the speed of the CODA approach in comparison to ORM approaches is provided and demonstrates that the CODA approach is significantly faster. CODA presents substantial benefits for API developer workflows by simplifying data access, resulting in better speed and security and allowing developers to focus on productive development rather than being mired in data access code.
Future considerations include a generalization of the CODA method and extension beyond the .NET ecosystem to other programming languages.
Keywords: API data access, database, JSON, .NET core, SQL server
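The "no mapping code" idea can be sketched outside .NET as well: the database emits JSON directly, and the API layer passes it through untouched. In this sketch SQLite's json_object() stands in for SQL Server's FOR JSON, and the schema is invented for illustration; this is not the CODA library itself:

```python
import json
import sqlite3

# The database itself emits JSON; the API layer forwards it with no ORM
# entities and no per-field type conversions (the "whitebox" path above).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE station (id INTEGER, name TEXT, kw REAL)")
conn.executemany("INSERT INTO station VALUES (?, ?, ?)",
                 [(1, "Home-A", 7.2), (2, "Shop-B", 9.6)])

# Each row comes back as a ready-to-serve JSON object
rows = conn.execute(
    "SELECT json_object('id', id, 'name', name, 'kw', kw) FROM station"
).fetchall()
payload = "[" + ",".join(r[0] for r in rows) + "]"
print(json.loads(payload)[0]["name"])  # Home-A
```

Because the payload is already JSON when it leaves the query engine, there is no mapping layer between the query and the HTTP response, which is the simplicity and speed argument the abstract makes.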
Procedia PDF Downloads 66
1224 Storage Assignment Strategies to Reduce Manual Picking Errors with an Emphasis on an Ageing Workforce
Authors: Heiko Diefenbach, Christoph H. Glock
Abstract:
Order picking, i.e., the order-based retrieval of items in a warehouse, is an important time- and cost-intensive process for many logistic systems. Despite the ongoing trend of automation, most order picking systems are still manual picker-to-parts systems, where human pickers walk through the warehouse to collect ordered items. Human work in warehouses is not free from errors, and order pickers may at times pick the wrong item or the incorrect number of items. Errors can cause additional costs and significant correction efforts. Moreover, age might increase a person’s likelihood to make mistakes. Hence, the negative impact of picking errors might increase for the aging workforce currently witnessed in many regions globally. A significant amount of research has focused on making order picking systems more efficient. Among other factors, storage assignment, i.e., the assignment of items to storage locations (e.g., shelves) within the warehouse, has been subject to optimization. Usually, the objective is to assign items to storage locations such that order picking times are minimized. Surprisingly, there is a lack of research concerned with picking errors and respective prevention approaches. This paper hypothesizes that the storage assignment of items can affect the probability of picking errors. For example, storing similar-looking items apart from one another might reduce confusion. Moreover, storing items that are hard to count or require a lot of counting at easy-to-access and easy-to-comprehend shelf heights might reduce the probability of picking the wrong number of items. Based on this hypothesis, the paper discusses how to incorporate error-prevention measures into mathematical models for storage assignment optimization. Various approaches with respective benefits and shortcomings are presented and mathematically modeled. To investigate the newly developed models further, they are compared to conventional storage assignment strategies in a computational study.
The study specifically investigates how the importance of error prevention increases as pickers become more prone to errors, for example due to age. The results suggest that considering error-prevention measures in storage assignment can reduce error probabilities with only minor decreases in picking efficiency. The results might be especially relevant for an aging workforce.
Keywords: aging workforce, error prevention, order picking, storage assignment
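One way to read the proposed modeling idea is as an assignment objective that adds an error-risk penalty to the usual picking-time term. The tiny instance below, with invented pick frequencies, error-proneness scores, and slot qualities, is a sketch of that idea rather than the authors' actual formulation:

```python
from itertools import permutations

# Hypothetical items: (pick frequency, error proneness in [0, 1])
items = {"A": (10, 0.8), "B": (6, 0.2), "C": (3, 0.9)}
# Hypothetical slots: (access time, ergonomic quality; 1.0 = eye level)
slots = {"s1": (1.0, 1.0), "s2": (2.0, 0.6), "s3": (3.0, 0.3)}

def cost(assign, w_err=5.0):
    """Total picking time plus a penalty for error-prone items in awkward slots."""
    total = 0.0
    for item, slot in assign.items():
        picks, prone = items[item]
        t, quality = slots[slot]
        total += picks * t + w_err * picks * prone * (1 - quality)
    return total

# Tiny instance, so brute force over all assignments is fine
best = min((dict(zip(items, p)) for p in permutations(slots)), key=cost)
print(best)  # {'A': 's1', 'B': 's2', 'C': 's3'}
```

With the penalty weight w_err set to zero the model reduces to pure time minimization; raising it shifts error-prone items toward ergonomic, easy-to-comprehend slots, mirroring the efficiency-versus-error trade-off examined in the computational study.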
Procedia PDF Downloads 204
1223 Proposals of Exposure Limits for Infrasound From Wind Turbines
Authors: M. Pawlaczyk-Łuszczyńska, T. Wszołek, A. Dudarewicz, P. Małecki, M. Kłaczyński, A. Bortkiewicz
Abstract:
Human tolerance to infrasound is defined by the hearing threshold. Infrasound that cannot be heard (or felt) is not annoying and is not thought to have any other adverse or health effects. Recent research has largely confirmed earlier findings. ISO 7196:1995 recommends the use of the G-weighted characteristic for the assessment of infrasound. There is a strong correlation between G-weighted SPL and annoyance perception. The aim of this study was to propose exposure limits for infrasound from wind turbines. However, only a few countries have set limits for infrasound. These limits are usually no higher than 85-92 dBG, and none of them are specific to wind turbines. Over the years, a number of studies have been carried out to determine hearing thresholds below 20 Hz. It has been recognized that 10% of young people would be able to perceive 10 Hz at around 90 dB, and it has also been found that the difference in median hearing thresholds between young adults aged around 20 years and older adults aged over 60 years is around 10 dB, irrespective of frequency. This shows that older people (up to about 60 years of age) retain good hearing in the low frequency range, while their sensitivity to higher frequencies is often significantly reduced. In terms of exposure limits for infrasound, the average hearing threshold corresponds to a tone with a G-weighted SPL of about 96 dBG. In contrast, infrasound at Lp,G levels below 85-90 dBG is usually inaudible. The individual hearing threshold can, therefore, be 10-15 dB lower than the average threshold, so the recommended limits for environmental infrasound could be 75 dBG or 80 dBG. It is worth noting that the G86 curve has been taken as the threshold of auditory perception of infrasound reached by 90-95% of the population, so the G75 and G80 curves can be taken as criterion curves for wind turbine infrasound.
Finally, two assessment methods and corresponding exposure limit values have been proposed for wind turbine infrasound, i.e., Method I, based on G-weighted sound pressure level measurements, and Method II, based on frequency analysis in 1/3-octave bands in the frequency range 4-20 Hz. Separate limit values have been set for outdoor living areas in the open countryside (Area A) and for noise-sensitive areas (Area B). In the case of Method I, infrasound limit values of 80 dBG (for areas A) and 75 dBG (for areas B) have been proposed, while in the case of Method II, criterion curves G80 and G75 have been chosen (for areas A and B, respectively).
Keywords: infrasound, exposure limit, hearing thresholds, wind turbines
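Method I (a single G-weighted sound pressure level) amounts to an energetic sum of 1/3-octave band levels plus frequency weights. In the sketch below, the weighting values are rough placeholders for the shape of the G-curve, not the normative ISO 7196 values, and the measured band levels are invented:

```python
import math

# Placeholder G-weighting corrections (dB) at 1/3-octave centers, 4-20 Hz.
# These are assumed illustrative values; real assessments must use ISO 7196.
g_weight = {4: -20.0, 5: -16.0, 6.3: -12.0, 8: -6.0, 10: 0.0,
            12.5: 4.0, 16: 7.5, 20: 9.0}

def g_weighted_level(band_spl):
    """Energetically sum G-weighted band SPLs into an overall level in dBG."""
    total = sum(10 ** ((spl + g_weight[f]) / 10) for f, spl in band_spl.items())
    return 10 * math.log10(total)

# Hypothetical measured band levels (dB) near a turbine
measured = {4: 70.0, 5: 72.0, 6.3: 74.0, 8: 75.0, 10: 74.0,
            12.5: 72.0, 16: 70.0, 20: 68.0}
level = g_weighted_level(measured)
print(f"{level:.1f} dBG, 75 dBG limit exceeded: {level > 75.0}")
```

A spectrum of this shape would exceed the proposed 75 dBG limit for noise-sensitive areas; Method II would instead compare each 1/3-octave band level against the G75 or G80 criterion curve.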
Procedia PDF Downloads 83
1222 The Investment Decision-Making Principles in Regional Tourism
Authors: Evgeni Baratashvili, Giorgi Sulashvili, Malkhaz Sulashvili, Bela Khotenashvili, Irma Makharashvili
Abstract:
The most important investment decision-making principle for a regional travel firm's management and its partners is the formulation of the aims of investment programs. Investments can be targeted at reducing the firm's production costs and at purchasing good transport equipment. In an attractive region, an investment program can be targeted at increasing the services provided in order to develop the firm's activities; that is the case where sales have already been established in the market. Investment can also be directed at establishing affiliate firms and branches, constructing new hotels, creating food and trade enterprises, developing entertainment enterprises, etc. Economic development is of great importance to regional development. International experience shows that inclusive economic growth largely depends not only on national but also on regional development planning and on the implementation of strong and competitive regions. Regional development is considered a key factor in achieving national success. Establishing modern institutes as separate entities in pilot centers would promote public-private partnership models based on international best practice. Regional policy directions and strategies, adopted for successful implementation, are of major importance for the specific action plans for inclusive development to be implemented in the near future, supported by effective monitoring and evaluation tools and measurable indicators. All of the above-mentioned investments are characterized by different levels, which relate to the following question: how successful is the tourism marketing service, and is it able to determine the market's proper reaction to the particular firm's actions? In the sphere of the regional tourism industry, a spectrum of models can be developed for the possible variants of the investment decision.
Each of the models can be modified and specified according to the situation and the characteristics of the particular problem to be solved. Besides, the choice of the proper model is affected by the system regulating economic processes; it is also influenced by the quality of liberalization and by the level of state participation.
Keywords: net income of travel firm, economic growth, investment profitability, regional development, tourist product, tourism development
Procedia PDF Downloads 260
1221 Development of Bioplastic Disposable Food Packaging from Starch and Cellulose
Authors: Lidya Hailu, Ramesh Duraisamy, Masood Akhtar Khan, Belete Yilma
Abstract:
Disposable food packaging comprises single-use plastic items designed to be used only once. In this context, this study aimed to prepare and evaluate a bioplastic food packaging material from avocado seed starch and sugarcane bagasse cellulose and to characterize the avocado seed starch. The physicomechanical, structural, and thermal properties and the biodegradability of the raw materials and the prepared bioplastic were examined using a universal tensile testing machine, FTIR, UV-Vis spectroscopy, TGA, XRD, and SEM. Results showed that an increasing amount of glycerol (3-5 mL) increased the water absorption, density, water vapor permeability, and elongation at break of the prepared bioplastic, while decreasing its % transmittance, thermal degradation temperature, and tensile strength. Likewise, the addition of cellulose fiber (0-15 %) changed the % transmittance (91.34±0.12-63.03±0.05 %), density (0.93±0.04-1.27±0.02 g/cm³), thermal degradation temperature (310.01-321.61 °C), and tensile strength (2.91±6.18-4.21±6.713 MPa) of the prepared bioplastic. On the other hand, it decreased the water absorption (14.4±0.25-9.40±0.007 %), water vapor permeability (9.306×10⁻¹²±0.3-3.57×10⁻¹²±0.15 g·s⁻¹·m⁻¹·Pa⁻¹), and elongation at break (34.46±3.37-27.63±5.67 %). All the prepared bioplastic films degraded rapidly in soil within the first 6 days, decomposed within 12 days with a diminutive leftover, and degraded completely within 15 days under an open soil atmosphere. The results showed that starch-derived bioplastic reinforced with 15 % cellulose fiber and plasticized with 3 mL of glycerol gave better results than other combinations of glycerol and bagasse cellulose with avocado seed starch. Thus, a biodegradable disposable food packaging cup was successfully produced at the lab scale using the studied approach.
Biodegradable disposable food packaging materials have been successfully produced by employing avocado seed starch and sugarcane bagasse cellulose. Future studies should address nanoscale production, since this study was done at the micro level.
Keywords: avocado seed, food packaging, glycerol, sugarcane bagasse
Procedia PDF Downloads 338
1220 Improving Rural Access to Specialist Emergency Mental Health Care: Using a Time and Motion Study in the Evaluation of a Telepsychiatry Program
Authors: Emily Saurman, David Lyle
Abstract:
In Australia, a well serviced rural town might have a psychiatrist visit once a month, with more frequent visits from a psychiatric nurse, but many towns have no resident access to mental health specialists. Access to specialist care would not only reduce patient distress and improve outcomes, but also facilitate the effective use of limited resources. The Mental Health Emergency Care-Rural Access Program (MHEC-RAP) was developed to improve access to specialist emergency mental health care in rural and remote communities using telehealth technologies. However, there has been no benchmark against which to gauge program efficiency or capacity, or to determine whether the program's activity is justifiably sufficient. The evaluation of MHEC-RAP used multiple methods and applied a modified theory of access to assess the program and its aim of improved access to emergency mental health care. This was the first evaluation of a telepsychiatry service to include a time and motion study design examining program time expenditure, efficiency, and capacity. The time and motion study analysis was combined with an observational study of the program structure and function to assess the balance between program responsiveness and efficiency. Previous program studies have demonstrated that MHEC-RAP has improved access and is used and effective. The findings from the time and motion study suggest that MHEC-RAP has the capacity to manage increased activity within the current model structure without loss of responsiveness or efficiency in the provision of care. Enhancing program responsiveness and efficiency will also support a claim of the program’s value for money. MHEC-RAP is a practical telehealth solution for improving access to specialist emergency mental health care.
The findings from this evaluation have already attracted the attention of other regions in Australia interested in implementing emergency telepsychiatry programs and are now informing the progressive establishment of mental health resource centres in rural New South Wales. Like MHEC-RAP, these centres will provide rapid, safe, and contextually relevant assessments and advice to support local health professionals to manage mental health emergencies in the smaller rural emergency departments. Sharing the application of this methodology and research activity may help to improve access to and future evaluations of telehealth and telepsychiatry services for others around the globe.
Keywords: access, emergency, mental health, rural, time and motion
Procedia PDF Downloads 234
1219 Enhancing Teaching of Engineering Mathematics
Authors: Tajinder Pal Singh
Abstract:
Teaching mathematics to engineering students is an open-ended problem in education. The main goal of mathematics learning for engineering students is the ability to apply a wide range of mathematical techniques and skills in their engineering classes and later in their professional work. Most undergraduate engineering students and faculty feel that no attempt is made to demonstrate the applicability of the various topics of mathematics that are taught, which makes mathematics unappealing to some engineering faculty and their students. The lack of understanding of concepts in engineering mathematics may hinder the understanding of other concepts or even subjects. For most undergraduate engineering students, mathematics is one of the most difficult courses in their field of study. Many engineering students never understood mathematics, or never liked it, because it was too abstract for them and they could never relate to it. Only the right balance of application-based and concept-based teaching can fulfill the objectives of teaching mathematics to engineering students, and it will surely improve and enhance their problem-solving and creative-thinking skills. In this paper, some practical (informal) ways of making mathematics teaching application-based for engineering students are discussed. An attempt is made to understand the present state of mathematics teaching in engineering colleges. The weaknesses and strengths of the current teaching approach are elaborated, some of the causes of the unpopularity of mathematics are analyzed, and a few pragmatic suggestions are made. Faculty in mathematics courses should spend more time discussing applications as well as conceptual underpinnings rather than focusing solely on strategies and techniques to solve problems. They should also introduce more ‘word’ problems, as such problems are commonly encountered in engineering courses.
Overspecialization in engineering education should not occur at the expense of (or by diluting) mathematics and the basic sciences. The role of engineering education is to provide fundamental (basic) knowledge and to teach students a simple methodology of self-learning and self-development. All these issues would be better addressed if mathematics and engineering faculty joined hands to plan and design the learning experiences of the students who take their classes. When faculty stop competing against each other and start competing against the situation, they will perform better. Without creating any administrative hassles, these suggestions can be used by any young, inexperienced faculty member of mathematics to inspire engineering students to learn engineering mathematics effectively.
Keywords: application based learning, conceptual learning, engineering mathematics, word problem
Procedia PDF Downloads 232
1218 Robust Numerical Method for Singularly Perturbed Semilinear Boundary Value Problem with Nonlocal Boundary Condition
Authors: Habtamu Garoma Debela, Gemechis File Duressa
Abstract:
In this work, our primary interest is to provide ε-uniformly convergent numerical techniques for solving singularly perturbed semilinear boundary value problems with a non-local boundary condition. These singular perturbation problems are described by differential equations in which the highest-order derivative is multiplied by an arbitrarily small parameter ε (say) known as the singular perturbation parameter. This leads to the existence of boundary layers, which are basically narrow regions in the neighborhood of the boundary of the domain, where the gradient of the solution becomes steep as the perturbation parameter tends to zero. Due to the appearance of the layer phenomena, it is a challenging task to provide ε-uniform numerical methods. The term 'ε-uniform' refers to those numerical methods in which the approximate solution converges to the corresponding exact solution (measured in the supremum norm) uniformly with respect to the perturbation parameter ε. Thus, the purpose of this work is to develop, analyze, and improve ε-uniform numerical methods for solving singularly perturbed problems. These methods are based on the nonstandard fitted finite difference method. The basic idea behind the fitted operator finite difference method is to replace the denominator functions of the classical derivatives with positive functions derived in such a way that they capture some notable properties of the governing differential equation. A uniformly convergent numerical method is constructed via the nonstandard fitted operator numerical method and numerical integration methods to solve the problem. The non-local boundary condition is treated using numerical integration techniques. Additionally, the Richardson extrapolation technique, which improves the first-order accuracy of the standard scheme to second-order convergence, is applied for singularly perturbed convection-diffusion problems using the proposed numerical method.
Maximum absolute errors and rates of convergence for different values of the perturbation parameter and mesh sizes are tabulated for the numerical example considered. The method is shown to be ε-uniformly convergent. Finally, extensive numerical experiments are conducted which support all of our theoretical findings. A concise conclusion is provided at the end of this work.
Keywords: nonlocal boundary condition, nonstandard fitted operator, semilinear problem, singular perturbation, uniformly convergent
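The fitted-operator idea described above can be illustrated for the simplest constant-coefficient linear case, -εu'' + au' = f on (0, 1) with homogeneous boundary values. The sketch below is written under those simplifying assumptions (the paper treats semilinear problems with a non-local boundary condition); the mesh size, ε, and test problem are arbitrary illustrative choices, not the authors' exact formulation:

```python
import numpy as np

def fitted_fdm(eps, N, a=1.0, f=1.0):
    """Nonstandard fitted finite difference scheme for
    -eps*u'' + a*u' = f on (0, 1), u(0) = u(1) = 0, constant a > 0 and f.
    The classical denominator h**2 is replaced by the fitted denominator
    phi2 = (h*eps/a)*(exp(a*h/eps) - 1), with upwinding for convection."""
    h = 1.0 / N
    phi2 = (h * eps / a) * (np.exp(a * h / eps) - 1.0)
    n = N - 1                               # number of interior nodes
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 2.0 * eps / phi2 + a / h
        if i > 0:
            A[i, i - 1] = -eps / phi2 - a / h
        if i < n - 1:
            A[i, i + 1] = -eps / phi2
    u = np.zeros(N + 1)
    u[1:N] = np.linalg.solve(A, np.full(n, f))
    return u

# For a = f = 1 the exact solution is
#   u(x) = x - (exp((x-1)/eps) - exp(-1/eps)) / (1 - exp(-1/eps)).
eps, N = 1e-2, 64
x = np.linspace(0.0, 1.0, N + 1)
u = fitted_fdm(eps, N)
exact = x - (np.exp((x - 1) / eps) - np.exp(-1 / eps)) / (1 - np.exp(-1 / eps))
err = np.max(np.abs(u - exact))
```

For this constant-coefficient test problem the fitted scheme happens to reproduce the exact solution at the mesh nodes up to round-off; for variable coefficients or semilinear terms only first-order ε-uniform convergence (second-order after Richardson extrapolation) would be expected.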
Procedia PDF Downloads 143
1217 Evaluation of Potential of Crop Residues for Energy Generation in Nepal
Authors: Narayan Prasad Adhikari
Abstract:
In Nepal, crop residues have often been considered one of the potential sources of energy to cope with the prevailing energy crisis. However, the lack of systematic studies about the production and the various competing uses of crop residues is the main obstacle to evaluating their net potential for energy production. Under this background, this study aims to assess the net annual availability of crop residues for energy production in three districts representing the country's three major regions of lowland, hill, and mountain. The five major cereal crops of paddy, wheat, maize, millet, and barley are considered in the analysis. The analysis is based upon two modes of household survey. The first mode was administered to a total of 240 households to obtain key information about crop harvesting and livestock management throughout a year. In the second mode, the main crops along with their respective residues were quantified on fixed plots of land for 45 households; the areas of these plots ranged from 50 to 100 m². The measurements were done on an air-dry basis. The quantities devoted to the competing uses of the respective crop residues were measured on the basis of respondents' feedback. There are four major competing uses of crop residues at the household level: building material, burning, selling, and livestock fodder. The results reveal that the net annual available crop residues per household are 4663 kg, 2513 kg, and 1731 kg in lowland, hill, and mountain respectively. Of the total production of crop residues, the shares of dedicated fodder crop residues (except maize stalk and maize cob) are 94 %, 62 %, and 89 % in lowland, hill, and mountain respectively, and of these the corresponding shares actually used as fodder are 87 %, 91 %, and 82 %.
The annual per capita energy equivalent of the net available crop residues in lowland, hill, and mountain is 2.49 GJ, 3.42 GJ, and 0.44 GJ, representing 30 %, 33 %, and 3 % of total annual energy consumption respectively, whereas the corresponding current shares of crop residues are only 23 %, 8 %, and 1 %. Hence, even the utmost exploitation of available crop residues can hardly contribute one third of energy consumption at the household level in lowland and hill, while the contribution in mountain is negligible. Moreover, further analysis has been done to evaluate the district-wise supply-demand balance of dedicated fodder crop residues on the basis of livestock presence. A high deficit of fodder crop residues is observed in hill and mountain, where generating energy from these residues is therefore impractical. By contrast, the annual production of such residues in lowland meets the annual fodder demand with a modest surplus even if the entire fodder were derived from the residues throughout the year, so there is further potential to utilize the surplus residues for energy generation.
Keywords: crop residues, hill, lowland, mountain
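The per capita energy figures quoted above rest on converting residue mass to energy content; a minimal sketch of that arithmetic, with an assumed lower heating value of 14 MJ/kg and a 5-person household (illustrative values, not figures from the study):

```python
def per_capita_energy_gj(residue_kg, lhv_mj_per_kg, household_size):
    """Annual household residue mass (kg) -> per capita energy (GJ/person),
    via an assumed lower heating value in MJ/kg."""
    return residue_kg * lhv_mj_per_kg / 1000.0 / household_size

# Illustrative only: 1000 kg of residues, 14 MJ/kg, 5-person household.
e = per_capita_energy_gj(1000, 14.0, 5)
```

The study's own figures would additionally net out the fodder, building, and other competing uses before applying such a conversion.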
Procedia PDF Downloads 472
1216 Investigating the Sloshing Characteristics of a Liquid by Using an Image Processing Method
Authors: Ufuk Tosun, Reza Aghazadeh, Mehmet Bülent Özer
Abstract:
This study puts forward a method to analyze the sloshing characteristics of liquid in a tuned sloshing absorber system by using image processing tools. Tuned sloshing vibration absorbers have recently attracted researchers' attention as seismic load dampers in construction due to their practical and logistical convenience. The absorber is a liquid which sloshes and applies a force in opposite phase to the motion of the structure. Experimental characterization of the sloshing behavior can be utilized as a means of verifying the results of numerical analysis. It can also be used to verify the accuracy of assumptions related to the motion of the liquid. There are extensive theoretical and experimental studies in the literature related to the dynamical and structural behavior of tuned sloshing dampers. In most of these works there are efforts to estimate the sloshing behavior of the liquid, such as the free surface motion and the total force applied by the liquid to the wall of the container. For these purposes the use of sensors such as load cells and ultrasonic sensors is prevalent in experimental works. Load cells are only capable of measuring the force and require conducting tests both with and without liquid to obtain the pure sloshing force. Ultrasonic level sensors give point-wise measurements and hence are not suitable for measuring the whole free-surface motion. Furthermore, in the case of liquid splashing they may give incorrect data. In this work a method for evaluating the sloshing wave height by using camera records and image processing techniques is presented. In this method the motion of the liquid and its container, made of a transparent material, is recorded by a high-speed camera which is aligned to the free surface of the liquid. The video captured by the camera is processed frame by frame using the MATLAB Image Processing Toolbox. The process starts with cropping the desired region.
By recognizing the regions containing liquid and eliminating noise and liquid splashing, the final picture depicting the free surface of the liquid is achieved. This picture is then used to obtain the height of the liquid along the length of the container. The process is verified against ultrasonic sensors that measure the fluid height at the free surface.
Keywords: fluid structure interaction, image processing, sloshing, tuned liquid damper
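The column-wise surface-extraction step can be sketched with plain array operations; the study performs this frame by frame in MATLAB, so the thresholding rule and the synthetic frame below are illustrative assumptions only:

```python
import numpy as np

def surface_profile(frame, threshold):
    """Estimate the liquid free-surface height in each pixel column of a
    grayscale frame: pixels brighter than `threshold` count as liquid, and
    height is measured in pixels up from the container bottom (last row)."""
    liquid = frame > threshold
    rows, cols = frame.shape
    heights = np.zeros(cols)
    for c in range(cols):
        idx = np.flatnonzero(liquid[:, c])
        if idx.size:
            heights[c] = rows - idx[0]     # topmost liquid pixel per column
    return heights

# Synthetic 6x4 'frame': liquid fills the bottom two rows of every column.
frame = np.zeros((6, 4))
frame[4:, :] = 1.0
profile = surface_profile(frame, threshold=0.5)
```

A real pipeline would first crop the region of interest and filter out splash pixels, as the abstract describes, before applying such a per-column rule.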
Procedia PDF Downloads 344
1215 Analyses of Defects in Flexible Silicon Photovoltaic Modules via Thermal Imaging and Electroluminescence
Authors: S. Maleczek, K. Drabczyk, L. Bogdan, A. Iwan
Abstract:
It is known that industrial applications of solar panels constructed from silicon solar cells require high-efficiency performance. One of the main problems in solar panels is the presence of various mechanical and structural defects, which cause a decrease in the generated power. Various techniques are used to analyse defects in solar cells; among them, thermal imaging is a fast and simple method for locating defects. The main goal of this work was to analyze defects in constructed flexible silicon photovoltaic modules via thermal imaging and the electroluminescence method. This work was realized for the GEKON project (No. GEKON2/O4/268473/23/2016) sponsored by The National Centre for Research and Development and The National Fund for Environmental Protection and Water Management. Thermal behavior was observed using a thermographic camera (VIGOcam v50, VIGO System S.A, Poland) with a conventional DC source. Electroluminescence was observed using a Steinbeis Center Photovoltaics (Stuttgart, Germany) setup equipped with a camera containing a Si-CCD, 16 Mpix Kodak KAF-16803-type detector. The camera has a typical spectral response in the range 350 - 1100 nm with a maximum QE of 60 % at 550 nm. In our work, commercial silicon solar cells with a size of 156 × 156 mm were cut into nine parts (called single solar cells) and used to create photovoltaic modules with a size of 160 × 70 cm (containing about 80 single solar cells). Flexible silicon photovoltaic modules on polyamide or polyester fabric were constructed and investigated, taking into consideration anomalies on the surface of the modules. Thermal imaging provided evidence of visible voltage-activated conduction. In electroluminescence images, two regions are noticeable: darker regions, where the solar cell is inactive, and brighter regions corresponding to correctly working photovoltaic cells.
The electroluminescence method is non-destructive and gives higher-resolution images, thereby allowing a more precise evaluation of microcracks in the solar cell after the lamination process. Our study showed good correlations between defects observed by thermal imaging and electroluminescence. Finally, we can conclude that the thermographic examination of large-scale photovoltaic modules allows fast, simple, and inexpensive localization of defects in single solar cells and modules. Moreover, the thermographic camera was also useful for detecting the electrical interconnections between single solar cells.
Keywords: electroluminescence, flexible devices, silicon solar cells, thermal imaging
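A common way to automate the localization of such hot spots in a thermogram is a simple statistical threshold; the mean-plus-k-standard-deviations rule below is a generic sketch, not the procedure the authors used:

```python
import numpy as np

def hotspots(temp, k=2.0):
    """Flag thermogram pixels hotter than the image mean by more than k
    standard deviations -- a generic proxy for the localized heating that
    shunt-type defects produce under DC bias."""
    return temp > temp.mean() + k * temp.std()

# Synthetic 10x10 thermogram: uniform 25 C module with one 35 C defect.
temp = np.full((10, 10), 25.0)
temp[3, 7] = 35.0
mask = hotspots(temp)
```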
Procedia PDF Downloads 316
1214 A Geo DataBase to Investigate the Maximum Distance Error in Quality of Life Studies
Authors: Paolino Di Felice
Abstract:
The background and significance of this study come from papers that have already appeared in the literature, which measured the impact of public services (e.g., hospitals, schools, ...) on the satisfaction of citizens' needs (one of the dimensions of QOL studies) by calculating the distance between the place where the citizens live and the locations of the services on the territory. Those studies assume that a citizen's dwelling coincides with the centroid of the polygon that expresses the boundary of the administrative district, within the city, they belong to. Such an assumption "introduces a maximum measurement error equal to the greatest distance between the centroid and the border of the administrative district." The case study this abstract reports on investigates the implications descending from the adoption of such an approach at geographical scales greater than the urban one, namely at the three levels of nesting of the Italian administrative units: the (20) regions, the (110) provinces, and the 8,094 municipalities. To carry out this study, two decisions had to be made: a) how to store the huge amount of (spatial and descriptive) input data and b) how to process it. The latter aspect involves: b.1) the design of algorithms to investigate the geometry of the boundary of the Italian administrative units; b.2) their coding in a programming language; b.3) their execution and, eventually, b.4) archiving the results in a permanent support. The IT solution we implemented is centered around a (PostgreSQL/PostGIS) Geo DataBase structured in terms of three tables that fit the hierarchy of nesting of the Italian administrative units: municipality(id, name, provinceId, istatCode, regionId, geometry); province(id, name, regionId, geometry); region(id, name, geometry). The adoption of the DBMS technology allows us to implement the steps "a)" and "b)" easily.
In particular, step "b)" is simplified dramatically by calling spatial operators and built-in spatial User Defined Functions within SQL queries against the Geo DB. The major findings coming from our experiments can be summarized as follows. The approximation that, on average, descends from assimilating the residence of the citizens with the centroid of the administrative unit of reference is of a few kilometers (4.9 km) at the municipality level, while it becomes conspicuous at the other two levels (28.9 km and 36.1 km, respectively). Therefore, studies such as those mentioned above can be extended up to the municipal level without affecting the correctness of the interpretation of the results, but not further. The IT framework implemented to carry out the experiments can be replicated for studies referring to the territory of other countries all over the world.
Keywords: quality of life, distance measurement error, Italian administrative units, spatial database
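The maximum measurement error discussed above can be estimated directly from a district's boundary coordinates; in PostGIS one might compute it with ST_MaxDistance(geometry, ST_Centroid(geometry)), and the pure-Python sketch below does the same for a toy polygon, using the vertex-averaged centroid as an approximation of the true area centroid:

```python
import math

def max_centroid_error(vertices):
    """Upper bound on the error made by collapsing an administrative unit
    onto a single point: the largest distance from the vertex-averaged
    centroid to any boundary vertex (coordinates in km)."""
    cx = sum(x for x, _ in vertices) / len(vertices)
    cy = sum(y for _, y in vertices) / len(vertices)
    return max(math.hypot(x - cx, y - cy) for x, y in vertices)

# A 10 km x 10 km square district: the worst case is a resident in a
# corner, sqrt(50) ~ 7.07 km from the centroid.
err_km = max_centroid_error([(0, 0), (10, 0), (10, 10), (0, 10)])
```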
Procedia PDF Downloads 371
1213 FloodNet: Classification for Post-Flood Scene with a High-Resolution Aerial Imagery Dataset
Authors: Molakala Mourya Vardhan Reddy, Kandimala Revanth, Koduru Sumanth, Beena B. M.
Abstract:
Emergency response and recovery operations are severely hampered by natural catastrophes, especially floods. Understanding post-flood scenarios is essential to disaster management because it facilitates quick evaluation and decision-making. To this end, we introduce FloodNet, a brand-new high-resolution aerial picture collection created especially for comprehending post-flood scenes. A varied collection of excellent aerial photos taken during and after flood occurrences makes up FloodNet, which offers comprehensive representations of flooded landscapes, damaged infrastructure, and changed topographies. The dataset provides a thorough resource for training and assessing computer vision models designed to handle the complexity of post-flood scenarios, including a variety of environmental conditions and geographic regions. Pixel-level semantic segmentation masks are used to label the pictures in FloodNet, allowing for a more detailed examination of flood-related characteristics, including debris, water bodies, and damaged structures. Furthermore, temporal and positional metadata improve the dataset's usefulness for longitudinal research and spatiotemporal analysis. For activities like flood extent mapping, damage assessment, and infrastructure recovery projection, we provide baseline standards and evaluation metrics to promote research and development in the field of post-flood scene comprehension. By integrating FloodNet into machine learning pipelines, it will be easier to create reliable algorithms that will help policymakers, urban planners, and first responders make choices both before and after floods. The goal of the FloodNet dataset is to support advances in computer vision, remote sensing, and disaster response technologies by providing a useful resource for researchers.
FloodNet helps to create creative solutions for boosting communities' resilience in the face of natural catastrophes by tackling the particular problems presented by post-flood situations.
Keywords: image classification, segmentation, computer vision, natural disaster, unmanned aerial vehicle (UAV), machine learning
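Models trained on pixel-level masks such as FloodNet's are typically scored with per-class intersection-over-union; a minimal sketch of that metric (class IDs and the toy masks are invented for illustration):

```python
import numpy as np

def class_iou(pred, truth, class_id):
    """Intersection-over-union for one semantic class of a pixel-level
    segmentation mask, the usual metric for datasets labeled like FloodNet."""
    p, t = pred == class_id, truth == class_id
    union = np.logical_or(p, t).sum()
    return np.logical_and(p, t).sum() / union if union else 1.0

# Toy 4x4 masks with classes 0 (background) and 1 (water).
truth = np.array([[1, 1, 0, 0]] * 4)
pred = np.array([[1, 0, 0, 0]] * 4)
water_iou = class_iou(pred, truth, class_id=1)
```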
Procedia PDF Downloads 79
1212 Beyond the Tragedy of Absence: Vizenor's Comedy of Native Presence
Authors: Mahdi Sepehrmanesh
Abstract:
This essay explores Gerald Vizenor's innovative concepts of the tragedy of absence and the comedy of presence as frameworks for understanding and challenging dominant narratives about Native American identity and history. Vizenor's work critiques the notion of irrevocable cultural loss and rigid definitions of Indigenous identity based on blood quantum and stereotypical practices. Through subversive humor, trickster figures, and storytelling, Vizenor asserts the active presence and continuance of Native peoples, advocating for a dynamic, self-determined understanding of Native identity. The essay examines Vizenor's use of postmodern techniques, including his engagement with simulation and hyperreality, to disrupt colonial discourses and create new spaces for Indigenous expression. It explores the concept of "crossblood" identities as a means of resisting essentialist notions of Native authenticity and embracing the complexities of contemporary Indigenous experiences. Vizenor's ideas of survivance and transmotion are analyzed as strategies for cultural resilience and adaptation in the face of ongoing colonial pressures. The interplay between absence and presence in Vizenor's work is discussed, particularly through the lens of shadow survivance and the power of storytelling. The essay also delves into Vizenor's critique of terminal creed and his promotion of natural reason as an alternative epistemology to Western rationalism. While acknowledging the significant influence of Vizenor's work on Native American literature and theory, the essay also addresses critiques of his approach, including concerns about the accessibility of his writing and its political effectiveness. Despite these debates, the essay argues that Vizenor's concepts offer a powerful vision of Indigenous futurity that is rooted in tradition yet open to change, inspiring hope and agency in the face of oppression. 
By examining Vizenor's multifaceted approach to Native American identity and presence, this essay contributes to ongoing discussions about Indigenous representation, cultural continuity, and resistance to colonial narratives in literature and beyond.
Keywords: Gerald Vizenor, Native American literature, survivance, trickster discourse, identity
Procedia PDF Downloads 35
1211 The H3R Antagonist E159 Alleviates Neuroinflammation and Autistic-Like Phenotypes in BTBR T+ tf/J Mouse Model of Autism
Authors: Shilu Deepa Thomas, P. Jayaprakash, Dorota Łazewska, Katarzyna Kieć-Kononowicz, B. Sadek
Abstract:
A large body of evidence suggests the involvement of cognitive impairment, increased levels of inflammation, and oxidative stress in the pathogenesis of autism spectrum disorder (ASD). ASD commonly coexists with psychiatric conditions like anxiety and cognitive challenges, and individuals with ASD exhibit significant levels of inflammation and immune system dysregulation. Previous studies have identified elevated levels of pro-inflammatory markers such as IL-1β, IL-6, IL-2, and TNF-α, particularly in young children with ASD. The current therapeutic options for ASD show limited effectiveness, signifying the importance of exploring efficient drugs to address the core symptoms. The role of histamine H3 receptors (H3Rs) in memory and the prospective role of H3R antagonists in the pharmacological control of neurodegenerative disorders, e.g., ASD, are well accepted. Hence, the effects of chronic systemic administration of the H3R antagonist E159 on autistic-like repetitive behaviors, social deficits, memory and anxiety parameters, as well as neuroinflammation in Black and Tan BRachyury (BTBR) mice, were evaluated using the Y maze, Barnes maze, self-grooming, open field, and three-chamber social tests. E159 (2.5, 5, and 10 mg/kg, i.p.) dose-dependently ameliorated repetitive and compulsive behaviors by reducing the increased time spent in self-grooming and improved the reduced spontaneous alternation in BTBR mice. Moreover, treatment with E159 attenuated disturbed anxiety levels and social deficits in tested male BTBR mice. Furthermore, E159 attenuated oxidative stress by significantly increasing GSH, CAT, and SOD and decreasing the elevated levels of MDA in the cerebellum as well as the hippocampus. In addition, E159 decreased the elevated levels of proinflammatory cytokines (tumor necrosis factor (TNF-α), interleukin-1β (IL-1β), and IL-6).
The observed results show that H3R antagonists like E159 may represent a promising novel pharmacological strategy for the future treatment of ASD.
Keywords: histamine H3 receptors, antagonist E159, autism, behaviors, mice
Procedia PDF Downloads 66
1210 Seismic Retrofit of Tall Building Structure with Viscous, Visco-Elastic, Visco-Plastic Damper
Authors: Nicolas Bae, Theodore L. Karavasilis
Abstract:
Increasingly, a large number of new and existing tall buildings are required to improve their resilient performance against strong winds and earthquakes to minimize direct as well as indirect damage to society. Interruption of the functions housed in tall building structures in metropolitan regions can be severely hazardous in socio-economic terms, which further increases the requirement for advanced seismic performance. To achieve these progressive requirements, the seismic reinforcement of some old, conventional buildings has become enormously costly. The methods of increasing buildings' resilience against wind or earthquake loads have also become more advanced. Up to now, vibration control devices, such as passive damper systems, are still regarded as an effective and easy-to-install option for improving the seismic resilience of buildings at an affordable price. The main purposes of this paper are to examine 1) the optimization of the shape of the visco-plastic brace damper (VPBD) system, which is one type of hybrid damper system, so that it can maximize its energy dissipation capacity in tall buildings against wind and earthquake; and 2) the verification of the seismic performance of the visco-plastic brace damper system in tall buildings, up to forty-storey high steel frame buildings, by comparing the results of Non-Linear Response History Analysis (NLRHA) with and without a damper system. The most significant contribution of this research is to introduce an optimized hybrid damper system that is adequate for high-rise buildings. The efficiency of this visco-plastic brace damper system and the advantages of its use in tall buildings can be verified since tall buildings tend to be affected by wind load in their normal state and also by earthquake load after yielding of the steel plates. The modeling of the prototype tall building will be conducted using the OpenSees software.
Three types of model were used to verify the performance of the damper (the MRF alone, the MRF with visco-elastic dampers, and the MRF with visco-plastic dampers). A set of 22 seismic records was used, and the scaling procedure followed the FEMA code. It is shown that the MRF with viscous and visco-elastic dampers is markedly more effective at reducing inelastic deformation, such as roof displacement, maximum story drift, and roof velocity, than the MRF alone.
Keywords: tall steel building, seismic retrofit, viscous, viscoelastic damper, performance based design, resilience based design
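The effect of adding a linear viscous damper can be illustrated on a single-degree-of-freedom oscillator integrated with the central-difference method; the parameter values and harmonic loading below are illustrative stand-ins for the NLRHA of the full MRF models, not the authors' analysis:

```python
import numpy as np

def peak_displacement(m, c, k, c_damper=0.0, F0=1.0, w=None, dt=1e-3, T=60.0):
    """Peak displacement of a single-DOF oscillator m*x'' + c_tot*x' + k*x
    = F0*sin(w*t), integrated by the central-difference method. A linear
    viscous damper contributes simply by adding c_damper to c."""
    w = np.sqrt(k / m) if w is None else w   # default: force at resonance
    ctot = c + c_damper
    a1 = m / dt**2 + ctot / (2 * dt)
    x_prev, x_cur, peak = 0.0, 0.0, 0.0
    for i in range(1, int(T / dt)):
        F = F0 * np.sin(w * i * dt)
        # m*(x+ - 2x + x-)/dt^2 + ctot*(x+ - x-)/(2dt) + k*x = F
        x_next = (F - (k - 2 * m / dt**2) * x_cur
                  - (m / dt**2 - ctot / (2 * dt)) * x_prev) / a1
        peak = max(peak, abs(x_next))
        x_prev, x_cur = x_cur, x_next
    return peak

bare = peak_displacement(m=1.0, c=0.05, k=10.0)               # frame alone
damped = peak_displacement(m=1.0, c=0.05, k=10.0, c_damper=0.5)
```

With the added damping coefficient the resonant peak drops by roughly an order of magnitude in this toy setting, mirroring the drift and velocity reductions reported for the damped MRF models.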
Procedia PDF Downloads 193
1209 The Impact of Artificial Intelligence on Legislations and Laws
Authors: Keroles Akram Saed Ghatas
Abstract:
The near future will bring significant changes in modern organizations and management due to the growing role of intangible assets and knowledge workers. The area of copyright, intellectual property, digital (intangible) assets, and media redistribution appears to be one of the greatest challenges facing business and society in general, and management sciences and organizations in particular. The proposed article examines the views and perceptions of fairness in digital media sharing among Harvard Law School LL.M. students, based on 50 qualitative interviews and 100 surveys. The researcher took an ethnographic approach and in 2016 joined the Harvard LL.M. students' Facebook group, which allows people to connect naturally and attend in-person and private events more easily. After listening to numerous students, the researcher conducted a quantitative survey among 100 respondents to assess their perceptions of fairness in digital file sharing in various contexts (based on media price, its availability, regional licenses, copyright holder status, etc.). Based on the survey results, the researcher conducted long, open-ended, and loosely structured ethnographic interviews (50 interviews) to further deepen the understanding of the results. The most important finding of the study is that Harvard lawyers generally support digital piracy in certain contexts, despite having the best possible legal and professional knowledge. Interestingly, they are also more accepting of working for the government than for the private sector.
The results of this study provide a better understanding of how "fairness" is perceived by the younger generation of lawyers and pave the way for a more rational application of licensing laws.
Keywords: piracy, digital sharing, perception of fairness, legal profession
Procedia PDF Downloads 65
1208 Spatial Analysis as a Tool to Assess Risk Management in Peru
Authors: Josué Alfredo Tomas Machaca Fajardo, Jhon Elvis Chahua Janampa, Pedro Rau Lavado
Abstract:
A flood vulnerability index was developed for the Piura River watershed in northern Peru using Principal Component Analysis (PCA) to assess flood risk. The official methodology to assess risk from natural hazards in Peru was introduced in 1980 and proved effective for aiding complex decision-making. This method relies in part on decision-makers defining subjective correlations between variables to identify high-risk areas. While risk identification and ensuing response activities benefit from a qualitative understanding of influences, this method does not take advantage of the advent of national and international data collection efforts, which can supplement our understanding of risk. Furthermore, this method does not take advantage of broadly applied statistical methods such as PCA, which highlight central indicators of vulnerability. Nowadays, information processing is much faster and allows for more objective decision-making tools, such as PCA. The approach presented here develops a tool to improve the current flood risk assessment in the Peruvian basin. Hence, the spatial analysis of the census and other datasets provides a better understanding of the current land occupation and the basin-wide distribution of services and human populations, a necessary step toward ultimately reducing flood risk in Peru. PCA allows the simplification of a large number of variables into a few factors regarding the social, economic, physical, and environmental dimensions of vulnerability. There is a correlation between where people settle and the availability of water, which is mainly found in rivers. For this reason, a comprehensive vision of the population's location around the river basin is necessary to establish flood prevention policies. Grouping the territory into 5x5 km grid cells allows flood risk to be analyzed spatially rather than by political divisions of the territory.
The index was applied to the Peruvian region of Piura, where several flood events have occurred in recent years, it being one of the regions most affected during ENSO events in Peru. The analysis evidenced inequalities in access to basic services, such as water, electricity, internet, and sewage, between rural and urban areas.
Keywords: risk assessment, flood risk, indicators of vulnerability, principal component analysis
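The PCA step that collapses many vulnerability indicators into a few factors can be sketched as an eigen-decomposition of the indicators' correlation structure; the synthetic indicators below are invented for illustration:

```python
import numpy as np

def pca_index(X, n_components=1):
    """Collapse standardized vulnerability indicators (rows = grid cells,
    columns = indicators) onto their leading principal components; also
    return the fraction of variance each retained component explains."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize indicators
    vals, vecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    order = np.argsort(vals)[::-1][:n_components]  # largest eigenvalues first
    return Z @ vecs[:, order], vals[order] / vals.sum()

# Synthetic, correlated indicators standing in for access to water,
# sewage, and electricity across 100 grid cells.
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(100, 1)) for _ in range(3)])
scores, explained = pca_index(X)
```

Because the toy indicators are strongly correlated, the first component absorbs nearly all the variance, which is exactly the situation in which a single PCA-based index is a defensible summary.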
Procedia PDF Downloads 186
1207 On or Off-Line: Dilemmas in Using Online Teaching-Learning in In-Service Teacher Education
Authors: Orly Sela
Abstract:
The lecture discusses a Language Teaching program in a Teacher Education College in northern Israel. An on-line course was added to the program in order to keep on-campus attendance at a minimum, thus allowing the students to keep their full-time jobs in school. In addition, the use of educational technology to allow students to study anytime anywhere, in keeping with 21st-century innovative teaching-learning practices, was also an issue, as was the wish for this course to serve as a model which the students could then possibly use in their K-12 teaching. On the other hand, there were strong considerations against including an online course in the program. The students in the program were mostly Israeli-Arab married women with young children, living in a traditional society which places a strong emphasis on the place of the woman as a wife, mother, and home-maker. In addition, as teachers, they used much of their free time on school-related tasks. Having careers at the same time as studying was ground-breaking for these women, and using their time at home for studying rather than taking care of their families may have been simply too much to ask of them. At the end of the course, feedback was collected through an online questionnaire including both open and closed questions. The data collected show that the students believed in online teaching-learning in principle, but had trouble implementing it in practice. This evidence raised the question of whether or not such a course should be included in a graduate program for mature, professional students, particularly women with families living in a traditional society. This issue is relevant not to Israel alone, but also to academic institutions worldwide serving such populations. The lecture discusses this issue, sharing the researcher's conclusions with the audience. Based on the evidence offered, it is the researcher's conclusion that online education should, indeed, be offered to such audiences.
However, the courses should be designed with the students' special needs in mind, with emphasis placed on initial planning and course organization based on acknowledgment of the teaching context; modeling of online teaching/learning suited to in-service teacher education; and special attention paid to social-constructivist aspects of learning.
Keywords: course design, in-service teacher-education, mature students, online teaching/learning
Procedia PDF Downloads 232
1206 A Survey and Analysis on Inflammatory Pain Detection and Standard Protocol Selection Using Medical Infrared Thermography from an Image Processing Viewpoint
Authors: Mrinal Kanti Bhowmik, Shawli Bardhan Jr., Debotosh Bhattacharjee
Abstract:
Human skin, being at a temperature above absolute zero, discharges infrared radiation related to the body temperature. Differences in the infrared radiation from the skin surface reflect abnormalities present in the human body. Considering these differences, detecting and forecasting the temperature variation of the skin surface is the main objective of using Medical Infrared Thermography (MIT) as a diagnostic tool for pain detection. MIT is a non-invasive imaging technique that records and monitors the temperature flow in the body by receiving the infrared radiation emitted from the skin and representing it through a thermogram. The intensity of the thermogram measures the inflammation at the skin surface related to pain in the human body. Analysis of thermograms provides automated detection of anomalies associated with suspicious pain regions through several image processing steps. The paper presents a rigorous study-based survey of the processing and analysis of thermograms, based on previous works published in the area of infrared thermal imaging for detecting inflammatory pain diseases like arthritis, spondylosis, shoulder impingement, etc. The study also explores the performance analysis of thermogram processing, accompanied by thermogram acquisition protocols, thermography camera specifications, and the types of pain detected by thermography, in a summarized tabular format. The tabular format provides a clear structural vision of the past works. The major contribution of the paper is the introduction of a new thermogram acquisition standard for inflammatory pain detection in the human body to enhance the performance rate. The FLIR T650sc infrared camera, with high sensitivity and resolution, is adopted to increase the accuracy of thermogram acquisition and analysis.
The survey of previous research highlights that intensity-distribution-based comparison of comparable and symmetric regions of interest, together with their statistical analysis, yields adequate results in identifying and detecting physiological disorders related to inflammatory diseases.
Keywords: acquisition protocol, inflammatory pain detection, medical infrared thermography (MIT), statistical analysis
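The symmetric-region comparison highlighted by the survey reduces to contrasting statistics of contralateral regions of interest; in the minimal sketch below, the ROI coordinates, temperatures, and any significance cutoff are illustrative assumptions, not values from the surveyed papers:

```python
import numpy as np

def roi_mean_difference(thermogram, roi_a, roi_b):
    """Mean-temperature difference between two regions of interest, each
    given as a (row_slice, col_slice) pair; contrasting contralateral
    (left/right) ROIs is the symmetric comparison described above."""
    return float(thermogram[roi_a].mean() - thermogram[roi_b].mean())

# Synthetic 8x8 thermogram at 30 C with a warmer right-side region,
# standing in for a localized inflammatory response.
t = np.full((8, 8), 30.0)
t[2:5, 5:8] += 1.2
delta = roi_mean_difference(t, (slice(2, 5), slice(0, 3)),
                               (slice(2, 5), slice(5, 8)))
```

A full pipeline would add the statistical test on the two ROI intensity distributions rather than comparing means alone.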
Procedia PDF Downloads 343
1205 Postpartum Depression Screening and Referrals for Lower-Income Women in North Carolina, USA
Authors: Maren J. Coffman, Victoria C. Scott, J. Claire Schuch, Ashley N. Kelley, Jeri L. Ryan
Abstract:
Postpartum Depression (PPD) is a leading cause of postpartum morbidity. PPD affects 7.1% of postpartum women, and 19.2% when minor depression is included. Lower-income women and ethnic minorities are more at risk for developing PPD and face multiple attitudinal and institutional barriers to receiving care. This study aims to identify PPD among low-income women and connect them to appropriate services in order to reduce the illness burden and enhance access to care. Screenings were conducted in two Women, Infants, and Children (WIC) clinics in the city of Charlotte, North Carolina, USA, from April 2017 to April 2018. WIC is a supplemental nutrition program that provides healthcare and nutrition to low-income pregnant women, breastfeeding women, and children under the age of 5. Additionally, a qualitative study was conducted to better understand the PPD continuum of care in order to identify opportunities for improvement. Mothers with infants were screened for depression risk using the PHQ-2. Mothers who scored ≥ 2 completed two additional standardized screening tools (the PHQ-7, to complete the PHQ-9, and the Edinburgh) to assess depressive symptomatology. If the screenings indicated they might be suffering from depression, women were referred for case management services. Open-ended questions were used to understand treatment barriers. Four weeks after the initial survey, a follow-up telephone call was made to see if women had received care. Seven focus groups with WIC staff and managers, referral agency staff, local behavioral health professionals, and students examining the screenings are being conducted in March and April 2018 to gather information related to current screening practices, referrals, follow-up, and treatment. Mothers (n = 231 as of February 2018) were screened in English (65%) or Spanish (35%).
There were significant differences in preliminary screening results based on survey language.
Keywords: health disparities, maternal health, mental health, postpartum depression
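The two-stage screening cascade described above can be sketched as follows. Only the PHQ-2 ≥ 2 trigger comes from the abstract; the PHQ-9 and Edinburgh cutoffs, and the function itself, are illustrative assumptions rather than values reported by the study.

```python
def screen_mother(phq2, phq9=None, epds=None, phq9_cutoff=10, epds_cutoff=10):
    """Return the next step in the (sketched) screening pathway.

    phq2: PHQ-2 score (0-6); per the study, scores >= 2 trigger full screening.
    phq9 / epds: full PHQ-9 (0-27) and Edinburgh (0-30) scores, collected
    only when the PHQ-2 screen is positive.
    The cutoffs are assumed defaults for illustration, not the study's.
    """
    if phq2 < 2:
        return "no further screening"
    # PHQ-2 positive: the remaining PHQ items and the Edinburgh scale
    # are administered to assess depressive symptomatology.
    if phq9 is None or epds is None:
        return "administer PHQ-9 and Edinburgh"
    if phq9 >= phq9_cutoff or epds >= epds_cutoff:
        return "refer to case management"
    return "monitor; repeat screening at follow-up"
```

A negative PHQ-2 ends the pathway immediately, which is what makes the two-stage design cheap enough to run in a busy WIC clinic.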
Procedia PDF Downloads 173
1204 Rendering Cognition Based Learning in Coherence with Development within the Context of PostgreSQL
Authors: Manuela Nayantara Jeyaraj, Senuri Sucharitharathna, Chathurika Senarath, Yasanthy Kanagaraj, Indraka Udayakumara
Abstract:
PostgreSQL is an Object Relational Database Management System (ORDBMS) that has been in existence for some time. Despite the superior features it packages for managing databases and data, the database community has not fully realized its importance and advantages. Hence, this research focuses on providing a better development environment for PostgreSQL in order to encourage its utilization and elucidate its importance. PostgreSQL is also known as the world's most elementary SQL-compliant open source ORDBMS. However, users have not yet adopted PostgreSQL, because its strengths remain hidden beneath the complexity of a persistently textual environment that is daunting for an introductory user. Simply stated, there is a dire need for an easy way to help users comprehend the procedures and standards by which databases are created, tables and the relationships among them are defined, and queries and their conditional flow are manipulated in PostgreSQL, so that the community adopts PostgreSQL at an augmented rate. Hence, this research initially identifies the dominant features provided by PostgreSQL over its competitors. Following the identified merits, an analysis of why the database community is hesitant to migrate to PostgreSQL's environment is carried out. These findings are modulated and tailored based on the scope and the constraints discovered. The research proposes a system that serves both as a design platform and as a learning tool, providing an interactive method of learning via a visual editor mode while incorporating a textual editor for well-versed users. The study is based on conjuring viable solutions that analyze a user's cognitive perception in comprehending human-computer interfaces and the behavioural processing of design elements.
By providing a visually draggable and manipulable environment for working with PostgreSQL databases and table queries, the system is expected to highlight the elementary features of PostgreSQL over existing systems, in order to convey its importance and simplicity to a hesitant user.
Keywords: cognition, database, PostgreSQL, text-editor, visual-editor
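The visual-to-textual idea can be illustrated with a minimal sketch: a table laid out by dragging fields in the visual editor could be held as a plain dict and rendered into the CREATE TABLE statement shown in the textual editor. The function, the `student` schema, and the column types here are all hypothetical, not part of the proposed system.

```python
def render_create_table(name, columns, primary_key=None):
    """Render a {column: type} dict into a CREATE TABLE statement."""
    parts = [f"{col} {ctype}" for col, ctype in columns.items()]
    if primary_key:
        parts.append(f"PRIMARY KEY ({primary_key})")
    body = ",\n  ".join(parts)
    return f"CREATE TABLE {name} (\n  {body}\n);"

# Hypothetical table designed in the visual editor:
sql = render_create_table(
    "student",
    {"id": "serial", "name": "text", "enrolled": "date"},
    primary_key="id",
)
print(sql)
```

Keeping the visual model as plain data and generating the SQL from it is what lets the two editor modes stay in sync: the well-versed user edits the text, the novice edits the diagram, and both views describe the same table.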
Procedia PDF Downloads 283
1203 Realistic Modeling of the Preclinical Small Animal Using Commercial Software
Authors: Su Chul Han, Seungwoo Park
Abstract:
With the increasing incidence of cancer, the technology and modalities of radiotherapy have advanced, and the importance of preclinical models is growing in cancer research. Furthermore, small animal dosimetry is an essential part of evaluating the relationship between the absorbed dose in a preclinical small animal and the biological effect in a preclinical study. In this study, we carried out realistic modeling of a preclinical small animal phantom that makes it possible to verify the irradiated dose using commercial software. The small animal phantom was modeled from the 4D digital mouse whole-body (Moby) phantom. To manipulate the Moby phantom in commercial software (Mimics, Materialise, Leuven, Belgium), we converted it to DICOM CT image files using Matlab; the two-dimensional CT images were then converted to a three-dimensional image, which can be segmented and cropped in the sagittal, coronal, and axial views. The CT images of the small animal were modeled as follows. Based on the profile line values, thresholding was carried out to make a mask connecting all regions within the same threshold range. Using this thresholding method, we segmented the images into three parts (bone, body tissue, and lung); to separate neighboring pixels between lung and body tissue, we used the region growing function of the Mimics software. We acquired a 3D object by 3D calculation on the segmented images. The generated 3D object was smoothed by a remeshing operation, with a smoothing factor of 0.4 and 5 iterations. The edge mode was selected to perform triangle reduction, with a tolerance of 0.1 mm, an edge angle of 15 degrees, and 5 iterations. The processed 3D object file was converted to an STL file for output on a 3D printer. We modified the 3D small animal file using 3-matic Research (Materialise, Leuven, Belgium) to make space for radiation dosimetry chips. We thereby acquired a 3D object of a realistic small animal phantom.
The width of the small animal phantom was 2.631 cm, its thickness 2.361 cm, and its length 10.817 cm. The Mimics software provided efficient 3D object generation and convenient conversion to STL files. The development of a small preclinical animal phantom should increase the reliability of absorbed-dose verification in small animals for preclinical studies.
Keywords: mimics, preclinical small animal, segmentation, 3D printer
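The thresholding step at the heart of the pipeline can be sketched on a tiny synthetic "CT slice". This is a minimal illustration, not the actual Mimics workflow: the intensity ranges below are invented stand-ins for the profile-line values, and the region-growing refinement that Mimics applies afterward is omitted.

```python
import numpy as np

def threshold_mask(image, lo, hi):
    """Mask of all voxels whose value falls within the [lo, hi] range,
    i.e. the union of all regions sharing that threshold range."""
    return (image >= lo) & (image <= hi)

# Synthetic 4x4 slice: air (-1000), lung (-500), soft tissue (40), bone (700).
slice_hu = np.array([
    [-1000, -1000, -1000, -1000],
    [-1000,  -500,  -500, -1000],
    [-1000,    40,    40, -1000],
    [-1000,   700,   700, -1000],
])

# Three-part segmentation with assumed (illustrative) intensity ranges:
lung_mask = threshold_mask(slice_hu, -900, -200)
body_mask = threshold_mask(slice_hu, -100,  200)
bone_mask = threshold_mask(slice_hu,  300, 2000)
```

Because lung and soft tissue can share intensities at their boundary in real CT data, pure thresholding leaves neighboring pixels ambiguous, which is why the study follows this step with Mimics' region growing before the 3D calculation.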
Procedia PDF Downloads 366
1202 Predictive Analysis of the Stock Price Market Trends with Deep Learning
Authors: Suraj Mehrotra
Abstract:
The stock market is a volatile, bustling marketplace that is a cornerstone of economics. It defines whether companies are successful or in a downward spiral. A thorough understanding of it is important: many companies have whole divisions dedicated to analyzing both their own stock and that of rival companies. Linking the world of finance with artificial intelligence (AI), especially in the stock market, is a relatively recent development. Predicting how stocks will do given all external factors and previous data has traditionally been a human task. With the help of AI, however, machine learning models can produce more complete predictions of financial trends. In the stock market specifically, predicting the open, closing, high, and low prices for the next day is very hard to do, and machine learning makes this task far easier. A model that builds upon itself and takes in external factors as weights can predict trends far into the future. Used effectively, this opens new doors in the business and finance world, allowing companies to make better and more complete decisions. This paper explores the various techniques used in the prediction of stock prices, from traditional statistical methods to deep learning and neural-network-based approaches, among other methods. It provides a detailed analysis of these techniques and also explores the challenges in predictive analysis. Comparing testing-set accuracy across four different models (linear regression, neural network, decision tree, and naïve Bayes) on the stocks of Apple, Google, Tesla, Amazon, United Healthcare, Exxon Mobil, J.P. Morgan Chase, and Johnson & Johnson, the naïve Bayes and linear regression models worked best. On the testing set, the naïve Bayes model had the highest accuracy along with the linear regression model, followed by the neural network model and then the decision tree model.
The training set showed similar results, except that the decision tree model achieved perfect accuracy on it. This indicates that the decision tree model likely overfit the training set, which explains its weaker performance on the testing set.
Keywords: machine learning, testing set, artificial intelligence, stock analysis
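Why perfect training accuracy is a red flag can be shown with a toy sketch on invented data (not the paper's dataset or models): a predictor that memorizes the training set, standing in for an unpruned decision tree, scores 100% on it yet generalizes no better than a trivial rule.

```python
import random

random.seed(0)
# Synthetic "next-day direction" data: x is yesterday's return; the label
# is 1 (up) when the noisy underlying trend is positive.
xs = [random.uniform(-1.0, 1.0) for _ in range(60)]
rows = [(x, 1 if x + random.gauss(0, 0.3) > 0 else 0) for x in xs]
train, test = rows[:40], rows[40:]
memory = dict(train)  # the "unpruned tree": one leaf per training point

def memorizer_predict(x):
    """Exact lookup on training points; nearest training point otherwise."""
    if x in memory:
        return memory[x]
    nearest = min(memory, key=lambda m: abs(m - x))
    return memory[nearest]

def linear_rule(x):
    """Trivial linear-style rule: predict 'up' when the return is positive."""
    return 1 if x > 0 else 0

def accuracy(predict, rows):
    return sum(predict(x) == y for x, y in rows) / len(rows)

train_mem = accuracy(memorizer_predict, train)  # 1.0 by construction
test_mem = accuracy(memorizer_predict, test)
test_lin = accuracy(linear_rule, test)
print(train_mem, test_mem, test_lin)
```

The memorizer's training accuracy is 1.0 by construction, while its test accuracy depends on how well nearest-neighbour lookup happens to generalize; the gap between the two is exactly the overfitting signature the abstract attributes to the decision tree.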
Procedia PDF Downloads 95
1201 Examining the Teaching and Learning Needs of Science and Mathematics Educators in South Africa
Authors: M. Shaheed Hartley
Abstract:
There has been increasing pressure on education researchers and practitioners at higher education institutions to focus on the development of South Africa's rural and peri-urban communities and on improving their quality of life. Many tertiary institutions are obliged to review their outreach interventions in schools. To ensure that the support provided to schools remains relevant, a systemic evaluation of science educators' needs is central to this process. These prioritised needs will serve as a guide not only for the outreach projects of tertiary institutions, but also for service providers in general, so that the process of addressing educators' needs becomes coordinated, organised, and delivered in a systemic manner. This paper describes one area of a broader needs assessment exercise that collected data on the needs of educators in a district of 45 secondary schools in the Western Cape Province of South Africa. The research focuses on the needs and challenges faced by science educators at these schools, as articulated by the relevant stakeholders. The objectives of this investigation are two-fold: (1) to create a database that captures the needs and challenges identified by science educators of the selected secondary schools; and (2) to develop a needs profile for each participating secondary school that will serve as a strategic asset to be shared with the various service providers as part of a community of practice whose core business is to support science educators and science education at large. The data were collected by means of a needs assessment questionnaire (NAQ), which was developed in both 'actual' and 'preferred' versions. An open-ended questionnaire that allowed teachers to express their views was also administered. The categories of the questionnaire were predetermined by participating researchers, educators, and education department officials. Group interviews were also held with the science teachers at each of the schools.
An analysis of the data revealed important trends in science educator needs and identified schools that can be clustered around priority needs, logistical factors, and educator profiles. The needs database also provides an opportunity for the community of practice to strategise and coordinate their interventions.
Keywords: needs assessment, science and mathematics education, evaluation, teaching and learning, South Africa
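One way the NAQ's paired 'actual' and 'preferred' ratings could be turned into a per-school needs profile is to rank categories by the gap between the two. This is a hypothetical sketch only: the category names, the ratings, and the gap-ranking rule are all invented for illustration, not taken from the study.

```python
def needs_profile(actual, preferred):
    """Rank need categories by the preferred-minus-actual rating gap,
    largest gap (highest priority need) first."""
    gaps = {cat: preferred[cat] - actual[cat] for cat in actual}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

# Invented ratings (1-5 scale) for one hypothetical school:
actual = {"laboratory equipment": 2, "content training": 3, "ICT access": 1}
preferred = {"laboratory equipment": 5, "content training": 4, "ICT access": 5}

profile = needs_profile(actual, preferred)
for category, gap in profile:
    print(category, gap)
```

Ranking by gap rather than by the raw 'preferred' score distinguishes needs that are already being met from those that are not, which is what makes such a profile actionable for service providers.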
Procedia PDF Downloads 184