Search results for: continuous monitoring tool
7575 Stories of Digital Technology and Online Safety: Storytelling as a Tool to Find out Young Children’s Views on Digital Technology and Online Safety
Authors: Lindsey Watson
Abstract:
This research is aimed at facilitating and listening to the voices of younger children, recognising their contributions to research about the things that matter to them. Digital technology increasingly impacts the lives of young children; therefore, this study aimed to increase children's agency by recognising and involving their perspectives, to help contribute to a wider understanding of younger children's perceptions of online safety. Using a phenomenological approach, the paper discusses how storytelling as a creative methodological approach enabled an agentic space for children to express their views, knowledge, and perceptions of their engagement with the digital world. Setting and parental informed consent were obtained, in addition to an adapted approach to child assent through the use of child-friendly language and emoji stickers, which was also recorded verbally. Findings demonstrate that younger children are thinking about many aspects of digital technology and how it impacts their lives, and that storytelling as a research method is a useful tool to facilitate conversations with young children. The paper thus seeks to recognise and evaluate how creative methodologies can provide insights into children's understanding of online safety and how this can influence practitioners and parents in supporting younger children in a digital world.
Keywords: early childhood, family, online safety, phenomenology, storytelling
Procedia PDF Downloads 130
7574 Serological IgG Testing to Diagnose Alimentary Induced Diseases and Monitoring Efficacy of an Individual Defined Diet in Dogs
Authors: Anne-Margré C. Vink
Abstract:
Background: Food-related allergies and intolerances occur frequently in dogs. Diagnosis and monitoring of elimination efficiency according to the 'Golden Standard' are time-consuming, expensive, and require an expert clinical setting. In order to facilitate rapid and robust quantitative testing of intolerance, and to determine the individual offending foods, a serological test is indicated. Method: As we had previously developed the Medisynx IgG Human Screening Test ELISA, and the dog's immune system is highly similar to the human one, we were able to develop the Medisynx IgG Dog Screening Test ELISA as well. In this study, 47 dogs suffering from Canine Atopic Dermatitis (CAD) and several secondary induced reactions were included in the serological Medisynx IgG Dog Screening Test ELISA (within < 0.02% SD). Results were expressed as titers relative to the standard OD readings to diagnose alimentary induced diseases and to monitor the efficacy of an individual elimination diet in dogs. Split-sample analysis was performed by independently sending two 3 ml serum samples under two unique codes. Results: The veterinarian monitored these dogs, checking each dog's results at least at 3, 7, 21, 49, and 70 days and after periods of 6 and 12 months on an individual negative diet, with a positive challenge (retrospectively) at 6 months. Data for each dog were recorded in a screening form; a complete recovery of all clinical manifestations was observed at or before 70 days (between 50 and 70 days) in the majority of dogs (44 out of 47 dogs = 93.6%). Conclusion: Challenge results showed a significant result of 100% in specificity as well as 100% positive predictive value. On the other hand, sensitivity was 95.7% and the negative predictive value was 95.7%.
In conclusion, an individual diet based on IgG ELISA in dogs provides a significant improvement of atopic dermatitis and pruritus, including all other non-specifically defined allergic skin reactions such as erythema, itching, and biting and gnawing at the toes, as well as of several secondary manifestations such as chronic diarrhoea, chronic constipation, otitis media, obesity, laziness or inactive behaviour, pain and muscular stiffness causing movement disorders, excessive lacrimation, hyperactive or nervous behaviour, inability to stay alone at home, anxiety, biting and aggressive behaviour, and disobedience. Furthermore, we conclude that a relatively more severe systemic candidiasis, as shown by a relatively higher titer (class 3 and 4 IgG reactions to Candida albicans), influences the duration of recovery from clinical manifestations in affected dogs. These findings are consistent with our preliminary human clinical studies.
Keywords: allergy, canine atopic dermatitis, CAD, food allergens, IgG-ELISA, food-incompatibility
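The predictive values quoted above follow directly from a 2x2 confusion matrix. A minimal sketch in Python (the raw counts below are hypothetical, chosen only to reproduce the reported rates; the abstract does not give the underlying table):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-test metrics from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts: 22 true positives, 1 false negative,
# 0 false positives, 22 true negatives reproduce 100% specificity
# and PPV with sensitivity and NPV just below 100%.
sens, spec, ppv, npv = diagnostic_metrics(tp=22, fp=0, fn=1, tn=22)
```

With no false positives, specificity and PPV are exactly 1.0, while one missed case drives sensitivity and NPV to 22/23, about 95.7%, matching the reported figures.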
Procedia PDF Downloads 324
7573 The Misuse of Free Cash and Earnings Management: An Analysis of the Extent to Which Board Tenure Mitigates Earnings Management
Authors: Michael McCann
Abstract:
Managerial theories propose that, in joint stock companies, executives may be tempted to waste excess free cash on unprofitable projects to keep control of resources. In order to conceal their projects' poor performance, they may seek to engage in earnings management. On the one hand, managers may manipulate earnings upwards in order to post 'good' performances and safeguard their position. On the other, since managers' pursuit of unrewarding investments is likely to lead to low long-term profitability, managers will use negative accruals to reduce the current year's earnings, smoothing earnings over time in order to conceal the negative effects. Agency models argue that boards of directors are delegated by shareholders to ensure that companies are governed properly; part of that responsibility is ensuring the reliability of financial information. Analyses of the impact of board characteristics, particularly board independence, on the misuse of free cash flow and earnings management find conflicting evidence. However, existing characterizations of board independence do not account for such directors gaining firm-specific knowledge over time, which influences their monitoring ability. Further, there is little analysis of the influence of the relative experience of independent directors and executives on decisions surrounding the use of free cash. This paper contributes to this literature on the heterogeneous characteristics of boards by investigating the influence of independent director tenure on earnings management, and of the relative tenures of independent directors and Chief Executives. A balanced panel dataset comprising 51 companies across 11 annual periods from 2005 to 2015 is used for the analysis. In each annual period, firms were classified as conducting earnings management if they had discretionary accruals in the bottom quartile (downwards) or top quartile (upwards) of the distributed values for the sample.
Logistic regressions were conducted to determine the marginal impact of independent board tenure and a number of control variables on the probability of conducting earnings management. The findings indicate that both absolute and relative measures of board independence and experience do not have a significant impact on the likelihood of earnings management. It is the level of free cash flow which is the major influence on the probability of earnings management: higher free cash flow increases the probability of earnings management significantly. The research also investigates whether board monitoring of earnings management is contingent on the level of free cash flow; however, the results suggest that board monitoring is not amplified when free cash flow is higher. This suggests that the extent of earnings management in companies is determined by a range of company-, industry- and situation-specific factors.
Keywords: corporate governance, boards of directors, agency theory, earnings management
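The quartile-based classification described above can be sketched as follows (the accrual values and the boundary convention of flagging values at or beyond the cut-offs are illustrative assumptions, not the study's data):

```python
import statistics

# Hypothetical discretionary-accrual values for a cross-section of firms in one year.
accruals = [-0.09, -0.04, -0.01, 0.00, 0.01, 0.02, 0.05, 0.10]

# Quartile cut-offs of the distributed values for the sample.
q1, q2, q3 = statistics.quantiles(accruals, n=4)

# Bottom quartile -> downwards earnings management, top quartile -> upwards.
flags = ['down' if a <= q1 else 'up' if a >= q3 else 'none' for a in accruals]
```

Each firm-year is then a binary outcome (managed vs. not managed) suitable as the dependent variable in the logistic regressions.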
Procedia PDF Downloads 237
7572 An Agile, Intelligent and Scalable Framework for Global Software Development
Authors: Raja Asad Zaheer, Aisha Tanveer, Hafza Mehreen Fatima
Abstract:
Global Software Development (GSD) is becoming a common norm in the software industry, despite the fact that global distribution of teams presents special issues for their effective communication and coordination. Trends are now changing, and project management for distributed teams is no longer in limbo. GSD can be effectively established using agile methods, and project managers can use different agile techniques/tools to solve the problems associated with distributed teams. Agile methodologies like Scrum and XP have been successfully used with distributed teams. We employed an exploratory research method to analyze recent studies on the challenges of GSD and their proposed solutions. In our study, we looked in depth at six commonly faced challenges: communication and coordination, temporal differences, cultural differences, knowledge sharing/group awareness, speed, and communication tools. We have established that none of these challenges can be neglected for distributed teams of any kind; they are interlinked and, as an aggregated whole, can cause the failure of projects. In this paper, we have focused on creating a scalable framework for detecting and overcoming these commonly faced challenges. In the proposed solution, our objective is to suggest agile techniques/tools relevant to a particular problem faced by organizations in the management of distributed teams. We focused mainly on Scrum and XP techniques/tools because they are widely accepted and used in the industry. Our solution identifies the problem and suggests an appropriate technique/tool to help solve it, based on a globally shared knowledgebase. We can establish a cause-and-effect relationship using a fishbone diagram based on the inputs provided for issues commonly faced by organizations; based on the identified cause, our framework suggests a suitable tool.
Hence, the proposed scalable, extensible, self-learning, intelligent framework will help implement and assess GSD and get the maximum out of it. A globally shared knowledgebase will help new organizations easily adopt best practices set forth by practicing organizations.
Keywords: agile project management, agile tools/techniques, distributed teams, global software development
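The suggestion step of such a framework might be sketched as a lookup against the shared knowledgebase. All entries, names, and the fallback behaviour below are hypothetical illustrations, not the authors' implementation:

```python
# Hypothetical entries in the globally shared knowledgebase: each commonly
# faced GSD challenge (identified cause) maps to candidate Scrum/XP techniques.
knowledge_base = {
    "communication and coordination": ["daily stand-up over video", "shared Scrum board"],
    "temporal differences": ["overlapping core hours", "asynchronous sprint reviews"],
    "cultural differences": ["team liaisons", "exchange visits"],
    "knowledge sharing": ["pair programming", "shared team wiki"],
}

def suggest_tool(challenge):
    """Return candidate techniques for an identified cause, if known."""
    return knowledge_base.get(challenge.lower(), ["escalate to expert review"])
```

A self-learning variant would update `knowledge_base` from outcomes reported back by practicing organizations.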
Procedia PDF Downloads 321
7571 The Hotel Logging Behavior and Factors of Tourists in Bankontee District, Samut Songkhram Province, Thailand
Authors: Aticha Kwaengsopha
Abstract:
The purpose of this research was to study the behaviour and related factors that tourists utilized in making decisions to choose their accommodation at a tourist destination, Bangkontee district, Samut Songkhram Province, Thailand. The independent variables included gender, age, income, occupation, and region, while the three important dependent variables included selection behaviour, factors related to the selection process, and satisfaction with the accommodation service. A total of 400 Thai and international tourists were interviewed at the tourist destination of Bangkontee. A questionnaire was used as the tool for collecting data. Descriptive statistics in this research included percentage, mean, and standard deviation. The findings revealed that the majority of respondents were single, female, and between 23-30 years old. Most of the international tourists were from Asia and planned to stay in Thailand for about 1-6 days. In addition, the majority of tourists preferred to travel in small groups of 3 persons. The majority of respondents used the internet and word of mouth as the main tools to search for information, and spent most of their budget on food & drink, accommodation, and travelling. Even though the majority of tourists were satisfied with the quality, price range, image, and facilities of the accommodation, they indicated that they were not likely to re-visit Thailand in the near future.
Keywords: behaviour, decision factors, tourists, media engineering
Procedia PDF Downloads 278
7570 Audit and Assurance Program for AI-Based Technologies
Authors: Beatrice Arthur
Abstract:
The rapid development of artificial intelligence (AI) has transformed various industries, enabling faster and more accurate decision-making processes. However, with these advancements come increased risks, including data privacy issues, systemic biases, and challenges related to transparency and accountability. As AI technologies become more integrated into business processes, there is a growing need for comprehensive auditing and assurance frameworks to manage these risks and ensure ethical use. This paper provides a literature review on AI auditing and assurance programs, highlighting the importance of adapting traditional audit methodologies to the complexities of AI-driven systems. Objective: The objective of this review is to explore current AI audit practices and their role in mitigating risks, ensuring accountability, and fostering trust in AI systems. The study aims to provide a structured framework for developing audit programs tailored to AI technologies while also investigating how AI impacts governance, risk management, and regulatory compliance in various sectors. Methodology: This research synthesizes findings from academic publications and industry reports from 2014 to 2024, focusing on the intersection of AI technologies and IT assurance practices. The study employs a qualitative review of existing audit methodologies and frameworks, particularly the COBIT 2019 framework, to understand how audit processes can be aligned with AI governance and compliance standards. The review also considers real-time auditing as an emerging necessity for influencing AI system design during early development stages. Outcomes: Preliminary findings indicate that while AI auditing is still in its infancy, it is rapidly gaining traction as both a risk management strategy and a potential driver of business innovation. Auditors are increasingly being called upon to develop controls that address the ethical and operational risks posed by AI systems. 
The study highlights the need for continuous monitoring and adaptable audit techniques to handle the dynamic nature of AI technologies. Future Directions: Future research will explore the development of AI-specific audit tools and real-time auditing capabilities that can keep pace with evolving technologies. There is also a need for cross-industry collaboration to establish universal standards for AI auditing, particularly in high-risk sectors like healthcare and finance. Further work will involve engaging with industry practitioners and policymakers to refine the proposed governance and audit frameworks. Funding/Support Acknowledgements: This research is supported by the Information Systems Assurance Management Program at Concordia University of Edmonton.
Keywords: AI auditing, assurance, risk management, governance, COBIT 2019, transparency, accountability, machine learning, compliance
Procedia PDF Downloads 29
7569 The Development of Online-Class Scheduling Management System Conducted by the Case Study of Department of Social Science: Faculty of Humanities and Social Sciences Suan Sunandha Rajabhat University
Authors: Wipada Chaiwchan, Patcharee Klinhom
Abstract:
This research aimed to develop an online class-scheduling management system and improve it as a solution to a complex problem, which must take various conditions and factors into consideration. In addition to the number of courses, the number of students, and the timetable, the physical characteristics of each classroom and the regulations used in class scheduling must also be taken into consideration. The system was developed to assist management in class scheduling for convenience and efficiency, and it allows several instructors to schedule simultaneously. Both lecturers and students can check and publish a timetable and other documents associated with the system online immediately. It was developed as a web-based application, with PHP as the development tool and MySQL as the database management system. The tool used for efficiency testing of the system was a questionnaire, and the system was evaluated using black-box testing. The sample was composed of two groups: 5 experts and 100 general users. The mean and standard deviation of the results from the experts were 3.50 and 0.67; from the general users, 3.54 and 0.54. In summary, the results indicated that user satisfaction was at a good level; therefore, this system could be implemented in an actual workplace and satisfy the users' requirements effectively.
Keywords: timetable, schedule, management system, online
Procedia PDF Downloads 240
7568 Problems of Boolean Reasoning Based Biclustering Parallelization
Authors: Marcin Michalak
Abstract:
Biclustering is a way of two-dimensional data analysis. For several years it has been possible to express this task in terms of Boolean reasoning, for processing continuous, discrete, and binary data. The mathematical background of this approach — the proved ability to induce exact and inclusion-maximal biclusters fulfilling assumed criteria — is a strong advantage of the method. Unfortunately, the core of the method has quite high computational complexity. In the paper, the basics of the Boolean reasoning approach to biclustering are presented, and in this context the problems of parallelizing the computation are raised.
Keywords: Boolean reasoning, biclustering, parallelization, prime implicant
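For illustration, the inclusion-maximality criterion on binary data can be checked naively as below. This is only a sketch of the criterion itself; the Boolean-reasoning induction via prime implicants that the abstract refers to is not reproduced here:

```python
def is_inclusion_maximal_bicluster(M, rows, cols):
    """Check that the (rows x cols) submatrix of binary matrix M is all ones
    and cannot be extended by any further row or column (inclusion-maximal)."""
    if any(M[r][c] != 1 for r in rows for c in cols):
        return False
    extend_row = any(all(M[r][c] == 1 for c in cols)
                     for r in range(len(M)) if r not in rows)
    extend_col = any(all(M[r][c] == 1 for r in rows)
                     for c in range(len(M[0])) if c not in cols)
    return not (extend_row or extend_col)

# A 3x3 binary matrix with an inclusion-maximal all-ones bicluster
# on rows {0, 1} and columns {0, 1}:
M = [[1, 1, 0],
     [1, 1, 1],
     [0, 1, 1]]
ok = is_inclusion_maximal_bicluster(M, {0, 1}, {0, 1})
```

The exhaustive scan over rows and columns hints at why the exact method's core has high computational complexity, motivating the parallelization discussed in the paper.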
Procedia PDF Downloads 126
7567 Block Implicit Adams Type Algorithms for Solution of First Order Differential Equation
Authors: Asabe Ahmad Tijani, Y. A. Yahaya
Abstract:
The paper considers the derivation of implicit Adams-Moulton-type methods with k = 4 and 5. We adopted the method of interpolation and collocation of a power-series approximation to generate the continuous formula, which was evaluated at off-grid and some grid points within the step length to generate the proposed block schemes. The schemes were investigated and found to be consistent and zero-stable. Finally, the methods were tested with numerical experiments to ascertain their level of accuracy.
Keywords: Adam-Moulton Type (AMT), off-grid, block method, consistent and zero stable
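As a lower-order illustration of the implicit Adams-Moulton idea (not the authors' k = 4 and 5 block schemes), the one-step Adams-Moulton method, i.e. the trapezoidal rule, can be applied with a simple predictor-corrector iteration on the test problem y' = -y:

```python
import math

def am1_trapezoid_step(f, t, y, h, iters=5):
    """One step of the implicit one-step Adams-Moulton method (trapezoidal
    rule), resolving the implicit term by fixed-point iteration."""
    y_next = y + h * f(t, y)  # explicit Euler predictor
    for _ in range(iters):    # corrector iterations
        y_next = y + 0.5 * h * (f(t, y) + f(t + h, y_next))
    return y_next

f = lambda t, y: -y          # test problem y' = -y, y(0) = 1
h, y, t = 0.1, 1.0, 0.0
for _ in range(10):          # integrate to t = 1
    y = am1_trapezoid_step(f, t, y, h)
    t += h
```

At t = 1 the numerical solution agrees with the exact value exp(-1) to within about 3e-4, consistent with the method's second-order accuracy; the paper's higher-order block schemes improve on this by evaluating the continuous formula at several grid and off-grid points per block.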
Procedia PDF Downloads 484
7566 Mitigating Food Insecurity and Malnutrition by Promoting Carbon Farming via a Solar-Powered Enzymatic Composting Bioreactor with Arduino-Based Sensors
Authors: Molin A., De Ramos J. M., Cadion L. G., Pico R. L.
Abstract:
Malnutrition and food insecurity represent significant global challenges affecting millions of individuals, particularly in low-income and developing regions. The researchers created a solar-powered enzymatic composting bioreactor with an Arduino-based monitoring system for pH, humidity, and temperature. It manages mixed municipal solid wastes, incorporating industrial enzymes and whey additives for accelerated composting and a minimized carbon footprint. Within 15 days, the bioreactor yielded 54.54% compost compared to 44.85% from traditional methods, increasing yield by nearly 10 percentage points. Tests showed that the bioreactor compost had 4.84% NPK and passed metal-analysis standards, while the traditional pit compost had 3.86% NPK; both are suitable for agriculture. Statistical analyses, including ANOVA and Tukey's HSD test, revealed significant differences in agricultural yield across the different compost types based on leaf length, leaf width, and number of leaves. The study compared the effects of the different composts on the growth of Brassica rapa subsp. chinensis (Pechay) and Brassica juncea (Mustasa). For Pechay, significant effects of compost type on leaf length (F(5,84) = 62.33, η² = 0.79) and leaf width (F(5,84) = 12.35, η² = 0.42) were found. For Mustasa, significant effects of compost type on leaf length (F(4,70) = 20.61, η² = 0.54), leaf width (F(4,70) = 19.24, η² = 0.52), and number of leaves (F(4,70) = 13.17, η² = 0.43) were observed. This study explores the effectiveness of the enzymatic composting bioreactor and its viability in promoting carbon farming as a solution to food insecurity and malnutrition.
Keywords: malnutrition, food insecurity, enzymatic composting bioreactor, arduino-based monitoring system, enzymes, carbon farming, whey additive, NPK level
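The F statistics and η² effect sizes reported above come from the standard one-way ANOVA decomposition of sums of squares. A self-contained sketch (the leaf-length samples are invented for illustration and are not the study's measurements):

```python
def one_way_anova(groups):
    """F statistic and eta-squared effect size for a one-way ANOVA,
    computed from between-group and total sums of squares."""
    all_obs = [x for g in groups for x in g]
    grand_mean = sum(all_obs) / len(all_obs)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_total = sum((x - grand_mean) ** 2 for x in all_obs)
    ss_within = ss_total - ss_between
    df_between = len(groups) - 1
    df_within = len(all_obs) - len(groups)
    F = (ss_between / df_between) / (ss_within / df_within)
    eta_sq = ss_between / ss_total   # proportion of variance explained
    return F, eta_sq

# Hypothetical leaf-length samples (cm) for three compost treatments:
F, eta_sq = one_way_anova([[4.1, 4.3, 4.0], [5.2, 5.5, 5.1], [6.0, 6.2, 6.4]])
```

A significant F would then be followed by a post-hoc test such as Tukey's HSD, as in the study, to identify which compost pairs differ.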
Procedia PDF Downloads 61
7565 Ground Track Assessment Using Electrical Resistivity Tomography Application
Authors: Noryani Natasha Yahaya, Anas Ibrahim, Juraidah Ahmad, Azura Ahmad, Mohd Ikmal Fazlan Rosli, Zailan Ramli, Muhd Sidek Muhd Norhasri
Abstract:
The subgrade formation is an important element of the railway structure, which holds overall track stability. Conventional track maintenance involves many substructure component replacements, and regular track re-ballasting partially contributes to the embankment's long-term settlement problem. For long-term subgrade stability analysis, geophysical methods are commonly used to diagnose hidden sources/mechanisms of track deterioration that normal visual inspection is unable to detect. Electrical resistivity tomography (ERT) is one of the applicable geophysical tools, helpful in railway subgrade inspection and track monitoring due to its flexibility and the reliability of the analysis. The ERT survey was conducted at KM 23.0 of the Pinang Tunggal track to investigate the railway subgrade through characterization/mapping of the track-formation profile, generated directly using 2D analysis in the Res2dinv software. The profiles allow examination of the presence and spatial extent of significant subgrade layers and screening for any poor contact at soil boundaries. Based on the findings, there is intermixing at an interlayer between the sub-ballast and the sand. Although the embankment track considered here is at no immediate risk of settlement or any failure, regular monitoring of the track's location will allow early corrective maintenance if necessary. The developed track-formation data clearly show the similarity of the side view with the assessed track, and the data visualization in the 2D section of the track embankment agreed well with the initial assumption based on the general side view of the main structural elements.
Keywords: ground track, assessment, resistivity, geophysical method, railway
Procedia PDF Downloads 162
7564 A Case Study Comparing the Effect of Computer Assisted Task-Based Language Teaching and Computer-Assisted Form Focused Language Instruction on Language Production of Students Learning Arabic as a Foreign Language
Authors: Hanan K. Hassanein
Abstract:
Task-based language teaching (TBLT) and focus-on-form instruction (FFI) methods have been shown to improve the quality and quantity of immediate language production. However, studies that compare the effectiveness of language production under TBLT versus FFI are scarce, with inconsistent results. Moreover, teaching Arabic using TBLT is a new field, with little research investigating its application inside classrooms. Furthermore, to the best knowledge of the researcher, there are no prior studies that compared teaching Arabic as a foreign language in a classroom setting using computer-assisted task-based language teaching (CATBLT) with computer-assisted form-focused language instruction (CAFFI). Accordingly, the focus of this presentation is to display CATBLT and CAFFI tools for teaching Arabic as a foreign language, as well as to present an experimental study that aims to identify whether or not CATBLT is the more effective instruction method. Effectiveness will be determined by comparing CATBLT and CAFFI in terms of the accuracy, lexical complexity, and fluency of the language produced by students. The participants of the study are 20 students enrolled in two intermediate-level Arabic-as-a-foreign-language classes, and the experiment will take place over the course of 7 days. Based on a study conducted by Abdurrahman Arslanyilmaz for teaching Turkish as a second language, an in-house computer-assisted tool for TBLT and another for FFI will be designed for the experiment. The experimental group will be instructed using the in-house CATBLT tool, and the control group will be taught through the in-house CAFFI tool. The data to be analyzed are the dialogues produced by students in both the experimental and control groups when completing a task or communicating in conversational activities.
The dialogues of both groups will be analyzed to understand the effect of the type of instruction (CATBLT or CAFFI) on accuracy, lexical complexity, and fluency. Thus, the study aims to demonstrate whether or not there is an instruction method that positively affects the language produced by students learning Arabic as a foreign language more than the other.
Keywords: computer assisted language teaching, foreign language teaching, form-focused instruction, task based language teaching
Procedia PDF Downloads 254
7563 Saving Energy through Scalable Architecture
Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala
Abstract:
In this paper, we focus on the importance of scalable architecture for data centers and buildings in general to help an enterprise achieve environmental sustainability. Scalable architecture helps in many ways: it adapts to business and user requirements and promotes high-availability and disaster-recovery solutions that are cost-effective and low-maintenance. It also plays a vital role in the three core areas of sustainability — economy, environment, and social — known as the three pillars of a sustainability model. A scalable architecture has many advantages: it helps businesses and industries adapt to changing technology, drive innovation, promote platform independence, and build resilience against natural disasters. Most importantly, it helps industries introduce cost-effective measures for energy consumption, reduce wastage, increase productivity, and enable a robust environment. It also helps reduce carbon emissions through advanced monitoring and metering capabilities. Scalable architectures help reduce waste by optimizing designs to use materials efficiently, minimize resource use, and decrease carbon footprints by using low-impact, environmentally friendly materials. In this paper, we also emphasize the importance of a cultural shift towards the reuse and recycling of natural resources for a balanced ecosystem and a circular economy. Also, since all of us are involved in the use of computers, much of the scalable architecture we have studied relates to data centers.
Keywords: scalable architectures, sustainability, application design, disruptive technology, machine learning and natural language processing, AI, social media platform, cloud computing, advanced networking and storage devices, advanced monitoring and metering infrastructure, climate change
Procedia PDF Downloads 111
7562 Comparison of Two Home Sleep Monitors Designed for Self-Use
Authors: Emily Wood, James K. Westphal, Itamar Lerner
Abstract:
Background: Polysomnography (PSG) recordings are regularly used in research and clinical settings to study sleep and sleep-related disorders. Typical PSG studies are conducted in professional laboratories by qualified researchers. However, the number of sleep labs worldwide is disproportionate to the increasing number of individuals with sleep disorders such as sleep apnea and insomnia. Consequently, there is a growing need for cheaper yet reliable means to measure sleep, preferably autonomously by subjects in their own homes. Over the last decade, a variety of devices for self-monitoring of sleep have become available on the market; however, very few have been directly validated against PSG to demonstrate their ability to perform reliable automatic sleep scoring. Two popular mobile EEG-based systems that have published validation results, the DREEM 3 headband and the Z-Machine, have never been directly compared to each other by independent researchers. The current study aimed to compare the performance of the DREEM 3 and the Z-Machine to help investigators and clinicians decide which of these devices may be more suitable for their studies. Methods: 26 participants completed the study for credit or monetary compensation. Exclusion criteria included any history of sleep, neurological, or psychiatric disorders. Eligible participants arrived at the lab in the afternoon and received the two devices. They then spent two consecutive nights monitoring their sleep at home. Participants were also asked to keep a sleep log, indicating the time they fell asleep, the time they woke up, and the number of awakenings occurring during the night. Data from both devices, including detailed sleep hypnograms in 30-second epochs (differentiating Wake, combined N1/N2, N3, and Rapid Eye Movement sleep), were extracted and aligned upon retrieval. For the analysis, the number of awakenings each night was defined as four or more consecutive wake epochs between sleep onset and termination.
Total sleep time (TST) and the number of awakenings were compared to subjects' sleep logs to measure consistency with the subjective reports. In addition, the sleep scores from each device were compared epoch-by-epoch to calculate the agreement between the two devices using Cohen's kappa. All analysis was performed using Matlab 2021b and SPSS 27. Results/Conclusion: Subjects consistently reported longer times spent asleep than the time reported by each device (M = 448 minutes for sleep logs, compared to M = 406 and M = 345 minutes for the DREEM and Z-Machine, respectively; both ps < 0.05). Linear correlations between the sleep log and each device were higher for the DREEM than the Z-Machine for both TST and the number of awakenings, and, likewise, the mean absolute bias between the sleep logs and each device was higher for the Z-Machine for both TST (p < 0.001) and awakenings (p < 0.04). There was some indication that these effects were stronger for the second night compared to the first night. Epoch-by-epoch comparisons showed that the main discrepancies between the devices were in detecting N2 and REM sleep, while N3 had high agreement. Overall, the DREEM headband seems superior for reliably scoring sleep at home.
Keywords: DREEM, EEG, sleep monitoring, Z-machine
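The epoch-by-epoch agreement metric used above, Cohen's kappa, corrects raw percent agreement for the agreement expected by chance. A minimal sketch with hypothetical epoch labels (the real hypnograms span whole nights, not eight epochs):

```python
from collections import Counter

def cohens_kappa(scores_a, scores_b):
    """Chance-corrected epoch-by-epoch agreement between two scorers."""
    n = len(scores_a)
    p_observed = sum(a == b for a, b in zip(scores_a, scores_b)) / n
    counts_a, counts_b = Counter(scores_a), Counter(scores_b)
    labels = set(scores_a) | set(scores_b)
    p_chance = sum(counts_a[l] * counts_b[l] for l in labels) / n ** 2
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical 30-second epoch labels from the two devices:
a = ['W', 'N2', 'N2', 'N3', 'N3', 'REM', 'REM', 'N2']
b = ['W', 'N2', 'N2', 'N3', 'N3', 'REM', 'N2',  'N2']
kappa = cohens_kappa(a, b)
```

Here the single REM/N2 disagreement mirrors the study's finding that N2 and REM are the main sources of discrepancy while N3 epochs agree well.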
Procedia PDF Downloads 108
7561 The Effects of 6-Weeks Aerobic Dance among Women
Authors: Mohd Faridz Ahmad, Muhammad Amir Asyraf Rosli
Abstract:
Aerobic dance has become a popular mode of exercise, especially among women, due to its fun nature. With catchy background music and joyful dance steps, aerobic dancers are able to have fun while sweating out. Depending on its level of aggressiveness, aerobic dance may also improve and maintain cardiorespiratory fitness, besides being a great tool for weight loss. This study intends to show that aerobic dance activity can bring the same, if not better, impacts on health than other types of cardiovascular exercise such as jogging and cycling. The objective of this study was to evaluate and identify the effect of six weeks of aerobic dance on cardiovascular fitness and weight loss among women. This study, which was held at the Seremban Fit Challenge, used a quasi-experimental design. The subjects included a total of 14 women (n = 14) with age (32.4 years old ± 9.1), weight (65.93 kg ± 11.24), and height (165.36 ± 3.46) who joined the Seremban Fit Challenge Season 13. The subjects were asked to join an aerobic dance class of one hour's duration for six weeks in a row. As for the outcome, cardiovascular fitness was measured with a 1-mile run test, while any change in weight was measured using a weighing scale. The result showed that there was a significant difference between the pre- and post-test for cardiovascular fitness (p = 0.02 < 0.05) and weight loss (p = 0.00 < 0.05). In conclusion, a six-week aerobic dance program has a positive effect on cardiovascular fitness and weight. Therefore, aerobic dance may be used as an alternative tool for people who wish to lead a healthy lifestyle in a fun way.
Keywords: aerobic dance, cardiovascular fitness, weight loss, 1-mile run test
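The pre/post comparison reported above corresponds to a paired-samples t test on each outcome. A sketch of the t statistic computation (the run times below are hypothetical, not the study's data):

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic for pre/post measurements on the same subjects."""
    diffs = [a - b for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Hypothetical 1-mile run times (minutes) before and after the programme:
pre  = [12.5, 13.0, 11.8, 14.2, 12.9]
post = [11.9, 12.4, 11.5, 13.3, 12.2]
t = paired_t(pre, post)
```

The resulting t statistic is then compared against the t distribution with n - 1 degrees of freedom to obtain the p-value reported in the abstract.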
Procedia PDF Downloads 547
7560 Secondary Charged Fragments Tracking for On-Line Beam Range Monitoring in Particle Therapy
Authors: G. Traini, G. Battistoni, F. Collamati, E. De Lucia, R. Faccini, C. Mancini-Terracciano, M. Marafini, I. Mattei, S. Muraro, A. Sarti, A. Sciubba, E. Solfaroli Camillocci, M. Toppi, S. M. Valle, C. Voena, V. Patera
Abstract:
In Particle Therapy (PT) treatments, a large amount of secondary particles is produced, whose emission points are correlated with the dose released in the crossed tissues. Measuring the secondary charged fragment component could be a valid technique to monitor the beam range during PT treatments, something still missing in clinical practice. A sub-millimetre precision on the beam range measurement is required to significantly optimise the technique and improve treatment quality. In this contribution, a detector named Dose Profiler (DP) is presented. It is specifically designed to monitor the beam range on-line by exploiting the secondary charged particles produced in PT carbon-ion treatments. In particular, the DP is designed to track the secondary fragments emitted at large angles with respect to the beam direction (mainly protons), with the aim of reconstructing the spatial coordinates of the fragment emission point by extrapolating the measured track toward the beam axis. The DP is currently under development within the INSIDE collaboration (Innovative Solutions for In-beam Dosimetry in hadrontherapy). The tracker is made of six layers (20 × 20 cm²) of BCF-12 square scintillating fibres (500 μm) coupled to Silicon Photo-Multipliers, followed by two plastic scintillator layers of 6 mm thickness. A system of FPGA-based front-end boards arranged around the detector provides the data acquisition. Detector characterization with cosmic rays is currently under way, and a data-taking campaign with protons will take place in May 2017. The DP design and the performance measured using MIPs and proton beams will be reviewed. Keywords: fragmentation, monitoring, particle therapy, tracking
Procedia PDF Downloads 236
7559 Analysis of Hard Turning Process of AISI D3-Thermal Aspects
Authors: B. Varaprasad, C. Srinivasa Rao
Abstract:
In the manufacturing sector, hard turning has emerged as a vital machining process for cutting hardened steels. Besides its many advantages, hard turning must be implemented so as to achieve close tolerances in terms of surface finish, high product quality, reduced machining time, low operating cost and environmentally friendly characteristics. In the present study, a three-dimensional CAE (Computer Aided Engineering) simulation of hard turning using the commercial software DEFORM 3D is compared to experimental results for stresses, temperatures and tool forces in machining of AISI D3 steel using mixed ceramic inserts (CC6050). In the present analysis, orthogonal cutting models are proposed, considering several processing parameters such as cutting speed, feed, and depth of cut. Exhaustive friction modeling at the tool-work interfaces is carried out. Work material flow around the cutting edge is carefully modeled with adaptive re-meshing simulation capability. In the process simulations, feed rate and cutting speed are held constant (0.075 mm/rev and 155 m/min), and the analysis is focused on stresses, forces, and temperatures during machining. Close agreement is observed between CAE simulation and experimental values. Keywords: hard turning, computer aided engineering, computational machining, finite element method
Procedia PDF Downloads 458
7558 Community-based Mapping as a Planning Tool; Examples from Pakistan
Authors: Noman Ahmed, Fariha Tahseen
Abstract:
For several decades, unplanned urbanization and the rapid growth of informal settlements have increased in size and number. Large cities such as Karachi have been affected by sprawl and a rising share of unplanned settlements where poor communities reside. Threats of eviction, a deteriorating law-and-order situation, lack of essential amenities and infrastructure, extortion and bullying by local and non-local musclemen, and the feeble response of government agencies towards their development needs are some of the predicaments. Non-governmental organizations (NGOs) have made important interventions in such locations. The objective of this research is to appraise community-based mapping as a tool for supporting development work in less privileged areas of Karachi. The Orangi Pilot Project (OPP), under the leadership of its slain director Perween Rahman, played a significant role in developing and extending this approach in low-income locations in Karachi and beyond. The paper investigates the application of mapping in the process of peri-urban land invasion causing rapid transformation of traditional settlements in Karachi. A mixed methodology comprising literature review, archival research, unstructured interviews with key informants, and case studies has been used. Keywords: squatters (katchi abadis), land grabbing, community empowerment, housing rights, mapping, infrastructure development
Procedia PDF Downloads 315
7557 Biophysically Motivated Phylogenies
Authors: Catherine Felce, Lior Pachter
Abstract:
Current methods for building phylogenetic trees from gene expression data consider mean expression levels. With single-cell technologies, we can leverage more information about cell dynamics by considering the entire distribution of gene expression across cells. Using biophysical modeling, we propose a method for constructing phylogenetic trees from scRNA-seq data, building on Felsenstein's method of continuous characters. This method can highlight genes whose level of expression may be unchanged between species, but whose rates of transcription/decay may have evolved over time. Keywords: phylogenetics, single-cell, biophysical modeling, transcription
Procedia PDF Downloads 59
7556 Optimising Participation in Physical Activity Research for Adults with Intellectual Disabilities
Authors: Yetunde M. Dairo, Johnny Collett, Helen Dawes
Abstract:
Background and Aim: Engagement with physical activity (PA) research is poor among adults with intellectual disabilities (ID), particularly those in residential homes. This study explored why, by asking managers of residential homes, adults with ID and their carers. Methods: Participants: A convenience sample of 23 individuals from two UK local authorities, comprising a group of ID residential home managers, adults with ID and their support staff. Procedures: A) Residential home managers (n=6) were asked questions about their willingness to allow their residents to participate in PA research; B) Eleven adults with ID and their support workers (n=6) were asked questions about their willingness to accept 7-day accelerometer monitoring and/or the International Physical Activity Questionnaire-short version (IPAQ-s) as PA measures. The IPAQ-s was administered by the researcher, and participants were each provided with sample accelerometers to try on. Results: A) Five out of six managers said that the burden of wearing the accelerometer for seven days would be too high for the people they support, the majority of whom might be unable to express their wishes. They also said they would be unwilling to act as proxy respondents for the same reason. Additionally, they cited time pressure, understaffing, and reluctance to spend time on the research paperwork as further reasons for non-participation. B) All 11 individuals with ID completed the IPAQ-s, while only three accepted the accelerometer, and for one of these it was deemed inappropriate. Reasons for rejecting accelerometers included participants' statements of 'too expensive', 'too heavy' and 'uncomfortable', and two people said they would not want to wear it for more than one day. All adults with ID (11) and their support workers (6) provided information about their physical activity levels through the IPAQ-s. Conclusions: Care home managers are a barrier to research participation.
However, adults with ID were happy to complete the IPAQ-s as a PA measure, but less so the 7-day accelerometer monitoring. To improve participation in this population, the choice of PA measure is important. Moreover, there is a need for studies exploring how best to engage ID residential home managers in PA research. Keywords: intellectual disability, physical activity measurement, research engagement, research participation
Procedia PDF Downloads 311
7555 Effects of Tenofovir Disoproxil Fumarate on the Renal Sufficiency of HIV Positive Patients
Authors: Londeka Ntuli, Frasia Oosthuizen
Abstract:
Background: Tenofovir disoproxil fumarate (TDF) is a nephrotoxic drug and has been shown to contribute to renal insufficiency, necessitating intensive monitoring and management of adverse effects arising from prolonged exposure to the drug. TDF is one of the preferred first-line drugs used in combination therapy in most regions. An estimated 300,000 patients are initiated on the Efavirenz/TDF/Emtricitabine first-line regimen annually in South Africa. It is against this background that this study aims to investigate the effects of TDF on the renal sufficiency of HIV positive patients. Methodology: A retrospective quantitative study was conducted, analysing clinical charts of HIV positive patients older than 18 years of age on a TDF-containing regimen for more than 1 year. Data were obtained from the analysis of patient files and transcribed into a Microsoft® Excel® spreadsheet. Extracted data were coded, categorised and analysed using STATA®. Results: A total of 275 patient files were included in this study. Renal function started decreasing after 3 months of treatment (with 93.5% of patients having a normal eGFR) and kept decreasing as time progressed, with only 39.6% having normal renal function at year 4. Additional risk factors for renal insufficiency included age below 25, female gender, and additional medication. Conclusion: It is clear from this study that the use of TDF necessitates intensive monitoring and management of adverse effects arising from prolonged exposure to the drug. The findings from this study generated pertinent information on the safety profile of TDF in the resource-limited setting of a public health institution. Appropriate management is of tremendous importance in the South African context, where the majority of HIV positive individuals are on a TDF-containing regimen; it is thus beneficial to ascertain the level of toxicity these patients may be experiencing. Keywords: renal insufficiency, tenofovir, HIV, risk factors
Procedia PDF Downloads 123
7554 Fluorescence-Based Biosensor for Dopamine Detection Using Quantum Dots
Authors: Sylwia Krawiec, Joanna Cabaj, Karol Malecha
Abstract:
Nowadays, progress in the field of analytical methods is of great interest for reliable biological research and medical diagnostics. Classical techniques of chemical analysis, despite many advantages, do not permit immediate results or automation of measurements. Chemical sensors have displaced conventional analytical methods; sensors combine precision, sensitivity, fast response and the possibility of continuous monitoring. A biosensor is a chemical sensor that, in addition to a converter, possesses a biologically active material, which is the basis for the detection of specific chemicals in the sample. Each biosensor device mainly consists of two elements: a sensitive element, where receptor-analyte recognition takes place, and a transducer element, which receives the signal and converts it into a measurable one. Through these two elements, biosensors can be divided into two categories: by the recognition element (e.g. immunosensor) and by the transducer (e.g. optical sensor). The working of an optical sensor is based on measuring quantitative changes in parameters characterizing light radiation. The most often analyzed parameters include amplitude (intensity), frequency or polarization. Changes in the optical properties of a compound that reacts with the biological material coated on the sensor are analyzed by a direct method; in an indirect method, indicators are used that change their optical properties due to the transformation of the tested species. The dyes most commonly used in this method are small molecules with an aromatic ring, like rhodamine; fluorescent proteins, for example green fluorescent protein (GFP); or nanoparticles such as quantum dots (QDs). Quantum dots have, in comparison with organic dyes, much better photoluminescent properties, better bioavailability and chemical inertness. These are semiconductor nanocrystals 2-10 nm in size. This very limited number of atoms and the 'nano' size give QDs their highly fluorescent properties. Rapid and sensitive detection of dopamine is extremely important in modern medicine. Dopamine is a very important neurotransmitter, which mainly occurs in the brain and central nervous system of mammals. Dopamine is responsible for the transmission of information through the nervous system and plays an important role in processes of learning and memory. Detection of dopamine is significant for diseases associated with the central nervous system such as Parkinson's disease or schizophrenia. The developed optical biosensor for the detection of dopamine uses graphene quantum dots (GQDs). In such a sensor, dopamine molecules coat the GQD surface; as a result, fluorescence quenching occurs due to resonance energy transfer (FRET). Changes in fluorescence correspond to specific concentrations of the neurotransmitter in the tested sample, so it is possible to accurately determine the concentration of dopamine in the sample. Keywords: biosensor, dopamine, fluorescence, quantum dots
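As a rough illustration of how a quenching-based readout maps fluorescence back to concentration, the sketch below inverts the classical Stern-Volmer relation F0/F = 1 + Ksv[Q]. The Stern-Volmer constant and intensity values are assumptions made purely for illustration; the abstract reports no calibration data, and a real GQD sensor would be calibrated against dopamine standards.

```python
# Hypothetical sketch: reading a quencher concentration from fluorescence
# quenching via the Stern-Volmer relation F0/F = 1 + Ksv*[Q]. The constant
# and intensities below are invented, not values from the study.

def concentration_from_quenching(f0, f, ksv):
    """Invert Stern-Volmer: [Q] = (F0/F - 1) / Ksv."""
    return (f0 / f - 1.0) / ksv

KSV = 0.02   # L/umol, assumed calibration constant (illustrative)
f0 = 1000.0  # intensity with no dopamine present
f = 800.0    # measured (quenched) intensity
print(concentration_from_quenching(f0, f, KSV))  # concentration in umol/L
```

In practice the calibration curve may deviate from this linear form at high concentrations, which is why sensors are validated over a stated working range.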
Procedia PDF Downloads 368
7553 Development of a Fuzzy Logic Based Model for Monitoring Child Pornography
Authors: Mariam Ismail, Kazeem Rufai, Jeremiah Balogun
Abstract:
A study was conducted to apply fuzzy logic to the development of a monitoring model for child pornography based on associated risk factors, which can be used by forensic experts or integrated into forensic systems for the early detection of child pornographic activities. A number of methods were adopted in the study: an extensive review of related works was done to identify the factors associated with child pornography, after which the factors were validated by an expert sex psychologist and guidance counselor, and relevant data were collected. Fuzzy membership functions were used to fuzzify the identified variables alongside the risk of the occurrence of child pornography, based on inference rules provided by the experts consulted, and the fuzzy logic expert system was simulated using the Fuzzy Logic Toolbox available in MATLAB Release 2016. The results of the study showed that there were 4 categories of risk factors required for assessing the risk of a suspect committing child pornography offenses. Triangular membership functions with 2 or 3 labels were used to formulate the risk factors, according to the number of labels assigned to each. Five fuzzy logic models were formulated: the first 4 were used to assess the impact of each category on child pornography, while the last one takes the outputs of those 4 models as the inputs required for assessing the overall risk of child pornography. The following conclusion was made: factors related to the personal traits, social traits, history of child pornography crimes, and self-regulatory deficiency traits of the suspects are required for the assessment of the risk of child pornography crimes committed by a suspect. Using the values of the risk factors selected for this study, the risk of child pornography can easily be assessed in order to determine the likelihood of a suspect perpetrating the crime. Keywords: fuzzy, membership functions, pornography, risk factors
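The 2- and 3-label triangular membership functions described above can be sketched as follows. The 0-10 scale and the breakpoints are assumptions for illustration only; the study's actual fuzzification used labels and ranges supplied by the consulted experts, via MATLAB's Fuzzy Logic Toolbox.

```python
# Illustrative sketch of triangular membership functions for fuzzifying a
# risk factor with either two or three labels. The 0-10 scale and the
# breakpoints are assumed, not taken from the study.

def triangular(x, a, b, c):
    """Triangular membership: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzify_three(x):
    """Three-label factor, e.g. low/medium/high on a 0-10 scale."""
    return {
        "low": triangular(x, -5, 0, 5),
        "medium": triangular(x, 0, 5, 10),
        "high": triangular(x, 5, 10, 15),
    }

def fuzzify_two(x):
    """Two-label factor, e.g. low/high on the same scale."""
    return {
        "low": triangular(x, -10, 0, 10),
        "high": triangular(x, 0, 10, 20),
    }

print(fuzzify_three(7.5))  # "medium" and "high" both partially active
```

A reading of 7.5 belongs partly to "medium" and partly to "high"; the inference rules then combine such degrees of membership across factors to grade the overall risk.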
Procedia PDF Downloads 131
7552 Using Flow Line Modelling, Remote Sensing for Reconstructing Glacier Volume Loss Model for Athabasca Glacier, Canadian Rockies
Authors: Rituparna Nath, Shawn J. Marshall
Abstract:
Glaciers are among the most sensitive climatic indicators, as they respond strongly to small climatic shifts. We develop a flow line model of glacier dynamics to simulate the past and future extent of glaciers in the Canadian Rocky Mountains, with the aim of coupling this model within larger-scale regional climate models of glacier response to climate change. This paper focuses on glacier-climate modeling and reconstructions of glacier volume from the Little Ice Age (LIA) to the present for Athabasca Glacier, Alberta, Canada. Glacier thickness, volume and mass change are reconstructed using flow line modelling and examination of different climate scenarios that give good reconstructions of LIA ice extent. With the availability of SPOT 5 imagery, digital elevation models and the GIS Arc Hydro tool, ice catchment properties (glacier width and LIA moraines) have been extracted using automated procedures. Simulation of glacier mass change will inform estimates of meltwater runoff over the historical period, and model calibration from the LIA reconstruction will aid in future projections of the effects of climate change on glacier recession. Furthermore, the model developed will be useful for future studies with ensembles of glaciers. Keywords: flow line modeling, Athabasca Glacier, glacier mass balance, remote sensing, Arc Hydro tool, Little Ice Age
Procedia PDF Downloads 272
7551 Study of the Kinetics of Formation of Carboxylic Acids Using Ion Chromatography during Oxidation Induced by Rancimat of the Oleic Acid, Linoleic Acid, Linolenic Acid, and Biodiesel
Authors: Patrícia T. Souza, Marina Ansolin, Eduardo A. C. Batista, Antonio J. A. Meirelles, Matthieu Tubino
Abstract:
Lipid oxidation is a major cause of the deterioration of the quality of biodiesel, because the waste generated damages engines. Among the main undesirable effects are increased viscosity and acidity, leading to the formation of insoluble gums and sediments which block fuel filters. Auto-oxidation is defined as the spontaneous reaction of atmospheric oxygen with lipids. Unsaturated fatty acids are usually the components affected by such reactions. They are present as free fatty acids, fatty esters and glycerides. To determine the oxidative stability of biodiesels through the induction period (IP), the Rancimat method is used, which allows continuous monitoring of the induced oxidation process of the samples. During the oxidation of lipids, volatile organic acids are produced as byproducts; in addition, other byproducts, including alcohols and carbonyl compounds, may be further oxidized to carboxylic acids. With the methodology developed in this work using ion chromatography (IC), organic anions of carboxylic acids were quantified, by analyzing the water contained in the conductimetric vessel, in samples subjected to oxidation induced by Rancimat. The optimized chromatographic conditions were: eluent water:acetone (80:20 v/v) with 0.5 mM sulfuric acid; flow rate 0.4 mL min-1; injection volume 20 µL; eluent suppressor 20 mM LiCl; analytical curve from 1 to 400 ppm. The samples studied were methyl biodiesel from soybean oil and unsaturated fatty acid standards: oleic, linoleic and linolenic. The induced oxidation kinetics curves were constructed by analyzing the water contained in the conductimetric vessels, which were removed, one by one, from the Rancimat apparatus at prefixed time intervals. About 3 g of sample were used under the conditions of 110 °C and an air flow rate of 10 L h-1. The water of each conductimetric Rancimat measuring vessel, where the volatile compounds were collected, was filtered through a 0.45 µm filter and analyzed by IC. From the kinetic data of the formation of the organic anions of carboxylic acids, their rates of formation were calculated. The observed order of the rates of formation of the anions was: formate >>> acetate > hexanoate > valerate for oleic acid; formate > hexanoate > acetate > valerate for linoleic acid; formate >>> valerate > acetate > propionate > butyrate for linolenic acid. It is plausible that propionate and butyrate are obtained mainly from linolenic acid and that hexanoate originates from oleic and linoleic acids. For the methyl biodiesel, the order of formation of anions was: formate >>> acetate > valerate > hexanoate > propionate. According to the total rates of formation of these anions produced during induced degradation, the fatty acids can be assigned the order of reactivity: linolenic acid > linoleic acid >>> oleic acid. Keywords: anions of carboxylic acids, biodiesel, ion chromatography, oxidation
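A formation rate of the kind compared above amounts, in the simplest case, to the slope of anion concentration versus oxidation time. The sketch below fits that slope by ordinary least squares; the time points and formate concentrations are invented for illustration, not measurements from this work.

```python
# Illustrative sketch: estimating a formation rate from kinetic data as
# the least-squares slope of anion concentration against oxidation time.
# The readings below are invented, not measurements from the study.

def formation_rate(times_h, conc_ppm):
    """Ordinary least-squares slope (ppm per hour)."""
    n = len(times_h)
    mean_t = sum(times_h) / n
    mean_c = sum(conc_ppm) / n
    num = sum((t - mean_t) * (c - mean_c) for t, c in zip(times_h, conc_ppm))
    den = sum((t - mean_t) ** 2 for t in times_h)
    return num / den

# Hypothetical formate readings sampled from the conductimetric vessel.
times = [0, 2, 4, 6, 8]          # hours of induced oxidation
formate = [1, 45, 92, 138, 181]  # ppm, by ion chromatography

print(round(formation_rate(times, formate), 2))  # → 22.65
```

Repeating the fit for each anion and each substrate yields the rate orderings quoted in the abstract.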
Procedia PDF Downloads 475
7550 Coastal Modelling Studies for Jumeirah First Beach Stabilization
Authors: Zongyan Yang, Gagan K. Jena, Sankar B. Karanam, Noora M. A. Hokal
Abstract:
Jumeirah First beach, a segment of coastline of length 1.5 km, is one of the popular public beaches in Dubai, UAE. The stability of the beach has been affected by several coastal development projects, including The World, Island 2 and La Mer. A comprehensive stabilization scheme comprising two composite groynes (of lengths 90 m and 125 m), modification of the northern breakwater of Jumeirah Fishing Harbour and beach re-nourishment was implemented by Dubai Municipality in 2012. However, the performance of the implemented stabilization scheme has been compromised by the La Mer project (built in 2016), which modified the wave climate at Jumeirah First beach. The objective of the coastal modelling studies is to establish the design basis for further beach stabilization scheme(s). Comprehensive coastal modelling studies were conducted to establish the nearshore wave climate, equilibrium beach orientations and stable beach plan forms. Based on the outcomes of the modelling studies, a recommendation was made to extend the composite groynes to stabilize Jumeirah First beach. Wave transformation was performed following an interpolation approach with wave transformation matrices derived from simulations of the possible range of wave conditions in the region. The Dubai coastal wave model is developed with MIKE21 SW. The offshore wave conditions were determined from PERGOS wave data at 4 offshore locations with consideration of the spatial variation. The lateral boundary conditions corresponding to the offshore conditions, at the Dubai/Abu Dhabi and Dubai/Sharjah borders, were derived with the application of the LitDrift 1D wave transformation module. The Dubai coastal wave model was calibrated with wave records at monitoring stations operated by Dubai Municipality. The wave transformation matrix approach was validated with nearshore wave measurements at a Dubai Municipality monitoring station in the vicinity of Jumeirah First beach.
A typical one-year wave time series was transformed to 7 locations in front of the beach to account for the variation of wave conditions, which are affected by adjacent and offshore developments. Equilibrium beach orientations were estimated with the application of LitDrift by finding the beach orientations with null annual littoral transport at the 7 selected locations. The littoral transport calculations were compared with beach erosion/accretion quantities estimated from the beach monitoring program (twice a year, including bathymetric and topographical surveys). An innovative integral method was developed to outline the stable beach plan forms from the estimated equilibrium beach orientations, with a predetermined minimum beach width. The optimal lengths for the composite groyne extensions were recommended based on the stable beach plan forms. Keywords: composite groyne, equilibrium beach orientation, stable beach plan form, wave transformation matrix
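The "null annual littoral transport" criterion can be illustrated with a toy calculation: sum a transport law over an annual wave climate and search for the shore orientation at which the net transport vanishes. The CERC-type transport dependence and the three-bin wave climate below are assumptions made for illustration; the study itself computed transport with LitDrift.

```python
# Toy sketch of an equilibrium beach orientation: the shore-normal angle
# at which annual littoral transport sums to zero. The CERC-type law and
# the wave climate below are illustrative assumptions, not study data.

import math

# Toy annual wave climate: (significant height in m, direction in degrees
# the waves come from, fraction of the year).
climate = [(1.2, 300.0, 0.5), (0.8, 340.0, 0.3), (1.5, 320.0, 0.2)]

def annual_transport(shore_normal_deg):
    """Net annual transport (arbitrary units) for a given shore-normal."""
    q = 0.0
    for h, direction, frac in climate:
        alpha = math.radians(direction - shore_normal_deg)  # incidence angle
        q += frac * h**2.5 * math.sin(2 * alpha)  # CERC-type dependence
    return q

def equilibrium_orientation(lo=250.0, hi=400.0, tol=1e-6):
    """Bisection for the shore-normal with zero net annual transport."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        # Keep the sub-interval that still brackets the sign change.
        if annual_transport(lo) * annual_transport(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

theta = equilibrium_orientation()
print(round(theta, 1))
```

The bracket [250, 400] is chosen so the net transport changes sign exactly once inside it; a production model would sweep orientations at each of the 7 locations instead.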
Procedia PDF Downloads 266
7549 Relationship between the Development of Sepsis, Systemic Inflammatory Response Syndrome and Body Mass Index among Adult Trauma Patients at University Hospital in Cairo
Authors: Mohamed Hendawy Mousa, Warda Youssef Mohamed Morsy
Abstract:
Background: Sepsis is a major cause of mortality and morbidity in trauma patients. Body mass index as an indicator of nutritional status was reported as a predictor of injury pattern and complications among critically ill injured patients. Aim: The aim of this study is to investigate the relationship between body mass index and the development of sepsis, systemic inflammatory response syndrome among adult trauma patients at emergency hospital - Cairo University. Research design: Descriptive correlational research design was utilized in the current study. Research questions: Q1. What is the body mass index profile of adult trauma patients admitted to the emergency hospital at Cairo University over a period of 6 months?, Q2. What is the frequency of systemic inflammatory response syndrome and sepsis among adult trauma patients admitted to the emergency hospital at Cairo University over a period of 6 months?, and Q3. What is the relationship between the development of sepsis, systemic inflammatory response syndrome and body mass index among adult trauma patients admitted to the emergency hospital at Cairo University over a period of 6 months?. Sample: A purposive sample of 52 adult male and female trauma patients with revised trauma score 10 to 12. Setting: The Emergency Hospital affiliated to Cairo University. Tools: Four tools were utilized to collect data pertinent to the study: Socio demographic and medical data tool, Systemic inflammatory response syndrome assessment tool, Revised Trauma Score tool, and Sequential organ failure assessment tool. Results: The current study revealed that, (61.5 %) of the studied subjects had normal body mass index, (25 %) were overweight, and (13.5 %) were underweight. 84.6% of the studied subjects had systemic inflammatory response syndrome and 92.3% were suffering from mild sepsis. 
No statistically significant relationship was found between body mass index and the occurrence of systemic inflammatory response syndrome (χ² = 2.89, p = 0.23). However, Sequential Organ Failure Assessment scores were significantly affected by body mass index: comparisons of the mean initial and last Sequential Organ Failure Assessment scores for underweight, normal-weight and obese patients yielded t = 7.24 (p = 0.000), t = 16.49 (p = 0.000) and t = 9.80 (p = 0.000), respectively. Conclusion: Underweight trauma patients showed a significantly higher rate of developing sepsis compared to patients with normal body weight and obese patients. Recommendations: based on the findings of this study, the following are recommended: replication of the study on a larger probability sample from different geographical locations in Egypt; carrying out further studies to assess other risk factors influencing trauma outcome and the incidence of its complications; and establishment of standardized guidelines for managing underweight traumatized patients with sepsis. Keywords: body mass index, sepsis, systemic inflammatory response syndrome, adult trauma
Procedia PDF Downloads 253
7548 The Impact of Perception of Transformational Leadership and Factors of Innovation Culture on Innovative Work Behavior among Junior High School Teachers
Authors: Galih Mediana
Abstract:
Boarding schools can help students turn good qualities into habits. The process of forming one's personality can be done in various ways. In addition to gaining general knowledge at school during learning hours, teachers can instill values in students in the dormitory after the learning process has ended. This shows the important role that boarding school teachers must play. Transformational leadership and a culture of innovation are things that can instill innovative behavior in teachers. This study aims to determine the effect of perceptions of transformational leadership and a culture of innovation on innovative work behavior among Islamic boarding school teachers. Respondents in this study comprised 70 teachers. To measure transformational leadership, a modified measuring tool was used, namely the Multifactor Leadership Questionnaire (MLQ) by Bass (1985). To measure innovative work behavior, a measurement tool based on the dimensions of Janssen (2000) was used. The innovation culture in this study was measured using the innovation culture factors from Dobni (2008). This study uses multiple regression analysis to test the hypothesis. The results indicate that perceptions of transformational leadership and innovation culture factors together account for 57.7% of innovative work behavior among Islamic boarding school teachers. Keywords: transformational leadership, innovative work behavior, innovation culture, boarding school, teacher
Procedia PDF Downloads 112
7547 Hansen Solubility Parameters, Quality by Design Tool for Developing Green Nanoemulsion to Eliminate Sulfamethoxazole from Contaminated Water
Authors: Afzal Hussain, Mohammad A. Altamimi, Syed Sarim Imam, Mudassar Shahid, Osamah Abdulrahman Alnemer
Abstract:
The exhaustive application of sulfamethoxazole (SUX) has become a global threat to human health due to water contamination from diverse sources. This study addressed the combined application of Hansen solubility parameters (HSPiP software) and the Quality by Design tool for developing various green nanoemulsions. The HSPiP program assisted in screening suitable excipients based on Hansen solubility parameters and experimental solubility data. Various green nanoemulsions were prepared and characterized for globular size, size distribution, zeta potential, and removal efficiency. Design Expert (DoE) software further helped to identify the critical factors with a direct impact on percent removal efficiency, size, and viscosity. Morphology was visualized under transmission electron microscopy (TEM). Finally, the treated water was studied to negate the presence of the tested drug, employing the ICP-OES (inductively coupled plasma optical emission spectroscopy) technique and HPLC (high performance liquid chromatography). The results showed that HSPiP predicted a biocompatible lipid, a safe surfactant (lecithin), and propylene glycol (PG). The experimental solubility of the drug in the predicted excipients was quite convincing and vindicated the prediction. Various green nanoemulsions were fabricated and evaluated for in vitro findings. Globular size (100-300 nm), PDI (0.1-0.5), zeta potential (~25 mV), and removal efficiency (%RE = 70-98%) were found to be in an acceptable range for deciding the input factors and their levels in DoE. The experimental design tool assisted in identifying the most critical variables controlling %RE and the optimized content of the nanoemulsion under set constraints. Dispersion time was varied from 5-30 min. Finally, ICP-OES and HPLC techniques corroborated the absence of SUX in the treated water. Thus, the strategy is simple, economic, selective, and efficient. Keywords: quality by design, sulfamethoxazole, green nanoemulsion, water treatment, ICP-OES, Hansen program (HSPiP software)
Procedia PDF Downloads 85
7546 Setting Control Limits For Inaccurate Measurements
Authors: Ran Etgar
Abstract:
The process of rounding off measurements of continuous variables is commonly encountered. Although it usually has minor effects, it can sometimes lead to poor outcomes in statistical process control using the X̄-chart. The traditional control limits can lead to incorrect conclusions if applied carelessly. This study looks into the limitations of classical control limits, particularly the impact of asymmetry. An approach to determining the distribution function of the measured parameter (Ȳ) is presented, resulting in a more precise method to establish the upper and lower control limits. The proposed method, while slightly more complex than Shewhart's original idea, is still user-friendly and accurate and only requires the use of two straightforward tables. Keywords: quality control, process control, round-off, measurement, rounding error
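A minimal simulation makes the abstract's point concrete: when the round-off step is coarse relative to the process spread, subgroup means of rounded measurements no longer follow the distribution assumed by classical X̄-chart limits. All the parameters below (process mean, sigma, rounding step, subgroup size) are illustrative assumptions, not values from the paper.

```python
# Minimal sketch: classical Shewhart X-bar limits from the true process
# parameters versus the behavior of subgroup means of rounded-off
# measurements. All parameters here are illustrative assumptions.

import random

random.seed(42)
mu, sigma = 10.0, 0.05  # true process mean and standard deviation
n = 5                   # subgroup size
step = 0.1              # round-off step, coarse relative to sigma

# Classical Shewhart limits, ignoring rounding: mu +/- 3*sigma/sqrt(n).
ucl = mu + 3 * sigma / n**0.5
lcl = mu - 3 * sigma / n**0.5

# Simulate in-control subgroup means of rounded measurements.
means = []
for _ in range(10000):
    subgroup = [round(random.gauss(mu, sigma) / step) * step
                for _ in range(n)]
    means.append(sum(subgroup) / n)

false_alarms = sum(m > ucl or m < lcl for m in means) / len(means)
print(f"UCL={ucl:.3f} LCL={lcl:.3f} false-alarm rate={false_alarms:.4f}")
```

With these settings the simulated false-alarm rate typically exceeds the nominal 0.27% of a Shewhart chart, because rounding both inflates the variance of the subgroup mean and confines it to a lattice; this is the kind of distortion the proposed Ȳ-based limits are meant to correct.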
Procedia PDF Downloads 103