Search results for: time prediction algorithms

13210 Effect of Packing Ratio on Fire Spread across Discrete Fuel Beds: An Experimental Analysis

Authors: Qianqian He, Naian Liu, Xiaodong Xie, Linhe Zhang, Yang Zhang, Weidong Yan

Abstract:

In the wild, the vegetation layer, with its exceptionally complex fuel composition and heterogeneous spatial distribution, strongly affects the rate of fire spread (ROS) and fire intensity. Clarifying the influence of fuel bed structure on fire spread behavior is of great significance to wildland fire management and prediction. The packing ratio is one of the key physical parameters describing the properties of the fuel bed. There is a threshold value of the packing ratio for ROS, but little is known about the controlling mechanism. To address this deficiency, a series of fire spread experiments were performed in this study across a discrete fuel bed composed of regularly arranged laser-cut cardboard pieces, with constant wind speed and different packing ratios (0.0125-0.0375). The experiments explore the relative importance of internal and surface heat transfer as the packing ratio varies. The dependence of the measured ROS on the packing ratio was largely consistent with previous research. The radiative and total heat flux data show that internal heat transfer and surface heat transfer are both enhanced with increasing packing ratio (referred to as 'Stage 1'). This trend agrees well with the variation of the flame length: the results extracted from the video show that the flame length markedly increases with increasing packing ratio in Stage 1. Combustion intensity is suggested to increase, which, in turn, enhances the heat radiation. The heat flux data show that surface heat transfer appears to be more important than internal heat transfer (fuel preheating inside the fuel bed) in Stage 1. On the contrary, internal heat transfer dominates the fuel preheating mechanism when the packing ratio increases further (referred to as 'Stage 2'), because the surface heat flux remains almost stable with the packing ratio in Stage 2. As for heat convection, the flow velocity was measured using Pitot tubes both inside and on the upper surface of the fuel bed during fire spread. Based on the gas velocity distribution ahead of the flame front, it is found that the airflow inside the fuel bed is restricted in Stage 2, which can in theory reduce the internal heat convection. However, the analysis indicates that it is not the influence of the internal flow on convection and combustion, but rather the decreased internal radiation per unit fuel, that is responsible for the decrease of ROS.

Keywords: discrete fuel bed, fire spread, packing ratio, wildfire

Procedia PDF Downloads 126
13209 The Methodology of Hand-Gesture Based Form Design in Digital Modeling

Authors: Sanghoon Shim, Jaehwan Jung, Sung-Ah Kim

Abstract:

As digital technology develops, studies are actively being conducted on the TUI (Tangible User Interface), which links the physical environment, as perceived through the human senses, with the virtual environment through the computer. In addition, there has been a tremendous advance in digital design through the use of computer-aided design techniques, which enable optimized decision-making through machine learning and the parallel comparison of alternatives. A complex design that responds to user requirements or performance can emerge through the intuition of the designer, but it is difficult to actualize such an emergent design by the designer's ability alone. Ancillary tools, such as Gaudí's sandbag models, can be instruments for reinforcing and evolving the ideas that emerge from designers. With the advent of many commercial tools that support 3D objects, designers' intentions are easily reflected in their designs, but the degree to which they are reflected depends on the designer's proficiency with the design tools. This study implements an environment in which form can be shaped by the fingers of even the most inexperienced designer in the initial design phase of complex building design. Leap Motion is used as a sensor to recognize the hand motions of the designer, which are converted into digital information to realize an environment that can be linked in real time in virtual reality (VR). In addition, the implemented design can be linked with Rhino™, a 3D authoring tool, and its plug-in Grasshopper™ in real time. As a result, it is possible to design through the senses using TUI, and the system can serve as a tool for assisting designer intuition.
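The real-time link between the tracked hand and the modeling environment can be pictured as a small streaming loop. The sketch below is illustrative only: it simulates the sensor read and pushes fingertip coordinates as JSON over UDP to a hypothetical listener (such as a UDP-receiver component inside Grasshopper); the target address, message format, and the read_fingertips stand-in are our assumptions, not the authors' implementation.

```python
import json
import socket
import time

# Hypothetical listener address (e.g., a UDP receiver inside Grasshopper).
UDP_TARGET = ("127.0.0.1", 28000)

def read_fingertips(t):
    """Stand-in for a sensor read: five fingertip (x, y, z) points in mm.
    The actual system read hand data from the Leap Motion sensor."""
    return [[60 + i * 18, 100 + 5 * (t % 10), 40 + i] for i in range(5)]

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for frame in range(300):                       # ~10 s of streaming at 30 fps
    payload = {"frame": frame, "tips": read_fingertips(frame)}
    sock.sendto(json.dumps(payload).encode("utf-8"), UDP_TARGET)
    time.sleep(1 / 30)
```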

Keywords: design environment, digital modeling, hand gesture, TUI, virtual reality

Procedia PDF Downloads 356
13208 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R

Authors: Pavel H. Llamocca, Victoria Lopez

Abstract:

The main objective of our work is the comparative analysis of environmental data from Open Data bases belonging to different governments, which requires integrating data from various sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and the number of applications based on Open Data is accordingly increasing. However, each government has its own procedures for publishing its data, which results in a variety of data set formats, because there are no international standards specifying the formats of Open Data sets. Due to this variety, we must build a data integration process able to put together all kinds of formats. Some software tools have been developed to support the integration process, e.g., Data Tamer and Data Wrangler. The problem with these tools is that they need a data scientist to take part in the integration process as a final step. In our case, we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, and wind speed. For the past two years, the government of Madrid has been publishing its Open Data bases on environmental indicators in real time. Likewise, other governments (such as Andalucia or Bilbao) have published Open Data sets on the environment. All of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to perform and visualize analyses over the real-time data. Once the integration task is done, all the data from any government have the same format, and the analysis process can start in a computationally better way. The tool presented in this work therefore has two goals: 1. an integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). In order to open our software tool, we also developed, as a second approach, an implementation in the R language as a mature open-source technology. R is a powerful open-source programming language that allows us to process and analyze a huge amount of data with high performance, and R libraries such as Shiny exist for building a graphic interface. A performance comparison between both implementations was made, and no significant differences were found. In addition, our work provides an official real-time integrated data set on environmental data in Spain to any developer so that they can build their own applications.
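As a toy illustration of the harmonization step (not the authors' Java/Oracle or Hadoop code), the sketch below maps two hypothetical city feeds with different column names and units onto one common schema with pandas; all column names and the unit conversion are invented for the example.

```python
import pandas as pd

# Common target schema that every harmonized feed must satisfy.
SCHEMA = ["city", "timestamp", "no2_ugm3", "temp_c"]

# Invented samples standing in for two heterogeneous Open Data feeds.
madrid = pd.DataFrame({"FECHA": ["2016-05-01T10:00"], "NO2": [48.0], "TEMP": [21.5]})
bilbao = pd.DataFrame({"date": ["2016-05-01 10:00"], "no2_mgm3": [0.051], "t": [19.0]})

def harmonize_madrid(df):
    out = pd.DataFrame()
    out["city"] = ["Madrid"] * len(df)
    out["timestamp"] = pd.to_datetime(df["FECHA"])
    out["no2_ugm3"] = df["NO2"]
    out["temp_c"] = df["TEMP"]
    return out[SCHEMA]

def harmonize_bilbao(df):
    out = pd.DataFrame()
    out["city"] = ["Bilbao"] * len(df)
    out["timestamp"] = pd.to_datetime(df["date"])
    out["no2_ugm3"] = df["no2_mgm3"] * 1000.0   # mg/m3 -> ug/m3
    out["temp_c"] = df["t"]
    return out[SCHEMA]

integrated = pd.concat([harmonize_madrid(madrid), harmonize_bilbao(bilbao)],
                       ignore_index=True)
```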

Keywords: open data, R language, data integration, environmental data

Procedia PDF Downloads 299
13207 Stability Optimization of NaBH₄ via pH and H₂O:NaBH₄ Ratios for Large-Scale Hydrogen Production

Authors: Parth Mehta, Vedasri Bai Khavala, Prabhu Rajagopal, Tiju Thomas

Abstract:

There is an increasing need for alternative clean fuels, and hydrogen (H₂) has long been considered a promising solution with a high calorific value (142 MJ/kg). However, the storage of H₂ and the expensive processes for its generation have hindered its usage. Sodium borohydride (NaBH₄) can potentially be used as an economically viable means of H₂ storage. Thus far, there have been attempts to optimize the life (half-life) of NaBH₄ in aqueous media by stabilizing it with sodium hydroxide (NaOH) at various pH values. Other reports have shown that H₂ yield and reaction kinetics remained constant for all H₂O:NaBH₄ ratios > 30:1, without any acidic catalysts. Here we highlight the importance of pH and the H₂O:NaBH₄ ratio (80:1, 40:1, 20:1 and 10:1 by weight) for NaBH₄ stabilization (half-life reaction time at room temperature) and for minimizing corrosion of H₂ reactor components. It is interesting to observe that at any particular pH ≥ 10 (e.g., pH = 10, 11 and 12), the H₂O:NaBH₄ ratio does not have the expected linear dependence with stability. On the contrary, high stability was observed at the 10:1 H₂O:NaBH₄ ratio across all these pH values. When the H₂O:NaBH₄ ratio is increased from 10:1 to 20:1 and beyond (up to 80:1), constant stability (% degradation) is observed with respect to time. For practical usage (consumption within 6 hours of making the NaBH₄ solution), 15% degradation at pH 11 and an H₂O:NaBH₄ ratio of 10:1 is recommended. Increasing this ratio demands a higher NaOH concentration at the same pH, thus requiring a higher concentration or volume of acid (e.g., HCl) for H₂ generation. The reactions are done with tap water to render the results useful from an industrial standpoint. The observed stability regimes are rationalized based on the complexes associated with NaBH₄ when solvated in water, which depend sensitively on both pH and the H₂O:NaBH₄ ratio.

Keywords: hydrogen, sodium borohydride, stability optimization, H₂O:NaBH₄ ratio

Procedia PDF Downloads 103
13206 Team Teaching, Students' Perception, Challenges, and Remedies for Effective Implementation: A Case Study of the Department of Biology, Alvan Ikoku Federal College of Education, Owerri, Imo State, Nigeria

Authors: Daniel Ihemtuge Akim, Micheal O. Ikeanumba

Abstract:

This research focused on team teaching: students' perception, challenges, and remedies for effective implementation, in a case study of the Department of Biology, Alvan Ikoku Federal College of Education, Owerri, Imo State, Nigeria. It seeks to address students' misconceptions about the use of team teaching as a methodology for learning. Five purposes and five research questions guided the study, and a descriptive survey design was used. Students of the Department of Biology enrolled in both the Bachelor's degree and the National Certificate in Education programmes at Alvan Ikoku Federal College of Education, Owerri, formed the population. A simple random sampling technique was used to select the sampled students, and 20% of all lecturers were selected, giving a sample size of three hundred and forty (340). The instrument used for data collection was a structured 4-point Likert scale questionnaire, and the analysis was made using the mean method. The results revealed that poor time management by lecturers, lack of lecture venues, and shortage of manpower are some of the challenges hindering the effective implementation of team teaching. It was also observed that students perform better academically when the team teaching approach is used than with the single-teaching approach. Finally, the recommendations suggested that teachers involved in team teaching should coordinate their teaching strategies and work within the time frame to achieve the stated objectives.

Keywords: challenges, implementation, perception, team teaching

Procedia PDF Downloads 367
13205 An Analysis of Gamification in the Post-Secondary Classroom

Authors: F. Saccucci

Abstract:

Gamification has now started to take root in the post-secondary classroom. Educators have learned much about gamification to date, but there is still a great deal to learn. One definition of gamification is the ability to engage post-secondary students with games that are fun and correlate to classroom curriculum. There is no shortage of literature illustrating the advantages of gamification in the classroom. This study is an extension of similar thought, as well as of a previous study in which in-class testing, using a paired t-test, showed that gamification significantly improved students' understanding of subject material. Gamification in the classroom can range from high-end computer-simulated software to paper-based games, both of which have advantages and disadvantages. This analysis used a paper-based game to highlight certain qualitative advantages of gamification. The paper-based game in this analysis was inexpensive, required little preparation time from the faculty member, and consumed approximately 20 minutes of classroom time. Data for the study were collected through in-class student feedback surveys and narrative from the faculty member moderating the game. Students were randomly selected into groups of four. The qualitative advantages identified in this analysis included: 1. Students had a chance to meet, connect with, and get to know other students. 2. Students enjoyed the gamification process, given its sense of fun and competition. 3. The post-assessment that followed the simulation game was not part of their grade calculation; it was therefore an opportunity to participate in a low-risk activity whereby students could subsequently self-assess their understanding of the subject material. 4. In the view of the students, content knowledge did increase after the gamification process. These qualitative advantages contribute to the argument that gamification should be attempted in today's post-secondary classroom. The analysis also highlighted that eighty (80) percent of the respondents believed twenty minutes devoted to the gamification process was appropriate; however, twenty (20) percent of respondents believed that, rather than scheduling a gamification process and its post-quiz in the last week, a review for the final exam might have been more useful. A follow-up study hopes to determine whether the scheduling of the gamification had any correlation with the percentage of students not wanting to be engaged in the process. It also hopes to determine at what incremental level of time invested in classroom gamification no material incremental benefit accrues to the student, and whether any correlation exists between respondents preferring not to have it at the end of the semester and students not believing the gamification process added to their curricular knowledge.
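For reference, the paired comparison mentioned above can be run in a few lines; the sketch below uses scipy's ttest_rel on invented pre/post scores for the same students, since the study's raw data are not shown here.

```python
from scipy import stats

# Hypothetical pre/post assessment scores for the same ten students.
pre  = [52, 61, 47, 70, 58, 66, 49, 73, 55, 63]
post = [60, 66, 55, 74, 65, 71, 54, 80, 61, 70]

# Paired t-test: each student serves as their own control.
t_stat, p_value = stats.ttest_rel(post, pre)
if p_value < 0.05:
    print(f"significant improvement: t = {t_stat:.2f}, p = {p_value:.4f}")
```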

Keywords: gamification, inexpensive, non-quantitative advantages, post-secondary

Procedia PDF Downloads 196
13204 Detection of Acrylamide Using Liquid Chromatography-Tandem Mass Spectrometry and Quantitative Risk Assessment in Selected Food from Saudi Market

Authors: Sarah A. Alotaibi, Mohammed A. Almutairi, Abdullah A. Alsayari, Adibah M. Almutairi, Somaiah K. Almubayedh

Abstract:

Concerns over the presence of acrylamide in food date back to 2002, when Swedish scientists reported that acrylamide forms in carbohydrate-rich foods cooked at high temperatures. Similar findings were reported by other researchers, which consequently prompted major international efforts to investigate dietary exposure and the subsequent health complications in order to properly manage this issue. In this work, we therefore aim to determine the acrylamide level in different foods (coffee, potato chips, biscuits, and baby food) commonly consumed by the Saudi population. In a total of forty-three samples, acrylamide was detected in twenty-three samples at levels of 12.3 to 2850 µg/kg. Among the food groups, the highest concentration of acrylamide was found in coffee samples (<12.3-2850 μg/kg), followed by potato chips (655-1310 μg/kg), then biscuits (23.5-449 μg/kg), whereas the lowest acrylamide level was observed in baby food (<14.75-126 μg/kg). Most coffee, biscuit, and potato chip products contained high amounts of acrylamide and were also the most commonly consumed products. Saudi adults had mean acrylamide exposures from coffee, potato chips, biscuits, and cereal of 0.07439, 0.04794, 0.01125, and 0.003371 µg/kg-b.w/day, respectively. The corresponding exposures of Saudi infants and children to the same types of food were 0.1701, 0.1096, 0.02572, and 0.00771 µg/kg-b.w/day, respectively. Most groups have a percentile that exceeds the tolerable daily intake (TDI) cancer value (2.6 µg/kg-b.w/day). Overall, the MOE results show that the Saudi population is at high risk of acrylamide-related disease for all food types, and there is a chance of cancer risk in all age groups (all values ˂10,000). Furthermore, for non-cancer risks, the acrylamide in all tested foods was within the safe limit (˃125), except for potato chips, for which there is a risk of disease in the population. With potato and coffee as raw materials, additional studies were conducted to assess different factors affecting acrylamide formation in fried potato and roasted coffee, including temperature, cooking time, and additives. By systematically varying processing temperatures and times, a mitigation of acrylamide content was achieved by lowering the temperature and decreasing the cooking time. Furthermore, it was shown that the combined addition of chitosan and NaCl had a large impact on acrylamide formation.
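As a sketch of the exposure and margin-of-exposure arithmetic behind these figures, the snippet below applies the standard definitions (exposure = concentration × intake / body weight; MOE = BMDL10 / exposure). The BMDL10 of 170 µg/kg-b.w/day for neoplastic effects is the commonly cited EFSA benchmark, and the intake and body-weight figures are invented for illustration, not taken from the study.

```python
# Commonly cited EFSA benchmark dose for neoplastic effects (assumption here).
BMDL10 = 170.0  # ug/kg-b.w/day

def daily_exposure(conc_ugkg, intake_g_per_day, body_wt_kg):
    """Dietary exposure in ug/kg-b.w/day from a food concentration."""
    return conc_ugkg * (intake_g_per_day / 1000.0) / body_wt_kg

# Invented intake: 1.8 g/day of ground coffee at the highest measured level.
exp_coffee = daily_exposure(conc_ugkg=2850.0, intake_g_per_day=1.8, body_wt_kg=70.0)
moe_coffee = BMDL10 / exp_coffee
print(f"exposure = {exp_coffee:.4f} ug/kg-b.w/day, MOE = {moe_coffee:.0f}",
      "(concern)" if moe_coffee < 10000 else "(low concern)")
```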

Keywords: risk assessment, dietary exposure, MOE, acrylamide, hazard

Procedia PDF Downloads 39
13203 Risk Factors for Post-Induction Hypotension Among Elderly Patients Undergoing Elective Non-Cardiac Surgery Under General Anesthesia

Authors: Karuna Sutthibenjakul, Sunisa Chatmongkolchart

Abstract:

Background: Post-induction hypotension is common and occurs more often in elderly patients. We aimed to determine risk factors for hypotension after induction among elderly patients (aged 65 years and older) who underwent elective non-cardiac surgery under general anesthesia. Methods: This cohort study analyzed data from 580 patients collected between December 2017 and July 2018 at a tertiary university hospital in southern Thailand. Hypotension was defined as a decrease in mean arterial pressure of more than 30% from baseline within 20 minutes after induction, or the use of a vasopressor agent to treat low blood pressure. The intraoperative parameters were blood pressure and heart rate at T0, TEI, T5, T10, T15 and T20 (immediately after arrival in the operating room, immediately after intubation, and 5, 10, 15 and 20 minutes after intubation, respectively). Results: The median age was 72.5 (68, 78) years. The prevalence of post-induction hypotension was 64.8%, with the highest prevalence (39.7%) at 15 minutes after intubation. Post-induction hypotension was associated with diuretic use as preoperative medication (P-value=0.016), hematocrit level (P-value=0.031), and the degree of hypertension immediately after arrival in the operating room (P-value<0.001). An increasing fentanyl dose during induction was associated with hypotension at the time of intubation (P-value<0.01) and at 5 minutes after intubation (P-value<0.001). There was no statistically significant association with increasing propofol dose. Conclusion: The degree of hypertension immediately after arrival in the operating room and an increasing fentanyl dose were significant risk factors for post-induction hypotension in elderly patients.
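The study's working definition translates into a simple check, sketched below on invented blood pressure readings: hypotension is flagged when any post-induction mean arterial pressure (MAP) falls more than 30% below the baseline value.

```python
def map_value(sbp, dbp):
    """Mean arterial pressure from systolic/diastolic pressure."""
    return (sbp + 2 * dbp) / 3.0

def hypotension(baseline_map, serial_maps, threshold=0.30):
    """True if any serial MAP drops more than `threshold` below baseline."""
    return any(m < (1 - threshold) * baseline_map for m in serial_maps)

# Invented readings: baseline at T0, then four post-induction measurements.
baseline = map_value(160, 85)
serial = [map_value(*bp) for bp in [(120, 70), (95, 55), (90, 52), (100, 60)]]
print(hypotension(baseline, serial))   # True: at least one point drops > 30%
```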

Keywords: risk factors, post-induction, hypotension, elderly

Procedia PDF Downloads 120
13202 A Surgical Correction and Innovative Splint for Swan Neck Deformity in Hypermobility Syndrome

Authors: Deepak Ganjiwale, Karthik Vishwanathan

Abstract:

Objective: Splinting is an important domain of the occupational therapy profession. Making a splint for a patient depends upon the needs arising from the patient's problems and deformities. Swan neck deformity is not very common in the finger; it may occur after various diseases. Conservative treatment of swan neck deformity is available using different static splints only. There are very few reports of surgical correction of swan neck deformity in benign hypermobility syndrome. Method: This case report describes the result of surgical intervention and a hand splint in a twenty-year-old lady with a past history of cardiovascular stroke with no residual neurological deficit. She presented with a correctable swan neck deformity and had failed to improve with static ring splints intended to correct the deformity. She was noted to have hyperlaxity (Ehlers-Danlos type), with a modified Beighton score of 5/9. She underwent volar plate plication of the proximal interphalangeal joint of the left ring finger along with hemitenodesis of the ulnar slip of the flexor digitorum superficialis (FDS) tendon, whereby the ulnar slip of FDS was passed through a small surgically created rent in the A2 pulley and sutured back to itself. Result: Postoperatively, the patient was referred to occupational therapy for splinting, with the instruction that the splint should function at times as a static splint and at times as a dynamic one, for positioning and correction of the finger. Conclusion: After the occupational therapy intervention and splinting, the patient had full correction of the swan neck deformity with near-full flexion of the operated finger and is able to work independently.

Keywords: swan neck, finger, deformity, splint, hypermobility

Procedia PDF Downloads 242
13201 Chaotic Sequence Noise Reduction and Chaotic Recognition Rate Improvement Based on Improved Local Geometric Projection

Authors: Rubin Dan, Xingcai Wang, Ziyang Chen

Abstract:

A chaotic time series noise reduction method based on the fusion of the local projection method, wavelet transform, and the particle swarm algorithm (referred to as the LW-PSO method) is proposed to address the problem of false recognition caused by noise when identifying chaotic time series. The method first uses phase space reconstruction to recover the original dynamical system characteristics and removes the noise subspace by selecting the neighborhood radius; it then uses the wavelet transform to remove the D1-D3 high-frequency components so as to maximize the retention of signal information, while least-squares optimization is performed by the particle swarm algorithm. The Lorenz system containing 30% Gaussian white noise is simulated for verification, and the phase space, SNR value, RMSE value, and the K value of the 0-1 test before and after noise reduction are compared and analyzed for the Schreiber method, the local projection method, the wavelet transform method, and the LW-PSO method, which shows that the LW-PSO method has a better noise reduction effect than the other three common methods. The method is also applied to a classical system to evaluate the noise reduction effect of the four methods and the identification of the original system, which further verifies the superiority of the LW-PSO method. Finally, it is applied to the Chengdu rainfall chaotic sequence, and the results prove that the LW-PSO method can effectively reduce the noise and improve the chaos recognition rate.
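The wavelet step described above — discard the three finest detail bands and keep the approximation — is easy to reproduce; the sketch below uses PyWavelets on a synthetic noisy signal (the db4 wavelet and the three-level decomposition are our assumptions, as the abstract does not name the mother wavelet).

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=3):
    # Decompose: coeffs = [cA3, cD3, cD2, cD1]
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Zero the D1-D3 detail bands, keeping only the approximation cA3
    coeffs[1:] = [np.zeros_like(d) for d in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(signal)]

# Synthetic stand-in for a noisy chaotic record (not the Lorenz simulation)
t = np.linspace(0, 10, 2048)
clean = np.sin(2.3 * t) + 0.5 * np.sin(0.7 * t)
noisy = clean + 0.3 * np.random.randn(t.size)
denoised = wavelet_denoise(noisy)
```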

Keywords: Schreiber noise reduction, wavelet transform, particle swarm optimization, 0-1 test method, chaotic sequence denoising

Procedia PDF Downloads 181
13200 Environmental Potential of Biochar from Wood Biomass Thermochemical Conversion

Authors: Cora Bulmău

Abstract:

Soil polluted by hydrocarbon spills is a major global concern today. In response to this issue, our experimental study puts in evidence an environmentally friendly method, the use of biochar, against a classical procedure, the incineration of contaminated soil. Biochar is the solid product obtained through the pyrolysis of biomass, with an additional use as an additive intended to improve the quality of the soil. The positive effect of adding biochar to soil lies in its capacity to adsorb and contain petroleum products within its pores. Taking into consideration the capacity of biochar to interact with organic contaminants, the purpose of the present study was to experimentally establish the effects of adding wood biomass-derived biochar to a soil contaminated with oil. The contaminated soil was amended with biochar (10%) produced by pyrolysis under different operational conditions of the thermochemical process. After 25 days, the concentration of petroleum hydrocarbons in the soil treated with biochar was measured. Soxhlet extraction was adopted as the analytical method to estimate the concentrations of total petroleum products (TPH) in the soil samples; this technique was applied to the contaminated soil as well as to soils remediated by incineration or by adding biochar. The treatment of soil using biochar obtained from the pyrolysis of birch wood led to a considerable decrease in the concentrations of petroleum products. The incineration treatments conducted in the experimental stage to clean up the same soil involved specific parameters: temperatures of about 600°C, 800°C and 1000°C, and treatment times of 30 and 60 minutes. The experimental results revealed that the biochar method registered efficiencies approaching those of the incineration processes applied for the shortest time.

Keywords: biochar, biomass, remediation, soil, TPH

Procedia PDF Downloads 217
13199 An Extensible Software Infrastructure for Computer Aided Custom Monitoring of Patients in Smart Homes

Authors: Ritwik Dutta, Marylin Wolf

Abstract:

This paper describes the trade-offs and the from-scratch design of a self-contained, easy-to-use health dashboard software system that provides customizable data tracking for patients in smart homes. The system is made up of different software modules and comprises a front-end and a back-end component. Built with HTML, CSS, and JavaScript, the front-end allows adding users, logging into the system, selecting metrics, and specifying health goals. The back-end consists of a NoSQL Mongo database, a Python script, and a SimpleHTTPServer written in Python. The database stores user profiles and health data in JSON format. The Python script uses the PyMongo driver library to query the database and displays formatted data as a daily snapshot of user health metrics against target goals. Any number of standard and custom metrics can be added to the system, and the corresponding health data can be fed in automatically via sensor APIs, or manually as text or picture data files. A real-time METAR request API permits correlating weather data with patient health, and an advanced query system is implemented to allow trend analysis of selected health metrics over custom time intervals. Available on GitHub, the project is free to use for academic purposes of learning and experimenting, or for practical purposes by building on it.
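The daily-snapshot query can be pictured as a small PyMongo routine. The sketch below is a guess at the shape of such a query, with an invented database name, collection layout, and field names rather than the project's actual schema.

```python
from datetime import datetime
from pymongo import MongoClient

# Hypothetical connection and schema; all names here are illustrative.
client = MongoClient("mongodb://localhost:27017")
db = client["smart_home"]

def daily_snapshot(user_id, day=None):
    """Return each tracked metric for one day against its target goal."""
    day = day or datetime.utcnow().date().isoformat()
    goals = db.users.find_one({"_id": user_id}, {"goals": 1})["goals"]
    snapshot = {}
    for r in db.readings.find({"user": user_id, "date": day}):
        metric = r["metric"]
        snapshot[metric] = {"value": r["value"], "goal": goals.get(metric)}
    return snapshot
```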

Keywords: flask, Java, JavaScript, health monitoring, long-term care, Mongo, Python, smart home, software engineering, webserver

Procedia PDF Downloads 374
13198 Development and Validation of High-Performance Liquid Chromatography Method for the Determination and Pharmacokinetic Study of Linagliptin in Rat Plasma

Authors: Hoda Mahgoub, Abeer Hanafy

Abstract:

Linagliptin (LNG) belongs to the dipeptidyl peptidase-4 (DPP-4) inhibitor class. DPP-4 inhibitors represent a new therapeutic approach for the treatment of type 2 diabetes in adults. The aim of this work was to develop and validate an accurate and reproducible HPLC method for the determination of LNG with high sensitivity in rat plasma. The method involved separation of both LNG and pindolol (internal standard) at ambient temperature on a Zorbax Eclipse XDB C18 column, with a mobile phase composed of 75% methanol and 25% 0.1% formic acid (pH 4.1) at a flow rate of 1.0 mL.min-1. UV detection was performed at 254 nm. The method was validated in compliance with ICH guidelines and found to be linear in the range of 5-1000 ng.mL-1. The limit of quantification (LOQ) was found to be 5 ng.mL-1 based on 100 µL of plasma. The variations for intra- and inter-assay precision were less than 10%, and the accuracy values ranged between 93.3% and 102.5%. The extraction recovery (R%) was more than 83%. The method involved a single extraction step on a very small plasma volume (100 µL). The assay was successfully applied to an in-vivo pharmacokinetic study of LNG in rats administered a single oral dose of 10 mg.kg-1 LNG. The maximum concentration (Cmax) was found to be 927.5 ± 23.9 ng.mL-1. The area under the plasma concentration-time curve (AUC0-72) was 18285.02 ± 605.76 h.ng.mL-1. In conclusion, the good accuracy and low LOQ of the bioanalytical HPLC method make it suitable for monitoring the full pharmacokinetic profile of LNG in rats. The main advantages of the method are its sensitivity, the small sample volume, the single-step extraction procedure, and the short analysis time.
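Cmax is read directly from the concentration-time profile, and AUC is typically computed with the linear trapezoidal rule; the sketch below shows that arithmetic on an invented profile shaped to the reported scale (the sampling times and concentrations are not the study's data).

```python
import numpy as np

def cmax_tmax_auc(t, c):
    """Non-compartmental estimates from a concentration-time profile.
    AUC by the linear trapezoidal rule: sum of 0.5*(c_i + c_{i+1})*(t_{i+1} - t_i)."""
    t, c = np.asarray(t, float), np.asarray(c, float)
    i = int(np.argmax(c))
    auc = float(np.sum(0.5 * (c[1:] + c[:-1]) * np.diff(t)))
    return c[i], t[i], auc

# Hypothetical profile (h, ng/mL) shaped to the reported scale, not real data
t = [0, 0.5, 1, 2, 4, 8, 12, 24, 48, 72]
c = [0, 400, 780, 927.5, 820, 600, 430, 260, 110, 40]
cmax, tmax, auc = cmax_tmax_auc(t, c)
print(f"Cmax = {cmax} ng/mL at t = {tmax} h, AUC0-72 = {auc:.1f} h.ng/mL")
```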

Keywords: HPLC, linagliptin, pharmacokinetic study, rat plasma

Procedia PDF Downloads 232
13197 A Programming Assessment Software Artefact Enhanced with the Help of Learners

Authors: Romeo A. Botes, Imelda Smit

Abstract:

The demands of an ever-changing and complex higher education environment, along with the profile of modern learners, challenge current approaches to assessment and feedback. More learners enter the education system every year. The younger generation expects immediate feedback; at the same time, feedback should be meaningful. The assessment of practical activities in programming poses a particular problem, since both lecturers and learners in the information and computer science discipline acknowledge that paper-based assessment for programming subjects lacks meaningful real-life testing, while feedback lacks promptness, consistency, comprehensiveness and individualisation. Most of these aspects may be addressed by modern, technology-assisted assessment. The focus of this paper is the continuous development of an artefact used to assist the lecturer in the assessment of, and feedback on, practical programming activities in a senior database programming class. The artefact was developed using three Design Science Research cycles. The first implementation allowed one programming activity submission per assessment intervention. This pilot provided valuable insight into the obstacles to implementing this type of assessment tool. A second implementation improved the initial version to allow multiple programming activity submissions per assessment. The focus of this version is on providing scaffolded feedback to the learner, allowing improvement with each subsequent submission. It also has a built-in capability to provide the lecturer with information regarding the key problem areas of each assessment intervention.

Keywords: programming, computer-aided assessment, technology-assisted assessment, programming assessment software, design science research, mixed-method

Procedia PDF Downloads 284
13196 Uncovering Hidden Bugs: An Exploratory Approach

Authors: Sagar Jitendra Mahendrakar

Abstract:

Exploratory testing is a dynamic and adaptable method of software quality assurance that is frequently praised for its ability to find hidden flaws and improve the overall quality of the product. In contrast to scripted testing methodologies, which use preset test cases, exploratory testing allows testers to explore the software application dynamically, relying primarily on tester intuition, creativity, and adaptability. There are several tools and techniques that can aid testers in the exploratory testing process, which we will discuss in this talk. Tests of this kind are able to find bugs that are harder to find during structured testing or that other testing methods may have overlooked. The purpose of this abstract is to examine the nature and importance of exploratory testing in modern software development methods. It explores the fundamental ideas of exploratory testing, highlighting the value of domain knowledge and tester experience in spotting possible problems that may escape the notice of traditional testing methodologies. Throughout the software development lifecycle, exploratory testing promotes quick feedback loops and continuous improvement by giving testers the ability to make decisions in real time based on their observations. This abstract also clarifies the unique features of exploratory testing, such as its non-linearity and its capacity to replicate user behavior in real-world settings. Through impromptu exploration, testers can find intricate bugs, usability problems, and edge cases in software that might otherwise go undetected. Exploratory testing's flexible and iterative structure fits in well with agile and DevOps processes, allowing for a quicker time to market without sacrificing the quality of the final product.

Keywords: exploratory, testing, automation, quality

Procedia PDF Downloads 22
13195 Heuristic Approaches for Injury Reductions by Reduced Car Use in Urban Areas

Authors: Stig H. Jørgensen, Trond Nordfjærn, Øyvind Teige Hedenstrøm, Torbjørn Rundmo

Abstract:

The aim of the paper is to estimate and forecast road traffic injuries in the coming 10-15 years, given new targets in urban transport policy and shifts in mode of transport, including the injury cross-effects of mode changes. The paper discusses possibilities and limitations in measuring and quantifying possible injury reductions. Injury data (killed and seriously injured road users) from six urban areas in Norway from 1998-2012 (N = 4709 casualties) form the basis for estimates of changing injury patterns. For the coming period, calculations of injury numbers and injury rates are made by type of road user (motorized versus non-motorized categories), sex, age, and type of road. A forecast increase of 25% in the total population of the six urban areas by 2025 will curb the continuing fall in injury figures. However, policy strategies and measures geared towards a stronger modal shift from private vehicles to safer public transport (bus, train) will modify this effect. On the other hand, door-to-door transport (pedestrians on their way to/from public transport nodes) implies higher exposure for pedestrians (and cyclists) converting from private vehicle use, including fall accidents not registered as traffic accidents. The overall effect is the sum of these modal shifts in the growing urban population; in addition, the diminishing returns of the majority of road safety countermeasures have to be taken into account. The paper demonstrates how uncertainties in the various estimates (prediction factors) of increasing and decreasing injury figures may partly offset each other. The paper discusses the road safety policy and welfare consequences of the transport mode shift, including reduced use of private vehicles, and the further environmental impacts. In this regard, safety and environmental issues will as a rule concur; however, pursuing environmental goals (e.g., improved air quality, reduced CO₂ emissions) by encouraging more biking may generate more biking injuries. The study was financed by grants from the Norwegian Research Council's Transport Safety Program.
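The core of such a forecast is simple scaling arithmetic: baseline casualties are grown with population and adjusted for the exposure and relative risk of each mode shift. The sketch below illustrates that bookkeeping with invented shares and risk factors; none of the numbers are the study's estimates.

```python
# Stylized forecast: baseline KSI (killed/seriously injured) scaled by
# population growth and mode-shift effects. All figures are illustrative.
baseline_ksi = 300.0          # casualties per year across the six areas
pop_growth = 1.25             # forecast population factor by 2025

# Share of car trips shifted to each mode and its injury risk relative to car.
shift = {"public_transport": (0.10, 0.4),   # (share shifted, relative risk)
         "walk_bike_legs":   (0.10, 1.6)}   # door-to-door legs, higher exposure

car_share_effect = 1.0 - sum(s for s, _ in shift.values())
mode_effect = car_share_effect + sum(s * rr for s, rr in shift.values())
forecast_ksi = baseline_ksi * pop_growth * mode_effect
print(f"forecast KSI: {forecast_ksi:.0f} per year")
```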

Keywords: road injuries, forecasting, reduced private care use, urban, Norway

Procedia PDF Downloads 223
13194 Influence of Low and Extreme Heat Fluxes on Thermal Degradation of Carbon Fibre-Reinforced Polymers

Authors: Johannes Bibinger, Sebastian Eibl, Hans-Joachim Gudladt

Abstract:

This study considers the influence of different irradiation scenarios on the thermal degradation of carbon fiber-reinforced polymers (CFRP). Real threats are simulated, such as fires with long-lasting low heat fluxes and nuclear heat flashes with short-lasting high heat fluxes. For this purpose, coated and uncoated quasi-isotropic samples of the commercially available CFRP HexPly® 8552/IM7 are thermally irradiated from one side by a cone calorimeter and a xenon short-arc lamp, with heat fluxes between 5 and 175 W/cm² at varying time intervals. The specimen temperature is recorded on the front and back sides as well as at different laminate depths. The CFRP is tested non-destructively with ultrasonic testing, infrared spectroscopy (ATR-FTIR), scanning electron microscopy (SEM), and micro-focused X-ray computed tomography (µCT). Destructive tests are performed to evaluate the mechanical properties in terms of interlaminar shear strength (ILSS), compressive strength, and tensile strength. The irradiation scenarios vary significantly in heat flux and exposure time; thus, different heating rates, radiation effects, and temperature distributions occur. This leads to dissimilar decomposition processes, which affect the sensitivity of each strength type and the damage behaviour of the specimens. With the use of surface coatings, however, the thermal degradation of composite materials can be delayed.

Keywords: CFRP, one-sided thermal damage, high heat flux, heating rate, non-destructive and destructive testing

Procedia PDF Downloads 96
13193 The Mask of Motherhood: A Changing Identity during the Transition to Motherhood

Authors: Geraldine Mc Loughlin, Mary Horgan, Rosaleen Murphy

Abstract:

Childbirth is a life-changing event, a psychological transition for the mother that must be viewed in a social context. Much has been written and documented regarding the preparation for birth and the immediate postnatal period, but the full psychological impact on the mother is not clear. One aspect of the transition process is identity. Depending on a person's worldview, the concept of identity is viewed differently; the nature of reality and how people construct knowledge influence these perspectives. Becoming a mother is not just an event but a process that time and experience help to shape for the woman. The aims were to explore the emotional and psychological aspects of first-time mothers' experiences during the transition to new motherhood, and to identify factors affecting women's identities in the period from 36 weeks' gestation to 12 weeks postpartum. Interpretative Phenomenological Analysis (IPA) was used; it explores how these women make sense of and give meaning to their experiences. IPA is underpinned by three key principles: phenomenology, hermeneutics and idiography. A purposeful sample of 10 women was recruited for this longitudinal study, to enable data to be collected during the transition to motherhood. Individual identity was interpreted and viewed as developing in response to changing contexts, such as the birth event and becoming a parent, enabling one to construct one's own sense of a meaningful life. The women effectively differentiated their personal from their social identities and took responsibility for their actions. Identity is culturally and socially shaped and experienced, though not experienced similarly by all women. The individualized perspective on identity recognizes (a) that social influences are seen as external to the individual and (b) the view that social influences are, in fact, internalized by the individual.

Keywords: motherhood, transition, identity, IPA

Procedia PDF Downloads 44
13192 Applying (1, T) Ordering Policy in a Multi-Vendor-Single-Buyer Inventory System with Lost Sales and Poisson Demand

Authors: Adel Nikfarjam, Hamed Tayebi, Sadoullah Ebrahimnejad

Abstract:

This paper considers a two-echelon inventory system with a number of warehouses and a single retailer. The retailer replenishes its required items from the warehouses and assembles them into a single final product. We assume that each warehouse supplies only one kind of raw material to the retailer. The demand process for the final product is assumed to be Poisson, and unsatisfied demand for the final product is lost. The retailer applies a one-for-one-period ordering policy, also known as the (1, T) ordering policy. In this policy, the retailer orders from each warehouse a fixed quantity of each item at fixed time intervals, where the fixed quantity equals the usage of the item in the final product. Since this policy eliminates all demand uncertainty at the upstream echelon, the standard lot sizing model can be applied at all warehouses. In this paper, we calculate the total cost function of the inventory system. Then, based on this function, we present a procedure to obtain the optimal time interval between two consecutive order placements from the retailer to the warehouses, and the optimal order quantities of the warehouses (assuming positive ordering costs at the warehouses). Finally, we present some numerical examples and conduct a numerical sensitivity analysis of the cost parameters.
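To convey the flavor of the optimization (not the paper's multi-warehouse cost function), the sketch below grid-searches the review interval T for a single item under a stylized cost rate with Poisson demand, lost sales, and a fixed ordering cost; the cost structure and every parameter value are our assumptions.

```python
import numpy as np
from scipy.stats import poisson

# Stylized single-item illustration: fixed order cost K per cycle, unit
# lost-sale penalty p, rough holding rate h, Poisson demand rate lam.
K, h, p, lam = 50.0, 1.0, 20.0, 4.0

def cost_rate(T):
    S = int(np.ceil(lam * T))                  # stock made available per cycle
    d = np.arange(0, S + 200)                  # truncated demand support
    pmf = poisson.pmf(d, lam * T)
    lost = np.sum(np.maximum(d - S, 0) * pmf)  # expected lost sales per cycle
    left = np.sum(np.maximum(S - d, 0) * pmf)  # expected leftover per cycle
    hold = h * left * T / 2                    # crude time-averaged holding cost
    return (K + p * lost + hold) / T           # cost per unit time

grid = np.linspace(0.25, 10.0, 40)
T_opt = grid[int(np.argmin([cost_rate(T) for T in grid]))]
print(f"approximate optimal T: {T_opt:.2f}")
```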

Keywords: two-echelon supply chain, multi-vendor-single-buyer inventory system, lost sales, Poisson demand, one-for-one-period policy, lot sizing model

Procedia PDF Downloads 296
13191 Quantum Engine Proposal using Two-level Atom Like Manipulation and Relativistic Motoring Control

Authors: Montree Bunruangses, Sonath Bhattacharyya, Somchat Sonasang, Preecha Yupapin

Abstract:

A two-level system is manipulated by a microstrip add-drop circuit configured as an atom-like system for investigating wave-particle behavior when its traveling speed along the circuit perimeter is the speed of light. The entangled pair formed by the upper and lower sideband peaks is bound by the angular displacement, which is given by 0≤θ≤π/2. The control signals associated with the 3-peak signal frequencies are applied as external inputs via the microstrip add-drop multiplexer ports, where they are functions of time with no space term involved. When the system satisfies the speed-of-light condition, the mass term changes to energy, based on the relativistic limit described by the Lorentz factor and the Einstein equation. The different applied frequencies can be utilized to form 3-phase torques that can be applied to quantum engines. The experiment will use the two-level system circuit and will be conducted in the laboratory. The 3-phase torques will be recorded and investigated for quantum engine driving purposes, and the obtained results will be compared to the simulation. The optimum amplification of torque can be obtained by a resonant successive filtering operation. Torque vanishes when the system is balanced at the stopped position, where |Time| = 0, which is required as a system stability condition; this will be discussed for future applications. A larger device may be tested in the future for realistic use. Synchronous and asynchronous driven motors are also discussed for warp drive use.
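For reference, the relativistic limit invoked here rests on the standard Lorentz factor and mass-energy relation (textbook background, not a result of the paper):

```latex
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}, \qquad E = \gamma\, m c^{2}
```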

Keywords: quantum engine, relativistic motor, 3-phase torque, atomic engine

Procedia PDF Downloads 45
13190 Iron Catalyst for Decomposition of Methane: Influence of Al/Si Ratio Support

Authors: A. S. Al-Fatesh, A. A. Ibrahim, A. M. AlSharekh, F. S. Alqahtani, S. O. Kasim, A. H. Fakeeha

Abstract:

Hydrogen is expected to be the fuel of the future, since it produces energy without any pollution. It can be used as a fuel directly or through a fuel cell, and it is also used in the chemical and petrochemical industries as a reducing agent or in hydrogenation processes. It is produced by different methods such as hydrocarbon reforming, electrolysis, and methane decomposition. The objective of the present paper is to study the methane decomposition reaction at 700°C and 800°C. The catalysts were prepared via the impregnation method using 20% Fe and different proportions of combined alumina and silica support, with the following Al₂O₃/SiO₂ ratios: 100%, 90%, 80%, and 0%. The prepared catalysts were calcined and activated at 600°C and 500°C, respectively. The reaction was carried out in a fixed bed reactor at atmospheric pressure, using 0.3 g of catalyst, a CH₄/N₂ feed gas ratio of 1.5/1, and a total flow rate of 25 mL/min. Catalyst characterizations (TPR, TGA, BET, XRD, etc.) were employed to study the behavior of the catalysts before and after the reaction. Moreover, a brief description of the weight loss and the CH₄ conversion versus time on stream for the different support ratios over the 20%Fe/Al₂O₃/SiO₂ catalysts has been added as well. The TGA results showed higher weight losses for catalysts operated at 700°C than at 800°C. For 90% Al₂O₃/SiO₂, the activity decreases with time on stream at a reaction temperature of 800°C, from an initial CH₄ conversion of 73.9% to 46.3% over a period of 300 min, whereas the activity of the same catalyst increases from 47.1% to 64.8% when a reaction temperature of 700°C is employed. Likewise, for 80% Al₂O₃/SiO₂ the activity trend is similar to that of 90% Al₂O₃/SiO₂, but with a different rate of activity variation. It can be inferred from the activity results that the ratio of Al₂O₃ to SiO₂ is crucial and is directly proportional to the activity: whenever the Al/Si ratio decreases, the activity declines. Indeed, the CH₄ conversion of the 100% SiO₂ support was less than 5%.

Keywords: Al₂O₃, SiO₂, CH₄ decomposition, hydrogen, iron

Procedia PDF Downloads 168
13189 Virtual Team Performance: A Transactive Memory System Perspective

Authors: Belbaly Nassim

Abstract:

Virtual team (VT) initiatives, in which teams are geographically dispersed and communicate via modern computer-driven technologies, have attracted increasing attention from researchers and professionals. The growing need to examine how to balance and optimize VTs is particularly important given the globalization and decentralization pressures companies face in monitoring VT performance. Organizations are regularly limited by misalignment between the behavioral capabilities of the team's dispersed competences and knowledge capabilities, and by how trust issues interplay with and influence these VT dimensions and the effects of such exchanges. In fact, the future success of businesses depends on the extent to which VTs manage their dispersed expertise, skills, and knowledge efficiently to stimulate VT creativity. A transactive memory system (TMS) may enhance VT creativity through its three dimensions: knowledge specialization, credibility, and knowledge coordination. A TMS can be understood as the composition of a structural component consisting of individual knowledge and a set of communication processes among individuals. Individual knowledge is shared as it is retrieved and applied, and the learning is coordinated. TMS is driven by the central idea that the system is built on the distinction between internal and external memory encoding. A VT learns something new and catalogs it in memory for future retrieval and use. TMS draws on the role of information technology to explain VT behaviors by offering VT members the possibility to encode, store, and retrieve information. TMS considers the members of a team as a processing system in which the location of expertise both enhances knowledge coordination and builds trust among members over time. We build on the TMS dimensions to hypothesize the effects of specialization, coordination, and credibility on VT creativity. In fact, VTs consist of dispersed expertise, skills, and knowledge that can positively enhance coordination and collaboration. Ultimately, this team composition may lead to recognition of both who has expertise and where that expertise is located; over time, it may also build trust among VT members, developing the ability to coordinate their knowledge, which can stimulate creativity. We also assess the reciprocal relationship between the TMS dimensions and VT creativity. We wish to use TMS to provide researchers with a theoretically driven model that is empirically validated through survey evidence. We propose that TMS provides a new way to enhance and balance VT creativity. This study also gives researchers insight into the use of TMS to positively influence VT creativity. In addition to our research contributions, we provide several managerial insights into how TMS components can be used to increase performance within dispersed VTs.

Keywords: virtual team creativity, transactive memory systems, specialization, credibility, coordination

Procedia PDF Downloads 149
13188 Oxidation Assessment of Mayonnaise with Headspace Single-Drop Microextraction (HS-SDME) Coupled with Gas Chromatography-Mass Spectrometry (GC-MS) during Shelf-Life

Authors: Kooshan Nayebzadeh, Maryam Enteshari, Abdorreza Mohammadi

Abstract:

The oxidative stability of mayonnaise under different storage temperatures (4 and 25˚C) during a 6-month shelf-life was investigated by different analytical methods. In this study, headspace single-drop microextraction (HS-SDME) combined with gas chromatography-mass spectrometry (GC-MS), a green, sensitive and rapid technique, was applied to evaluate the oxidative state of mayonnaise. Oxidative changes in the oil extracted from the mayonnaise were monitored by analytical parameters including peroxide value (PV), p-anisidine value (p-An V), thiobarbituric acid value (TBA), and oxidative stability index (OSI). Hexanal and heptanal, as secondary volatile oxidation compounds, were determined by the HS-SDME/GC-MS method in the mayonnaise matrix. The rate of oxidation in the mayonnaises increased during storage and was greater at 25˚C. The p-anisidine and TBA values gradually increased during the 6 months, while the OSI decreased. At both temperatures, the hexanal content was higher than the heptanal content throughout storage. Significant increases in hexanal and heptanal concentrations were also observed in the second and sixth months of storage, and hexanal concentrations were highest in mayonnaises stored at 25˚C. It can be concluded that storage temperature and duration are decisive parameters affecting the quality and oxidative stability of mayonnaise. Additionally, the hexanal content is a more reliable oxidative indicator than heptanal, and HS-SDME/GC-MS can be applied in a quick and simple manner.

Keywords: oxidative stability, mayonnaise, headspace single-drop microextraction (HS-SDME), shelf-life

Procedia PDF Downloads 410
13187 An Effort at Improving Reliability of Laboratory Data in Titrimetric Analysis for Zinc Sulphate Tablets Using Validated Spreadsheet Calculators

Authors: M. A. Okezue, K. L. Clase, S. R. Byrn

Abstract:

The requirement to maintain data integrity in laboratory operations is critical for regulatory compliance. Automation of procedures reduces the incidence of human error. Quality control laboratories located in low-income economies may face barriers in attempts to automate their processes. Since data from quality control tests on pharmaceutical products are used in making regulatory decisions, it is important that laboratory reports be accurate and reliable. Zinc sulphate (ZnSO4) tablets are used in the treatment of diarrhea in the pediatric population and as an adjunct therapy in COVID-19 regimens. Unfortunately, the zinc content in these formulations is determined titrimetrically, a manual analytical procedure. The assay for ZnSO4 tablets involves time-consuming steps containing mathematical formulae prone to calculation errors. To achieve consistency, save costs, and improve data integrity, validated spreadsheets were developed to simplify the two critical steps in the analysis of ZnSO4 tablets: standardization of the 0.1M sodium edetate (EDTA) solution, and the complexometric titration assay procedure. The assay method in the United States Pharmacopeia was used to create a process flow for ZnSO4 tablets. For each step in the process, the relevant formulae were entered into two spreadsheets to automate the calculations. Further checks were built into the automated system to ensure the validity of replicate analyses in the titrimetric procedures. Validation was conducted using five data sets of manually computed assay results, and the acceptance criteria set for the protocol were met. Significant p-values (p < 0.05, α = 0.05, at 95% confidence interval) were obtained from Student's t-test evaluation of the mean values for the manually calculated and spreadsheet results at all levels of the analysis flow. Right-first-time analysis and the principles of data integrity were enhanced by the use of the validated spreadsheet calculators in titrimetric evaluations of ZnSO4 tablets. Human calculation errors were minimized when the procedures were automated in quality control laboratories. The assay procedure for the formulation was achieved in a time-efficient manner with a greater level of accuracy. This project is expected to promote cost savings for laboratory business models.
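The two automated calculations follow standard 1:1 Zn:EDTA complexometric stoichiometry. The sketch below re-creates them in code form; the molar masses are standard values, but the function layout and all sample figures are our own illustration, not the contents of the validated spreadsheets or the USP monograph.

```python
# Illustrative re-creation of the two spreadsheet calculations, assuming the
# usual 1:1 Zn:EDTA complexometric stoichiometry. Every sample figure below
# is invented for the example.
MW_ZN = 65.38            # g/mol, zinc metal used as the primary standard
MW_ZNSO4_7H2O = 287.54   # g/mol, zinc sulphate heptahydrate

def edta_molarity(zn_standard_mg, titre_ml):
    """Standardization: moles of Zn consumed equal moles of EDTA delivered."""
    return (zn_standard_mg / MW_ZN) / titre_ml        # mmol/mL == mol/L

def assay_percent(titre_ml, m_edta, sample_mg, avg_tablet_mg, label_claim_mg):
    """Percent of label claim for a sample of powdered tablets."""
    znso4_mg = titre_ml * m_edta * MW_ZNSO4_7H2O      # mg ZnSO4.7H2O found
    per_tablet = znso4_mg * avg_tablet_mg / sample_mg # scale to one tablet
    return 100.0 * per_tablet / label_claim_mg

m = edta_molarity(zn_standard_mg=130.0, titre_ml=19.85)   # ~0.1 M, as in the assay
pct = assay_percent(titre_ml=3.05, m_edta=m, sample_mg=400.0,
                    avg_tablet_mg=400.0, label_claim_mg=87.9)
print(f"EDTA molarity: {m:.4f} M, assay: {pct:.1f}% of label claim")
```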

Keywords: data integrity, spreadsheets, titrimetry, validation, zinc sulphate tablets

Procedia PDF Downloads 157
13186 Approximation of Geodesics on Meshes with Implementation in Rhinoceros Software

Authors: Marian Sagat, Mariana Remesikova

Abstract:

In civil engineering, there is the problem of how to industrially produce tensile membrane structures that are non-developable surfaces. Non-developable surfaces can only be developed with a certain error, and we want to minimize this error. To that end, the non-developable surfaces are cut into plates along geodesic curves. We propose a numerical algorithm for finding approximations of open geodesics on meshes and surfaces based on geodesic curvature flow. For practical reasons, it is important to automate the choice of the time step. We propose a method for automatically setting the time step based on the diagonal dominance criterion for the matrix of the linear system obtained by discretization of our partial differential equation model; practical experiments show the reliability of this method. Because the model is approximated by a numerical method based on classical derivatives, it is necessary to overcome obstacles that occur for meshes with sharp corners. We solve this problem for a big family of meshes with sharp corners via special rotations, which can be seen as a partial unfolding of the mesh. In practical applications, it is required that the approximation of the geodesic have its vertices only on the edges of the mesh. This problem is solved by a specially designed point-tracking algorithm. We also partially solve the problem of finding geodesics on meshes with holes. We implemented the whole algorithm in Rhinoceros (commercial 3D computer graphics and computer-aided design software), using the C# language as a C# assembly library for Grasshopper, which is a plugin in Rhinoceros.
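The automatic time-step rule can be sketched abstractly: for a semi-implicit scheme whose system matrix has the form A(dt) = M + dt·K, one can solve, row by row, for the largest dt that keeps A(dt) strictly diagonally dominant. The M/K split, the diagonal-M assumption, and the strictness margin below are our modeling assumptions; the paper's exact discretization is not reproduced here.

```python
import numpy as np

def max_diagonally_dominant_dt(M, K, margin=1e-9):
    """Largest dt keeping A(dt) = M + dt*K strictly diagonally dominant:
    a_ii >= sum_{j != i} |a_ij| + margin for every row i (M assumed diagonal,
    and the diagonal of A assumed to stay positive on the admissible range)."""
    dt_bound = np.inf
    for i in range(M.shape[0]):
        off = np.abs(K[i]).sum() - abs(K[i, i])  # off-diagonal weight of row i
        gain = K[i, i] - off                     # dominance change per unit dt
        if gain >= 0:
            continue                             # this row never loses dominance
        # M_ii + dt*K_ii >= dt*off + margin  =>  dt <= (M_ii - margin)/(off - K_ii)
        dt_bound = min(dt_bound, (M[i, i] - margin) / (off - K[i, i]))
    return dt_bound

# Tiny usage example: A(dt) = I + dt*L with a 1D Laplacian stencil L.
n = 6
L = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
print(max_diagonally_dominant_dt(np.eye(n), L))   # ~0.25, set by interior rows
```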

Keywords: geodesic, geodesic curvature flow, mesh, Rhinoceros software

Procedia PDF Downloads 133
13185 Improvement of Biomass Properties through the Torrefaction Process

Authors: Malgorzata Walkowiak, Magdalena Witczak, Wojciech Cichy

Abstract:

Biomass is an important renewable energy source in Poland. As a biofuel, it has many advantages, such as being renewable within a noticeable time frame and having a relatively high energy potential. However, disadvantages of biomass such as its high moisture content and hygroscopic nature make its collection, transport, storage, and preparation for combustion troublesome and uneconomic. Thermal modification of biomass can improve its hydrophobic properties and increase its calorific value and natural resistance. This form of thermal processing is known as torrefaction. The aim of the study was to investigate the effect of thermal pre-treatment of wood and plant lignocellulosic raw materials on the properties of solid biofuels. The preliminary studies included pine, beech and willow wood as well as other lignocellulosic raw materials: mustard, hemp, grass stems, tobacco stalks, sunflower husks, Miscanthus straw, rape straw, cereal straw, Virginia mallow straw, and rapeseed meal. Torrefaction was carried out using variable temperatures and process times, depending on the material used. The weight loss was recorded, and the ash content and calorific value were determined. It was found that thermal treatment of the tested lignocellulosic raw materials can provide solid biofuel with improved properties. In the woody materials, the increase in the lower heating value ranged from 0.3 MJ/kg (pine and beech) to 1.1 MJ/kg (willow); in the non-woody materials, from 0.5 MJ/kg (tobacco stalks, Miscanthus) to 3.5 MJ/kg (rapeseed meal). The obtained results indicate the need for further research, particularly regarding the conditions of the torrefaction process.

Keywords: biomass, lignocellulosic materials, solid biofuels, torrefaction

Procedia PDF Downloads 222
13184 Membrane Distillation Process Modeling: Dynamical Approach

Authors: Fadi Eleiwi, Taous Meriem Laleg-Kirati

Abstract:

This paper presents a complete dynamic model of a membrane distillation process. The model consists of two consistent dynamic submodels: a 2D advection-diffusion equation for the whole process and a modified heat equation for the membrane itself. The complete model describes the temperature diffusion phenomenon across the feed container, the membrane and its boundary layers, and the permeate container, and gives an online, complete temperature profile for each point in the domain. It expresses the heat conduction and convection mechanisms inside the process in terms of mathematical parameters and explains the process behavior during the transient and steady-state phases. The process can be monitored for any sudden change in performance at any instant of time. In addition, the model assists in maintaining production rates as desired and yields recommendations for the membrane fabrication stages. System performance and parameters can be optimized and controlled using this complete dynamic model. The evolution of the membrane boundary temperature with time, the vapor mass transfer along the process, and the temperature difference between the membrane boundary layers are depicted. Simulations were performed with the complete model using real membrane specifications. The plots show consistency between the 2D advection-diffusion model, the expected behavior of the system, and the literature. The evolution of heat inside the membrane, from the transient response until the steady-state response is reached, is illustrated for fixed and varying times.
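
As a sketch of the kind of model involved, a one-dimensional explicit finite-difference discretization of the advection-diffusion equation u_t + v u_x = alpha u_xx is given below; the grid, coefficients, and boundary temperatures are illustrative assumptions, and the paper's actual model is two-dimensional:

    import numpy as np

    nx, L = 200, 0.01          # grid points, domain length [m]
    dx = L / (nx - 1)
    alpha, v = 1.4e-7, 1e-4    # thermal diffusivity [m^2/s], advection speed [m/s]
    dt = 0.4 * min(dx**2 / (2 * alpha), dx / abs(v))  # stability-limited step

    T = np.full(nx, 298.0)      # initial temperature [K]
    T[0], T[-1] = 333.0, 293.0  # hot feed side, cold permeate side

    for _ in range(20000):
        Tx  = (T[2:] - T[:-2]) / (2 * dx)              # centered first derivative
        Txx = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2   # centered second derivative
        T[1:-1] += dt * (alpha * Txx - v * Tx)
    # T now holds the transient temperature profile evolving toward steady state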

Keywords: membrane distillation, dynamical modeling, advection-diffusion equation, thermal equilibrium, heat equation

Procedia PDF Downloads 255
13183 Distinguishing between Bacterial and Viral Infections Based on Peripheral Human Blood Tests Using Infrared Microscopy and Multivariate Analysis

Authors: H. Agbaria, A. Salman, M. Huleihel, G. Beck, D. H. Rich, S. Mordechai, J. Kapelushnik

Abstract:

Viral and bacterial infections are responsible for a variety of diseases. These infections share similar symptoms, such as fever, sneezing, inflammation, vomiting, diarrhea, and fatigue, so physicians may have difficulty distinguishing between viral and bacterial infections on the basis of symptoms alone. Bacterial infections nevertheless differ from viral infections in many other important respects, including the structure of the organisms and the response to various medications. In many cases, it is difficult to know the origin of the infection, and when necessary the physician orders a blood test, urine test, or tissue culture to diagnose the infection type. With these methods, the time that elapses between the receipt of patient material and the presentation of the test results to the clinician is typically too long (> 24 hours). This time is crucial in many cases for saving the life of the patient and for planning the right medical treatment; thus, rapid laboratory identification of bacterial and viral infections is of great importance for effective treatment, especially in emergencies. Blood was collected from 50 patients with confirmed viral infection and 50 with confirmed bacterial infection. White blood cells (WBCs) and plasma were isolated, deposited on a zinc selenide slide, dried, and measured under a Fourier transform infrared (FTIR) microscope to obtain their infrared absorption spectra. The acquired spectra of WBCs and plasma were analyzed in order to differentiate between the two types of infection. In this study, the potential of FTIR microscopy in tandem with multivariate analysis was evaluated for identifying the agent causing a human infection as either bacterial or viral, based on the infrared vibrational spectra of the blood components (WBCs and plasma). The time required for the analysis and evaluation after obtaining the blood sample was less than one hour. In the analysis, minute spectral differences in several bands of the FTIR spectra of WBCs were observed between the viral- and bacterial-infection groups. By employing feature extraction with linear discriminant analysis (LDA), a sensitivity of ~92% and a specificity of ~86% for the infection-type diagnosis were achieved. This preliminary study suggests that FTIR spectroscopy of WBCs is a potentially feasible and efficient tool for diagnosing the infection type.
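
A minimal sketch of the classification step described here, using scikit-learn; the random arrays stand in for the measured spectra, and PCA is one plausible feature-extraction choice, not necessarily the one used in the study:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_predict
    from sklearn.pipeline import make_pipeline

    # X: absorbance spectra (n_samples x n_wavenumbers); y: 0 = viral, 1 = bacterial
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 900))
    y = np.repeat([0, 1], 50)

    clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
    pred = cross_val_predict(clf, X, y, cv=5)  # cross-validated predictions

    tp = np.sum((pred == 1) & (y == 1)); fn = np.sum((pred == 0) & (y == 1))
    tn = np.sum((pred == 0) & (y == 0)); fp = np.sum((pred == 1) & (y == 0))
    print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))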

Keywords: viral infection, bacterial infection, linear discriminant analysis, plasma, white blood cells, infrared spectroscopy

Procedia PDF Downloads 209
13182 Ultrasonic Extraction of Phenolics from Leaves of Shallots and Peels of Potatoes for Biofortification of Cheese

Authors: Lila Boulekbache-Makhlouf, Fatiha Brahmi

Abstract:

This study was carried out with the aim of enriching fresh cheese with two food by-products: shallot leaves and potato peels. First, the conditions for ultrasound-assisted extraction of total polyphenols (TPP) were optimized. Then, the TPP and flavonoid contents and the antioxidant activity were evaluated for the extracts obtained under the optimal parameters. In addition, physicochemical, microbiological, and sensory analyses of the cheese produced were carried out. The maximum TPP value of 70.44 mg gallic acid equivalent (GAE)/g dry matter (DM) for shallot leaves was reached with 40% (v/v) ethanol, an extraction time of 90 min, and a temperature of 10 °C, while the maximum TPP content of potato peels, 45.03 ± 4.16 mg GAE/g DM, was obtained using an ethanol/water mixture (40%, v/v), a time of 30 min, and a temperature of 60 °C; the flavonoid contents were 13.99 and 7.52 mg quercetin equivalent (QE)/g DM, respectively. From the antioxidant tests, we deduced that the potato peels present the higher antioxidant power, with half-maximal inhibitory concentrations (IC50) of 125.42 ± 2.78 μg/mL for 2,2-diphenyl-1-picrylhydrazyl (DPPH), 87.21 ± 7.72 μg/mL for phosphomolybdate, and 200.77 ± 13.38 μg/mL for iron chelation, compared with 204.29 ± 0.09, 45.85 ± 3.46, and 1004.10 ± 145.73 μg/mL, respectively, for shallot leaves. The physicochemical analyses showed that the formulated cheese complied with standards, and the microbiological analyses showed that its hygienic quality was satisfactory. According to the sensory analysis, the expert panel liked the cheese enriched with the powder and pieces of shallot leaves.
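
For clarity, the IC50 figures quoted here are the extract concentrations giving 50% inhibition in each assay; a minimal sketch of how such a value is read off a dose-response curve, with hypothetical data points rather than the study's raw measurements:

    import numpy as np

    def ic50(conc_ug_ml, inhibition_pct):
        # Concentration causing 50% inhibition, by linear interpolation
        # between measured dose-response points (inhibition must increase)
        return float(np.interp(50.0, inhibition_pct, conc_ug_ml))

    # Hypothetical DPPH readings for a potato-peel extract
    conc = np.array([25, 50, 100, 150, 200])  # ug/mL
    inh  = np.array([18, 30, 44, 57, 68])     # % inhibition
    print(ic50(conc, inh))  # ~123 ug/mL, of the same order as the reported IC50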

Keywords: shallots leaves, potato peels, ultrasound extraction, phenolics, cheese

Procedia PDF Downloads 69
13181 Interpretation of Two Indices for the Prediction of Cardiovascular Risk in Pediatric Obesity

Authors: Mustafa M. Donma, Orkide Donma

Abstract:

Obesity and weight gain are associated with an increased risk of developing cardiovascular diseases and with the progression of liver fibrosis. The aspartate transaminase-to-platelet ratio index (APRI) and the fibrosis-4 index (FIB-4) were originally conceived as formulas capable of differentiating hepatitis from cirrhosis. Recently, they have found clinical use as measures of liver fibrosis and cardiovascular risk; however, their status in children has not yet been evaluated in detail. The aim of this study is to determine APRI and FIB-4 status in obese (OB) children and compare them with the values found in children with normal body mass index (N-BMI). A total of sixty-eight children examined in the outpatient clinics of the Pediatrics Department of the Tekirdag Namik Kemal University Medical Faculty were included in the study and divided into two groups. The first group comprised thirty-five children with N-BMI, whose age- and sex-dependent BMI percentiles lay between 15 and 85. The second group comprised thirty-three OB children whose BMI percentiles lay between 95 and 99. Anthropometric measurements and routine biochemical tests were performed, and from these parameters the indices BMI, APRI, and FIB-4 were calculated. Appropriate statistical tests were used to evaluate the study data, with statistical significance accepted at p<0.05. In the OB group, the APRI and FIB-4 values were higher than those calculated for the N-BMI group; however, the difference between the two groups was not statistically significant. A similar pattern was detected for triglyceride (TRG) values. The correlation between APRI and FIB-4 was r=0.336 (p=0.065) in the N-BMI group, but r=0.707 (p=0.001) in the OB group. Associations of these two indices with TRG showed that this parameter was strongly correlated (p<0.001) with both APRI and FIB-4 in the OB group, whereas no correlation was found in children with N-BMI. Triglycerides are associated with an increased risk of fatty liver, which can progress to severe clinical problems such as steatohepatitis and, in turn, liver fibrosis; they are also an independent risk factor for cardiovascular disease. In conclusion, the lack of correlation between TRG and APRI as well as FIB-4 in children with N-BMI, together with the strong correlations of TRG with these indices in OB children, indicates a possible onset of a tendency towards fatty liver in OB children and points to a potential risk of cardiovascular pathologies. The contrast between the APRI-FIB-4 correlations in the N-BMI and OB groups (no correlation versus a strong correlation) may indicate the importance of including age and alanine transaminase (ALT), in addition to AST and platelet count (PLT), in the FIB-4 formula.
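
For reference, both indices are computed from routine blood values; the formulas below are the standard published ones, with illustrative inputs rather than study data:

    import math

    def apri(ast_u_l, ast_uln_u_l, platelets_10e9_l):
        # AST-to-platelet ratio index (standard published formula)
        return (ast_u_l / ast_uln_u_l) * 100.0 / platelets_10e9_l

    def fib4(age_years, ast_u_l, alt_u_l, platelets_10e9_l):
        # Fibrosis-4 index; unlike APRI it also uses age and ALT, the
        # point raised in the abstract's conclusion
        return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

    # Illustrative inputs: a 10-year-old with AST 30 U/L (upper limit of
    # normal 40 U/L), ALT 25 U/L, platelets 250 x 10^9/L
    print(apri(30, 40, 250))      # ~0.30
    print(fib4(10, 30, 25, 250))  # ~0.24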

Keywords: APRI, children, FIB-4, obesity, triglycerides

Procedia PDF Downloads 336