Search results for: accuracy assessment.
1382 Data Refinement Enhances the Accuracy of Short-Term Traffic Latency Prediction
Authors: Man Fung Ho, Lap So, Jiaqi Zhang, Yuheng Zhao, Huiyang Lu, Tat Shing Choi, K. Y. Michael Wong
Abstract:
Nowadays, a tremendous amount of data is available in the transportation system, enabling the development of various machine learning approaches to make short-term latency predictions. A natural question is then the choice of relevant information to enable accurate predictions. Using traffic data collected from the Taiwan Freeway System, we consider the prediction of short-term latency of a freeway segment with a length of 17 km covering 5 measurement points, each collecting vehicle-by-vehicle data through the electronic toll collection system. The processed data include the past latencies of the freeway segment with different time lags, the traffic conditions of the individual segments (the accumulations, the traffic fluxes, the entrance and exit rates), the total accumulations, and the weekday latency profiles obtained by Gaussian process regression of past data. We arrive at several important conclusions about how data should be refined to obtain accurate predictions, which have implications for future system-wide latency predictions. (1) We find that the prediction of median latency is much more accurate and meaningful than the prediction of average latency, as the latter is plagued by outliers. This is verified by machine-learning prediction using XGBoost that yields a 35% improvement in the mean square error of the 5-minute averaged latencies. (2) We find that the median latency of the segment 15 minutes ago is a very good baseline for performance comparison, and we have evidence that further improvement is achieved by machine learning approaches such as XGBoost and Long Short-Term Memory (LSTM). (3) By analyzing the feature importance score in XGBoost and calculating the mutual information between the inputs and the latencies to be predicted, we identify a sequence of inputs ranked in importance. 
It confirms that the past latencies are most informative of the predicted latencies, followed by the total accumulation, whereas inputs such as the entrance and exit rates are uninformative. It also confirms that the inputs are much less informative of the average latencies than of the median latencies. (4) For predicting the latencies of segments composed of two or three sub-segments, summing up the predicted latencies of each sub-segment is more accurate than one-step prediction of the whole segment, especially with the latency prediction of the downstream sub-segments trained to anticipate latencies several minutes ahead. The duration of the anticipation time is an increasing function of the traveling time of the upstream segment. The above findings have important implications for predicting the full set of latencies among the various locations in the freeway system.
Keywords: data refinement, machine learning, mutual information, short-term latency prediction
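Finding (1), that the median is far more robust to incident-driven outliers than the mean, can be illustrated with a minimal sketch; the latency samples below are hypothetical, not drawn from the Taiwan Freeway data:

```python
import statistics

# Hypothetical 5-minute latency samples (minutes) for one freeway segment;
# a single incident-driven outlier dominates the mean but barely moves the median.
latencies = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2, 12.3, 11.7, 12.1, 55.0]

mean_latency = statistics.mean(latencies)      # pulled upward by the outlier
median_latency = statistics.median(latencies)  # robust to the outlier

print(round(mean_latency, 2))   # 16.35
print(median_latency)           # 12.1
```

A target built from medians therefore gives the learner a cleaner signal, which is consistent with the reported improvement in mean square error.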
Procedia PDF Downloads 167
1381 Uncertainty Quantification of Corrosion Anomaly Length of Oil and Gas Steel Pipelines Based on Inline Inspection and Field Data
Authors: Tammeen Siraj, Wenxing Zhou, Terry Huang, Mohammad Al-Amin
Abstract:
The high-resolution inline inspection (ILI) tool is used extensively in the pipeline industry to identify, locate, and measure metal-loss corrosion anomalies on buried oil and gas steel pipelines. Corrosion anomalies may occur singly (i.e., individual anomalies) or as clusters (i.e., a colony of corrosion anomalies). Although the ILI technology has advanced immensely, there are measurement errors associated with the sizes of corrosion anomalies reported by ILI tools, due to limitations of the tools and associated sizing algorithms, and the detection threshold of the tools (i.e., the minimum detectable feature dimension). Quantifying the measurement error in the ILI data is crucial for corrosion management and for developing maintenance strategies that satisfy safety and economic constraints. Studies on the measurement error associated with the length of corrosion anomalies (in the longitudinal direction of the pipeline) have been scarcely reported in the literature, and this error is investigated in the present study. Limitations in the ILI tool and the clustering process can sometimes cause clustering error, which is defined as the error introduced during the clustering process by including or excluding a single anomaly or a group of anomalies in or from a cluster. Clustering error has been found to be one of the biggest contributors to the relatively high uncertainties associated with ILI-reported anomaly length. As such, this study focuses on developing a consistent and comprehensive framework to quantify the measurement errors in the ILI-reported anomaly length by comparing the ILI data and corresponding field measurements for individual and clustered corrosion anomalies. The analysis carried out in this study is based on the ILI and field measurement data for a set of anomalies collected from two segments of a buried natural gas pipeline currently in service in Alberta, Canada.
Data analyses showed that the measurement error associated with the ILI-reported length of anomalies without clustering error, denoted as Type I anomalies, is markedly less than that for anomalies with clustering error, denoted as Type II anomalies. A methodology employing data mining techniques is further proposed to classify Type I and Type II anomalies based on the ILI-reported corrosion anomaly information.
Keywords: clustered corrosion anomaly, corrosion anomaly assessment, corrosion anomaly length, individual corrosion anomaly, metal-loss corrosion, oil and gas steel pipeline
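The distinction between Type I and Type II anomalies can be sketched numerically; the ILI/field length pairs below are invented for illustration and are not the Alberta data:

```python
import statistics

# Hypothetical (ILI length, field length) pairs in mm.
# Type I: no clustering error; Type II: clustering error present.
type_i = [(50, 48), (62, 60), (40, 41), (55, 53)]
type_ii = [(120, 80), (95, 140), (70, 110), (150, 90)]

def length_errors(pairs):
    """ILI-reported minus field-measured length for each anomaly."""
    return [ili - field for ili, field in pairs]

spread_i = statistics.stdev(length_errors(type_i))
spread_ii = statistics.stdev(length_errors(type_ii))
print(spread_i < spread_ii)  # True: clustering error inflates the scatter
```

The much larger scatter of the Type II errors is exactly the pattern a classifier would exploit when separating the two types from ILI-reported information alone.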
Procedia PDF Downloads 307
1380 Storms Dynamics in the Black Sea in the Context of the Climate Changes
Authors: Eugen Rusu
Abstract:
The objective of the proposed work is to perform an analysis of the wave conditions in the Black Sea basin, focused especially on the spatial and temporal occurrences and the dynamics of the most extreme storms in the context of climate change. A numerical modelling system, based on the spectral phase-averaged wave model SWAN, has been implemented and validated against both in situ measurements and remotely sensed data from across the sea. Moreover, a successive correction method for the assimilation of satellite data, based on optimal interpolation of the satellite data, has been associated with the wave modelling system. Previous studies show that the process of data assimilation considerably improves the reliability of the results provided by the modelling system. This especially concerns the cases most sensitive to the accuracy of the wave predictions, such as extreme storm situations. Following this numerical approach, it has to be highlighted that the results provided by the wave modelling system described above are generally in line with those provided by similar wave prediction systems implemented in enclosed or semi-enclosed sea basins. Simulations of this wave modelling system with data assimilation have been performed for the 30-year period 1987-2016. Considering this database, the next step was to analyze the intensity and the dynamics of the strongest storms encountered in this period. According to the data resulting from the model simulations, the western side of the sea is considerably more energetic than the rest of the basin. In this western region, regular strong storms usually produce significant wave heights greater than 8 m, which may lead to maximum wave heights even greater than 15 m. Such regular strong storms may occur several times in one year, usually in wintertime or late autumn, and it can be noticed that their frequency has become higher in the last decade.
As regards the most extreme storms, significant wave heights greater than 10 m and maximum wave heights close to 20 m (and even greater) may occur. Such extreme storms, which in the past were noticed only once in four or five years, have more recently been faced almost every year in the Black Sea, and this seems to be a consequence of climate change. The analysis performed also included the dynamics of the monthly and annual significant wave height maxima, as well as the identification of the most probable spatial and temporal occurrences of the extreme storm events. Finally, it can be concluded that the present work provides valuable information on the characteristics and dynamics of storm conditions in the Black Sea. This environment is currently subjected to high navigation traffic and intense offshore and nearshore activities, and the strong storms that systematically occur may produce accidents with very serious consequences.
Keywords: Black Sea, extreme storms, SWAN simulations, waves
Procedia PDF Downloads 246
1379 Compositional Assessment of Fermented Rice Bran and Rice Bran Oil and Their Effect on High Fat Diet Induced Animal Model
Authors: Muhammad Ali Siddiquee, Md. Alauddin, Md. Omar Faruque, Zakir Hossain Howlader, Mohammad Asaduzzaman
Abstract:
Rice bran (RB) and rice bran oil (RBO) are explored as prominent food components worldwide. In this study, fermented rice bran (FRB) was produced by employing edible gram-positive bacteria (Lactobacillus acidophilus, Lactobacillus bulgaricus, and Bifidobacterium bifidum) at 125 x 10⁵ spores g⁻¹ of rice bran and was investigated to evaluate its nutritional quality. Crude rice bran oil (CRBO) was extracted from RB, and its quality was investigated in comparison with market-available rice bran oil (MRBO) in Bangladesh. We found that fermentation of rice bran with lactic acid bacteria increased total protein (29.52%), fat (5.38%), ash (48.47%), crude fiber (38.96%), and moisture (61.04%) and reduced the carbohydrate content (36.61%). We also found that the essential amino acids (methionine, tryptophan, threonine, valine, leucine, lysine, histidine, and phenylalanine) and non-essential amino acids (alanine, aspartate, glycine, glutamine, proline, serine, and tyrosine) were increased in FRB, with the exception of methionine and proline. Moreover, total phenolic content, tannin content, flavonoid content, and antioxidant activity were increased in FRB. The RBO analysis showed a γ-oryzanol content of 10.00 mg/g in CRBO, compared to MRBO (ranging from 7.40 to 12.70 mg/g), and a vitamin E content of 0.20%, higher in CRBO than in MRBO (ranging from 0.097 to 0.12%). Total saturated (25.16%) and total unsaturated fatty acids (74.44%) were found in CRBO, whereas MRBO contained total saturated (22.08 to 24.13%) and total unsaturated fatty acids (71.91 to 83.29%), respectively. The physiochemical parameters were found satisfactory in all samples, except for the acid value and peroxide value, which were higher in CRBO. Finally, animal experiments showed that FRB and CRBO reduce body weight, glucose, and lipid profile in high-fat diet-induced animal models.
Thus, FRB and RBO could be value-added food supplements for human health.
Keywords: fermented rice bran, crude rice bran oil, amino acids, proximate composition, gamma-oryzanol, fatty acids, heavy metals, physiochemical parameters
Procedia PDF Downloads 62
1378 Carbon Pool Assessment in Community Forests, Nepal
Authors: Medani Prasad Rijal
Abstract:
A forest is both a factory and a product: it supplies tangible and intangible goods and services, including timber, fuel wood, fodder, grass, leaf litter, and non-timber edible goods and medicinal and aromatic products, and it additionally provides environmental services. These environmental services are of local, national, or even global importance. In Nepal, more than 19,000 community forests provide environmental services at less economic benefit than their actual potential. There is a risk that the cost of managing those forests will exceed the benefits and that the forests will be converted to open-access resources in the future. Most environmental goods and services have no markets, which means no prices at which they are available to consumers; therefore, the valuation of these goods and services, the establishment of a payment mechanism for such services, and ensuring the benefit to the community are relevant at both the local and the global scale. There are few examples of domestic carbon trading to meet a country-wide emission goal. In this context, the study aims to explore the public attitude towards carbon offsetting and people's responsibility towards service providers. This study helps promote environmental service awareness among the general public, service providers, and community forests. The research helps to unveil the carbon pool scenario in community forests and the willingness to pay for carbon offsetting among people who consume more energy than the general public and emit relatively more carbon into the atmosphere. The study assessed the carbon pool status of two community forests and valuated the carbon service from community forests through willingness to pay in Dharan municipality, situated in eastern Nepal. In the two community forests, carbon pools were assessed following the 'Forest Carbon Inventory Guideline 2010' prescribed by the Ministry of Forests and Soil Conservation, Nepal.
The final analysis recorded a carbon density of 103.58 tons C/ha in the intensively managed area of Hokse CF, with a carbon stock of 6173.30 tons. Similarly, in Hariyali CF, the carbon density was recorded as 251.72 tons C/ha, and the total carbon stock of the intensively managed blocks in Hariyali CF is 35839.62 tons.
Keywords: carbon, offsetting, sequestration, valuation, willingness to pay
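As a back-of-envelope consistency check on the Hokse CF figures, the block area implied by the reported density and total stock can be computed; this area is a derived assumption, not a value stated in the abstract:

```python
# Hokse CF figures as reported: carbon density and total carbon stock.
density_t_c_per_ha = 103.58   # tons C per hectare
total_stock_t_c = 6173.30     # tons C

# Implied area of the intensively managed blocks (hypothetical derivation).
implied_area_ha = total_stock_t_c / density_t_c_per_ha
print(round(implied_area_ha, 1))  # about 59.6 ha
```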
Procedia PDF Downloads 355
1377 The Reality of Teaching Arabic for Specific Purposes in Educational Institutions
Authors: Mohammad Anwarul Kabir, Fayezul Islam
Abstract:
Language is invariably learned and taught to be used primarily as a means of communication. Teaching a language to its native audience differs from teaching it to a non-native audience. Moreover, teaching a language for communication only is different from teaching it for specific purposes. Arabic is primarily regarded as the language of the Quran and the Sunnah (Prophetic tradition), and Arabic is therefore learnt all over the globe. Arabic is also a cultural heritage shared by all Islamic nations, which have used it for a long period to record the contributions of Muslim thinkers across a wide spectrum of knowledge and scholarship. That is why the phenomenon of teaching Arabic in different educational institutes became widespread, and the idea of teaching Arabic for specific purposes is heavily discussed in the academic sphere. Although the number of learners of Arabic is increasing consistently, their purposes vary: religious purposes, international trade, diplomatic purposes, a better livelihood in the Arab world, etc. By virtue of this high demand for learning Arabic, numerous institutes have been established all over the world, including in Bangladesh. This paper aims to focus on the current status of the institutes established for teaching Arabic for specific purposes in Bangladesh, including their teaching methodology, curriculum, and teacher quality. Such curricula and their materials have resulted in many problems; at the very least, they have confused teachers and students alike. Islamic educationalists have been working hard to meet the need professionally. They follow a systematic approach of stating clear and achievable goals, building suitable content, and applying new technology to present these learning experiences and evaluate them.
The paper also suggests a model for designing instructional systems that responds to the needs of non-Arabic-speaking Islamic communities and provides the knowledge needed in both linguistic and cultural aspects. It also puts forward a number of suggestions for the improvement of teaching and learning Arabic for specific purposes in Bangladesh, after a detailed investigation of the following areas: curriculum, teachers' skills, method of teaching, and assessment policy.
Keywords: communication, Quran, sunnah, educational institutes, specific purposes, curriculum, method of teaching
Procedia PDF Downloads 281
1376 Assessment of Groundwater Aquifer Impact from Artificial Lagoons and the Reuse of Wastewater in Qatar
Authors: H. Aljabiry, L. Bailey, S. Young
Abstract:
Qatar is a desert country with an average temperature of 37 °C, exceeding 40 °C during summer. Precipitation is uncommon and falls mostly in winter. Qatar depends on desalination for drinking water and on groundwater and recycled water for irrigation. Water consumption and network leakage per capita in Qatar are amongst the highest in the world; re-use of treated wastewater is extremely limited, with only 14% of treated wastewater being used for irrigation. This has led the country to dispose of unwanted water from various sources in lagoons situated around the country, causing concern over possible environmental pollution. Accordingly, the hypothesis underpinning this research is that the quality and quantity of water in the lagoons is having an impact on the groundwater reservoirs in Qatar. Lagoons (n = 14) and wells (n = 55) were sampled in both summer and winter of 2018. Water, adjoining soil, and plant samples were analysed for multiple elements by inductively coupled plasma mass spectrometry; organic and inorganic carbon were measured (CN analyser), and the major anions were determined by ion chromatography. Salinization was seen in both the lagoons and the wells, with good correlations between Cl⁻, Na⁺, Li, SO₄, S, Sr, Ca, and Ti (p-value < 0.05). An association of the heavy metals Ni, Cu, Ag and V, Cr, Mo, Cd was observed, which is due to contamination from anthropogenic activities such as wastewater disposal or the spread of contaminated dust. However, no individual element exceeded the Qatari regulatory limits. Moreover, gypsum saturation in the system was observed in both the lagoon and well water samples. Lagoon and well waters were found to be of a saline, Ca²⁺, Cl⁻, SO₄²⁻ type, evidencing both gypsum dissolution and salinization in the system.
Moreover, maps produced by inverse distance weighting showed an increasing level of nitrate in the groundwater in winter and decreasing chloride and sulphate levels, indicating a recharge effect after winter rain events. While E. coli and faecal bacteria were found in most of the lagoons, biological analysis of the wells still needs to be conducted to understand biological contamination from lagoon water infiltration. In conclusion, while the lagoons and the wells showed similar results, more sampling is needed to understand the impact of the lagoons on the groundwater.
Keywords: groundwater quality, lagoon, treated wastewater, water management, wastewater treatment, wetlands
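The mapping step can be sketched with a minimal inverse-distance-weighting interpolator; the well coordinates and nitrate concentrations below are hypothetical, not the Qatar measurements:

```python
import math

def idw(points, target, power=2):
    """Inverse-distance-weighted estimate at `target` from (x, y, value) samples."""
    num = den = 0.0
    for x, y, value in points:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0:
            return value  # target coincides with a sampled well
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Hypothetical winter nitrate concentrations (mg/L) at four wells.
wells = [(0, 0, 10.0), (0, 4, 14.0), (4, 0, 12.0), (4, 4, 18.0)]
print(round(idw(wells, (2, 2)), 2))  # 13.5: equidistant wells give a simple average
```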
Procedia PDF Downloads 134
1375 A Furniture Industry Concept for a Sustainable Generative Design Platform Employing Robot Based Additive Manufacturing
Authors: Andrew Fox, Tao Zhang, Yuanhong Zhao, Qingping Yang
Abstract:
The furniture manufacturing industry has generally been slow to adopt the latest manufacturing technologies, historically relying heavily upon specialised conventional machinery. This approach not only requires high levels of specialist process knowledge, training, and capital investment but also suffers from significant subtractive manufacturing waste and high logistics costs due to the requirement for centralised manufacturing, with high levels of furniture product not re-cycled or re-used. This paper aims to address these problems by introducing suitable digital manufacturing technologies to create step changes in furniture manufacturing design, as traditional design practices have been reported as building in 80% of environmental impact. In this paper, a 3D printing robot for furniture manufacturing is reported. The 3D printing robot mainly comprises a KUKA industrial robot, an Arduino microprocessor, and a self-assembled screw-fed extruder. Compared to a traditional 3D printer, the 3D printing robot has a larger motion range and can easily be upgraded to enlarge the maximum size of the printed object. Generative design is also investigated in this paper, aiming to establish a combined design methodology that allows assessment of goals, constraints, materials, and manufacturing processes simultaneously. 'Matrixing' for part amalgamation and product performance optimisation is enabled. The generative design goals of integrated waste reduction, increased manufacturing efficiency, optimised product performance, and reduced environmental impact constitute a truly lean and innovative future design methodology. In addition, there is massive future potential to leverage Single Minute Exchange of Die (SMED) theory through generative design post-processing of geometry for robot manufacture, resulting in 'mass customised' furniture with virtually no setup requirements. These generatively designed products can be manufactured using the robot-based additive manufacturing.
Essentially, the 3D printing robot is already functional; some initial goals have been achieved and are also presented in this paper.
Keywords: additive manufacturing, generative design, robot, sustainability
Procedia PDF Downloads 130
1374 Evaluation of a Staffing to Workload Tool in a Multispecialty Clinic Setting
Authors: Kristin Thooft
Abstract:
Increasing pressure to manage healthcare costs has resulted in shifting care towards ambulatory settings and is driving a focus on cost transparency. There are few nurse staffing to workload models developed for ambulatory settings, and fewer for multi-specialty clinics. Of the existing models, few have been evaluated against outcomes to understand any impact. This evaluation took place after the AWARD model for nurse staffing to workload was implemented in a multi-specialty clinic at a regional healthcare system in the Midwest. The multi-specialty clinic houses 26 medical and surgical specialty practices; the AWARD model was implemented in two specialty practices in October 2020. Donabedian's Structure-Process-Outcome (SPO) model was used to evaluate outcomes based on changes to the structure and processes of care provided. The AWARD model defined and quantified the processes and recommended changes in the structure of day-to-day nurse staffing. Cost of care per patient visit, total visits, and total nurse-performed visits were used as structural and process measures influencing the outcomes of cost of care and access to care. Independent t-tests were used to compare the differences in variables pre- and post-implementation. The SPO model was useful as an evaluation tool, providing a simple framework that is understood by a diverse care team. No statistically significant changes in the cost of care, total visits, or nurse visits were observed, but there were differences: cost of care increased and access to care decreased. Two weeks into the post-implementation period, the multi-specialty clinic paused all non-critical patient visits due to a second surge of the COVID-19 pandemic, and clinic nursing staff were re-allocated to support the inpatient areas. This negatively impacted the Nurse Manager's ability to fully utilize the AWARD model to plan daily staffing. The SPO framework could be used for the ongoing assessment of nurse staffing performance.
Additional variables could be measured, giving a more complete picture of the impact of nurse staffing. Going forward, there must be a continued focus on the outcomes of care and the value of nursing.
Keywords: ambulatory, clinic, evaluation, outcomes, staffing, staffing model, staffing to workload
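The pre/post comparison can be sketched with a pooled-variance independent two-sample t statistic; the cost-of-care values below are hypothetical, not the clinic's data:

```python
import statistics

def t_statistic(a, b):
    """Pooled-variance independent two-sample t statistic."""
    na, nb = len(a), len(b)
    pooled = ((na - 1) * statistics.variance(a)
              + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    se = (pooled * (1 / na + 1 / nb)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / se

# Hypothetical cost of care per patient visit, pre- vs post-implementation.
pre = [100.0, 104.0, 98.0, 102.0]
post = [106.0, 110.0, 104.0, 108.0]
print(round(t_statistic(pre, post), 2))  # -3.29
```

The statistic would then be compared against a t distribution with n1 + n2 - 2 degrees of freedom to judge significance, as in the evaluation described above.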
Procedia PDF Downloads 172
1373 Impact of Climate Change on Forest Ecosystem Services: In situ Biodiversity Conservation and Sustainable Management of Forest Resources in Tropical Forests
Authors: Rajendra Kumar Pandey
Abstract:
Forest genetic resources not only represent regional biodiversity but also have immense value as wealth for securing the livelihood of poor people. They are ecologically vulnerable due to depletion/deforestation and/or the impact of climate change. These resources, across various plant categories, are vulnerable on the floor of natural tropical forests, threatening the growth and development of future forests. More than 170 species, including NTFPs, are in critical condition for their survival in the natural tropical forests of Central India. Forest degradation, commensurate with biodiversity loss, is now pervasive, disproportionately affecting the rural poor who directly depend on forests for their subsistence. Considering the interactions between forests and water, soil, precipitation, and climate change, and their impact on the biodiversity of tropical forests, it is inevitable that cooperation policies and programmes must be developed to address newly emerging realities. Forest ecosystems, also known as the 'wealth of the poor', providing goods and ecosystem services on a sustainable basis, are now recognized as a stepping stone to move poor people beyond subsistence. Poverty alleviation is the prime objective of the Millennium Development Goals (MDGs); however, environmental sustainability, including the other MDGs, is essential to ensure the successful elimination of poverty and the well-being of human society. Loss and degradation of ecosystems are the most serious threats to achieving development goals worldwide. The Millennium Ecosystem Assessment (MEA, 2005) was an attempt to identify provisioning, regulating, cultural, and supporting ecosystem services to provide livelihood security for human beings. Climate change may have a substantial impact on the ecological structure and function of forests and on the provisioning, regulation, and management of resources, which can affect the sustainable flow of ecosystem services.
To overcome these limitations, policy guidelines with respect to planning and a consistent research strategy need to be framed for the conservation and sustainable development of forest genetic resources.
Keywords: climate change, forest ecosystem services, sustainable forest management, biodiversity conservation
Procedia PDF Downloads 296
1372 Importance of CT and Timed Barium Esophagogram in the Contemporary Treatment of Patients with Achalasia
Authors: Sanja Jovanovic, Aleksandar Simic, Ognjan Skrobic, Dragan Masulovic, Aleksandra Djuric-Stefanovic
Abstract:
Introduction: Achalasia is an idiopathic primary esophageal motility disorder characterized by absent esophageal peristalsis and impaired swallow-induced relaxation of the lower esophageal sphincter (LES). It is a rare disease that affects both genders, with an incidence of 1/100,000 per year and a prevalence of 10/100,000. Objective: Laparoscopic Heller myotomy (LHM) represents the therapy of choice for patients with achalasia, providing excellent outcomes. The aim of this study was to evaluate the significance of computed tomography (CT) in analyzing achalasia subtypes and of timed barium esophagogram (TBE) in evaluating LHM success, as part of a standardized diagnostic protocol. Method: Fifty-one patients with achalasia, confirmed by manometric studies in addition to standardized diagnostic methods, underwent CT and TBE. CT was done with multiplanar reconstruction, measuring the wall thickness above the esophago-gastric junction in the axial plane. TBE was performed preoperatively and two days postoperatively: patients swallowed low-density barium sulfate, and plain upright frontal films were taken 1, 2, and 5 minutes after ingestion. In all patients, LHM was done, and the pre- and postoperative height and width of the barium column were compared. Results: According to CT findings, we divided patients into three achalasia subtypes by wall thickness: < 4 mm as subtype I, 4 - 9 mm as subtype II, and > 10 mm as subtype III. Correlation of the manometric results, as reference values, with the CT findings indicated a CT sensitivity of 90% and specificity of 70% in establishing achalasia subtypes. The preoperative TBE values at 1, 2 and 5 minutes were: median barium column height 17.4 ± 7.4, 15.9 ± 6.2 and 13.9 ± 6.2 cm; median column width 5 ± 1.5, 4.7 ± 1.6 and 4.5 ± 1.8 cm, respectively.
LHM significantly reduced these values (height 7 ± 4.6, 5.8 ± 4.2, and 3.7 ± 3.4 cm; width 2.9 ± 1.3, 2.6 ± 1.3, and 2.4 ± 1.4 cm), indicating excellent quantitative estimates of emptying (p < 0.01). Conclusion: CT has high sensitivity and specificity in the evaluation of achalasia subtypes and can be introduced as an additional method for the standardized evaluation of these patients. The quantitative assessment of TBE based on measurements of the barium column is an accurate and beneficial method, which adequately estimates the success of esophageal emptying after LHM.
Keywords: achalasia, computed tomography, esophagography, myotomy
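The relative emptying improvement implied by the reported median column heights can be checked directly; the percentage reductions are derived here for illustration, not quoted from the paper:

```python
# Reported median barium column heights (cm) at 1, 2, and 5 minutes.
pre_heights = [17.4, 15.9, 13.9]   # preoperative
post_heights = [7.0, 5.8, 3.7]     # postoperative, after LHM

reductions = [round(100 * (pre - post) / pre, 1)
              for pre, post in zip(pre_heights, post_heights)]
print(reductions)  # [59.8, 63.5, 73.4] percent reduction per time point
```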
Procedia PDF Downloads 233
1371 Computational System for the Monitoring Ecosystem of the Endangered White Fish (Chirostoma estor estor) in the Patzcuaro Lake, Mexico
Authors: Cesar Augusto Hoil Rosas, José Luis Vázquez Burgos, José Juan Carbajal Hernandez
Abstract:
White fish (Chirostoma estor estor) is an endemic species that inhabits Patzcuaro Lake, located in Michoacan, Mexico, and is an important source of the gastronomic and cultural wealth of the area. It has undergone an immense depopulation of individuals due to overfishing, contamination, and eutrophication of the lake water, which may result in the extinction of this important species. This work proposes a new computational model for monitoring and assessing critical environmental parameters of the white fish ecosystem. Following an Analytical Hierarchy Process, a mathematical model is built that assigns a weight to each environmental parameter depending on its water quality importance for the ecosystem. An advanced system for the monitoring, analysis, and control of water quality is then built in the LabVIEW virtual environment. As a result, we obtain a global score that indicates the condition level of the water quality in the Chirostoma estor ecosystem (excellent, good, regular, or poor), supporting effective decision making about the environmental parameters that affect the proper culture of the white fish, such as temperature, pH, and dissolved oxygen. In situ evaluations show regular conditions for successful reproduction and growth rates of this species, where water quality tends to be at regular levels. This system emerges as a suitable tool for water management, where future laws for white fish fishery regulation would result in the reduction of the mortality rate in the early stages of development of the species, which represent the most critical phase. This can guarantee better population sizes than those currently obtained in aquaculture.
The main benefit will be a contribution to maintaining the cultural and gastronomic wealth of the area and of its inhabitants, since white fish is an important food and source of economic income for the region, but the species is endangered.
Keywords: Chirostoma estor estor, computational system, LabVIEW, white fish
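The weighted scoring behind the global condition level can be sketched as follows; the weights, normalized sub-scores, and category cut-offs are hypothetical stand-ins, not the values derived in the paper's Analytical Hierarchy Process:

```python
# Hypothetical AHP-style weights (summing to 1) and normalized sub-scores in [0, 1].
weights = {"temperature": 0.3, "pH": 0.2, "dissolved_oxygen": 0.5}
sub_scores = {"temperature": 0.8, "pH": 0.9, "dissolved_oxygen": 0.6}

global_score = sum(weights[p] * sub_scores[p] for p in weights)

def condition(score):
    """Map the global score onto the qualitative levels used by the system."""
    if score >= 0.9:
        return "excellent"
    if score >= 0.7:
        return "good"
    if score >= 0.5:
        return "regular"
    return "poor"

print(round(global_score, 2), condition(global_score))  # 0.72 good
```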
Procedia PDF Downloads 322
1370 Study on Effectiveness of Strategies to Re-Establish Landscape Connectivity of Expressways with Reference to Southern Expressway Sri Lanka
Authors: N. G. I. Aroshana, S. Edirisooriya
Abstract:
Highway construction is an emerging development trend in Sri Lanka, and these development activities have given rise to numerous environmental and social issues. Landscape fragmentation is one of the main issues that severely affects the environment through the construction of expressways. The Sri Lankan expressway system attempts to treat the fragmented landscape using highway crossing structures. This paper presents a post-construction landscape study of the effectiveness of landscape connectivity structures in restoring connectivity. The Geographic Information Systems (GIS) least cost path tool was used in two selected plots along 25 km of the expressway to identify animal crossing paths. Animal accident data were used as a measure to determine which plot contributes most to landscape connectivity. Number of patches, mean patch size, and class area were used as parameters to determine the most effective land use class for re-establishing landscape connectivity. The findings show that scrub, grass, and marsh were the land use typologies most positively affected, increasing landscape connectivity; their growth increased by 8% over 12 years. The least cost analysis within plot one showed that 28.5% of all animal crossing structures lie within high-resistance land use classes. The Southern Expressway used reinforced compressed earth technologies for construction, which has restricted the growth of the climax community. According to all findings, it can be assumed that the landscape crossing structures contribute to re-establishing connectivity, but they are not enough to restore the majority of the disturbance caused by the expressway. The connectivity measures used in this study can serve as a tool to re-evaluate future installations of highway crossing structures, and proper placement of these structures increases connectivity.
The study recommends monitoring all stages of the project (pre-construction, construction, and post-construction) as well as the preliminary design, and applying the connectivity assessment strategies used in this research, to overcome the complications of re-establishing landscape connectivity through highway crossing structures that facilitate the growth of flora and fauna.
Keywords: landscape fragmentation, least cost path, land use analysis, landscape connectivity structures
1369 Quantitative Analysis of Three Sustainability Pillars for Water Tradeoff Projects in Amazon
Authors: Taha Anjamrooz, Sareh Rajabi, Hasan Mahmmud, Ghassan Abulebdeh
Abstract:
Water availability, as well as water demand, is not uniformly distributed in time and space. Numerous extra-large water diversion projects have been launched in Amazon to alleviate water scarcity. This research utilizes statistical analysis to examine the temporal and spatial features of 40 extra-large water diversion projects in Amazon. Using a network analysis method, the correlation between seven major basins is measured, while the impact analysis method is employed to explore the associated economic, environmental, and social impacts. The study finds that the development of water diversion in Amazon has passed through four stages, from a preliminary or initial period to a phase of rapid development. The length of water diversion channels and the quantity of water transferred have grown significantly over the past five decades. As of 2015, more than 75 billion m³ of water was transferred in Amazon through 12,000 km of channels, and these projects extend over half of the Amazon area. River Basin E is currently the most significant source of transferred water. Through inter-basin water diversions, Amazon gains the opportunity to enhance its Gross Domestic Product (GDP) by 5%. Nevertheless, construction costs exceed 70 billion US dollars, higher than in any other country. The average cost of transferred water per unit has increased with time and scale but decreases from western to eastern Amazon. Additionally, annual total energy consumption for pumping exceeded 40 billion kilowatt-hours, while the associated greenhouse gas emissions are assessed at 35 million tons. Notably, ecological problems initiated by water diversion affect River Basin B and River Basin D. Due to water diversion, more than 350 thousand individuals have been relocated from their homes.
In order to enhance water diversion sustainability, four categories of innovative measures are provided for decision-makers: development of strategies for water tradeoff projects, improvement of integrated water resource management, formation of water-saving incentives and pricing approaches, and application of ex-post assessment.
Keywords: sustainability, water trade-off projects, environment, Amazon
1368 The Challenge of Assessing Social AI Threats
Authors: Kitty Kioskli, Theofanis Fotis, Nineta Polemi
Abstract:
The European Union (EU) Artificial Intelligence (AI) Act requires in Article 9 that risk management of AI systems include both technical and human oversight, while NIST_AI_RFM (Appendix C) and the ENISA AI Framework recommendations state that further research is needed to understand the current limitations of social threats and human-AI interaction. AI threats arising in social contexts significantly affect the security and trustworthiness of AI systems; they are interrelated and trigger technical threats as well. For example, lack of explainability (e.g., model complexity that is challenging for stakeholders to grasp) leads to misunderstandings, biases, and erroneous decisions, which in turn impact the privacy, security, and accountability of AI systems. Based on the four fundamental NIST criteria for explainability, the explainability threats can be classified into four (4) sub-categories: a) Lack of supporting evidence: AI systems must provide supporting evidence or reasons for all their outputs. b) Lack of understandability: explanations offered by systems should be comprehensible to individual users. c) Lack of accuracy: the provided explanation should accurately represent the system's process of generating outputs. d) Out of scope: the system should only function within its designated conditions or when it possesses sufficient confidence in its outputs. Biases may also stem from historical data reflecting undesired behaviors; when present in the data, biases can permeate the models trained on them, thereby influencing the security and trustworthiness of AI systems. Social AI threats are recognized by various initiatives (e.g., the EU Ethics Guidelines for Trustworthy AI), standards (e.g., ISO/IEC TR 24368:2022 on AI ethical concerns, ISO/IEC AWI 42105 on guidance for human oversight of AI systems) and EU legislation (e.g.,
the General Data Protection Regulation 2016/679, the NIS 2 Directive 2022/2555, the Directive on the Resilience of Critical Entities 2022/2557, the EU AI Act, the Cyber Resilience Act). Measuring social threats, estimating the risks they pose to AI systems, and mitigating them is a research challenge. This paper presents the efforts of two European Commission projects (FAITH and THEMIS) from the Horizon Europe programme that analyse social threats by building cyber-social exercises in order to study human behaviour, traits, cognitive ability, personality, attitudes, interests, and other socio-technical profile characteristics. The research in these projects also includes the development of measurements and scales (psychometrics) for human-related vulnerabilities that can be used to estimate vulnerability severity more realistically, enhancing the CVSS 4.0 measurement.
Keywords: social threats, artificial intelligence, mitigation, social experiment
1367 Exploring Error-Minimization Protocols for Upper-Limb Function During Activities of Daily Life in Chronic Stroke Patients
Authors: M. A. Riurean, S. Heijnen, C. A. Knott, J. Makinde, D. Gotti, J. VD. Kamp
Abstract:
Objectives: The current study is done in preparation for a randomized controlled study investigating the effects of an implicit motor learning protocol implemented using an extension-supporting glove. It will explore different protocols to find out which is preferred when studying motor learning in the chronic stroke population that struggles with hand spasticity. Design: This exploratory study will follow 24 individuals with a chronic stroke (> 6 months) during their usual care journey. We will record the results of two 9-Hole Peg Tests (9HPT) done during their therapy sessions with a physiotherapist or in their home, before and after 4 weeks of wearing an extension-supporting glove used to employ the to-be-studied protocols. The participants will wear the glove 3 times/week for one hour while performing their activities of daily living and record the times they wore it in a diary. Their experience will be monitored through telecommunication once every week. Subjects: Individuals who have had a stroke at least 6 months prior to participation, with hand spasticity of at most 3 on the modified Ashworth Scale and finger flexion motor control of at least 19/33 on the Motricity Index. Exclusion criteria: extreme hemi-neglect. Methods: The participants will be randomly divided into 3 groups: one group using the glove in a pre-set way of decreasing support (implicit motor learning), one group using the glove in a self-controlled way of decreasing support (autonomous motor learning), and the third using the glove with constant support (as control). Before and after the 4-week period, there will be an intake session and a post-assessment session. Analysis: We will compare the results of the two 9HPTs to check whether the protocols were effective. Furthermore, we will compare the results between the three groups to find the preferred one. A qualitative analysis will be run on the experience of participants throughout the 4-week period.
Expected results: We expect that the group using the implicit learning protocol will show superior results.
Keywords: implicit learning, hand spasticity, stroke, error minimization, motor task
1366 Coherent Optical Tomography Imaging of Epidermal Hyperplasia in Vivo in a Mouse Model of Oxazolone Induced Atopic Dermatitis
Authors: Eric Lacoste
Abstract:
Laboratory animals are currently widely used as models of human pathologies in dermatology, such as atopic dermatitis (AD). These models provide a better understanding of the pathophysiology of this complex and multifactorial disease, the discovery of potential new therapeutic targets, and the testing of the efficacy of new therapeutics. However, confirmation of the correct development of AD is mainly based on histology from skin biopsies, requiring invasive surgery or euthanasia of the animals, plus slicing and staining protocols. Accessible imaging technologies now exist, such as Optical Coherence Tomography (OCT), which allow non-invasive visualization of the main histological structures of the skin (stratum corneum, epidermis, and dermis) and assessment of the dynamics of the pathology or of the efficacy of new treatments. Briefly, female immunocompetent hairless mice (SKH1 strain) were sensitized and challenged topically on the back and ears for about 4 weeks. Back skin and ear thickness were measured using a calliper three times per week, in complement to a macroscopic evaluation of atopic dermatitis lesions on the back: erythema, scaling, and excoriation scoring. In addition, OCT was performed on the back and ears of the animals. OCT makes a virtual in-depth section (tomography) of the imaged organ using a laser, a camera, and image processing software, allowing fast, non-contact, and non-denaturing acquisition of the explored tissues. To perform the imaging sessions, the animals were anesthetized with isoflurane and placed on a support under the OCT for a total examination time of 5 to 10 minutes. The results show a good correlation of the OCT technique with classical HES histology for skin lesion structures such as hyperkeratosis, epidermal hyperplasia, and dermis thickness.
This OCT imaging technique can, therefore, be used in live animals at different times for longitudinal evaluation by repeated measurements of lesions in the same animals, in addition to the classical histological evaluation. Furthermore, this original imaging technique speeds up research protocols, reduces the number of animals, and refines the use of laboratory animals.
Keywords: atopic dermatitis, mouse model, oxazolone model, histology, imaging
1365 Carrying Capacity Estimation for Small Hydro Plant Located in Torrential Rivers
Authors: Elena Carcano, James Ball, Betty Tiko
Abstract:
Carrying capacity refers to the maximum population that a given level of resources can sustain over a specific period. In undisturbed environments, the maximum population is determined by the availability and distribution of resources, as well as the competition for their utilization. This information is typically obtained through long-term data collection. In regulated environments, where resources are artificially modified, populations must adapt to changing conditions, which can lead to additional challenges due to fluctuations in resource availability over time and throughout development. An example of this is observed in hydropower plants, which alter water flow and impact fish migration patterns and behaviors. To assess how fish species can adapt to these changes, specialized surveys are conducted, which provide valuable information on fish populations, sample sizes, and density before and after flow modifications. In such situations, it is highly recommended to conduct hydrological and biological monitoring to gain insight into how flow reductions affect species adaptability and to prevent unfavorable exploitation conditions. This analysis involves several planned steps that help design appropriate hydropower production while simultaneously addressing environmental needs. Consequently, the study aims to strike a balance between technical assessment, biological requirements, and societal expectations. Beginning with a small hydro project that requires restoration, this analysis focuses on the lower tail of the Flow Duration Curve (FDC), where both hydrological and environmental goals can be met. The proposed approach involves determining the threshold condition that is tolerable for the most vulnerable species sampled (Telestes Muticellus) by identifying a low flow value from the long-term FDC. 
The results establish a practical connection between hydrological and environmental information and simplify the process by establishing a single reference flow value that represents the minimum environmental flow that should be maintained.
Keywords: carrying capacity, fish bypass ladder, long-term streamflow duration curve, eta-beta method, environmental flow
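The low-flow reading described above can be illustrated with a short sketch: given a long-term streamflow record, build the flow duration curve and read off a low-flow exceedance value. Q95 is used here purely as an illustrative choice; the paper's actual reference flow and the eta-beta method are not reproduced, and the synthetic flow record is an assumption.

```python
import numpy as np

def flow_duration_curve(flows):
    """Sort flows in descending order and attach an exceedance probability to each."""
    q = np.sort(np.asarray(flows, dtype=float))[::-1]
    # Weibull plotting position: P = rank / (n + 1)
    p = np.arange(1, len(q) + 1) / (len(q) + 1)
    return p, q

def exceedance_flow(flows, percent):
    """Flow equalled or exceeded `percent`% of the time (e.g. Q95 for the low-flow tail)."""
    p, q = flow_duration_curve(flows)
    return float(np.interp(percent / 100.0, p, q))

# Synthetic ten-year daily record standing in for real gauge data
rng = np.random.default_rng(0)
daily_flows = rng.lognormal(mean=2.0, sigma=0.8, size=3650)
q95 = exceedance_flow(daily_flows, 95)  # candidate minimum environmental flow
```

Reading the lower tail of the curve this way gives a single reference value, which is exactly the simplification the abstract argues for.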
1364 An Assessment of Female Representation in Philippine Cinema in Comparison to American Cinema (1975 to 2020)
Authors: Amanda Julia Binay, Patricia Elise Suarez
Abstract:
Female representation in media is an important subject in the discussion of gender equality, especially in impactful and influential media like film. As the Filipino film industry continues to grow and evolve, analysis of Filipino female representation on screen is imperative, particularly since limited research has been done on female representation in the Philippine film scene. Thus, the paper aims to analyze the presence and evolution of female representation in Philippine cinema and compare the findings with those of American films, to see how Filipino filmmakers hold their own against the standards of international movements that call for more and better female representation, especially in Hollywood. The sample consisted of Filipino and American films released within the years 1975 to 2020, at five (5) year intervals. Twenty (20) critically acclaimed and highest-grossing Filipino films and twenty (20) critically acclaimed and highest-grossing Hollywood films were then subjected to the Bechdel and Peirce tests to obtain statistical measures of their female representation. The findings of the study reveal that female representation in Philippine film history has been consistent and has continued to grow and evolve throughout the years, with strong female leads with vibrant characteristics and diverse stories. However, analysis of female representation in the American films showed an extreme lack thereof, with more misogynistic, sexist, and limiting ideals. Thus, the study concludes that female representation in the Philippine cinema and film industry holds its own when compared to the American cinema and film industry, and even outperforms it in many aspects, such as the consistent inclusion and depiction of multi-dimensional female leads and female relationships.
Hence, the study implies that women's consistent presence in Philippine cinema mirrors Filipino women's prominent role in Philippine society, and that American cinema must continue to make efforts to change its portrayals of female characters, leads, and relationships to make them more grounded in reality.
Keywords: female representation, gender studies, feminism, Philippine cinema, American cinema, Bechdel test, Peirce test, comparative analysis
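As a rough illustration of how the Bechdel test yields a statistical measure, the sketch below scores a film list against the three classic criteria and computes a pass rate. The field names and sample records are hypothetical; the study's actual coding scheme (and the Peirce test) is not reproduced.

```python
def passes_bechdel(film):
    """Classic Bechdel criteria: the film has (1) at least two named women,
    who (2) talk to each other, (3) about something besides a man.
    `film` is a dict of booleans (hypothetical schema)."""
    return (film["has_two_named_women"]
            and film["women_talk_to_each_other"]
            and film["about_something_besides_a_man"])

def pass_rate(films):
    """Fraction of films that satisfy all three criteria."""
    return sum(passes_bechdel(f) for f in films) / len(films)

# Invented sample: only the first film meets all three criteria
sample = [
    {"has_two_named_women": True, "women_talk_to_each_other": True, "about_something_besides_a_man": True},
    {"has_two_named_women": True, "women_talk_to_each_other": True, "about_something_besides_a_man": False},
    {"has_two_named_women": False, "women_talk_to_each_other": False, "about_something_besides_a_man": False},
]
rate = pass_rate(sample)
```

Comparing such pass rates between the Filipino and Hollywood samples is the kind of statistical measure the abstract describes.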
1363 Investigating Early Markers of Alzheimer’s Disease Using a Combination of Cognitive Tests and MRI to Probe Changes in Hippocampal Anatomy and Functionality
Authors: Netasha Shaikh, Bryony Wood, Demitra Tsivos, Michael Knight, Risto Kauppinen, Elizabeth Coulthard
Abstract:
Background: Effective treatment of dementia will require early diagnosis, before significant brain damage has accumulated. Memory loss is an early symptom of Alzheimer’s disease (AD). The hippocampus, a brain area critical for memory, degenerates early in the course of AD. The hippocampus comprises several subfields. In contrast to healthy aging, where CA3 and the dentate gyrus are the hippocampal subfields with the most prominent atrophy, in AD the CA1 and subiculum are thought to be affected early. Conventional clinical structural neuroimaging is not sufficiently sensitive to identify preferential atrophy in individual subfields. Here, we will explore the sensitivity of new magnetic resonance imaging (MRI) sequences designed to interrogate medial temporal regions as an early marker of Alzheimer’s. As a combination of tests is likely to predict early AD better than any single test, we look at the potential efficacy of such imaging alone and in combination with standard and novel cognitive tasks of hippocampal-dependent memory. Methods: 20 patients with mild cognitive impairment (MCI), 20 with mild-moderate AD, and 20 age-matched healthy elderly controls (HC) are being recruited to undergo 3T MRI (with sequences designed to allow volumetric analysis of hippocampal subfields) and a battery of cognitive tasks (including Paired Associative Learning from CANTAB, the Hopkins Verbal Learning Test, and a novel hippocampal-dependent abstract word memory task). AD participants and healthy controls are being tested just once, whereas patients with MCI will be tested twice, a year apart. We will compare subfield size between groups and correlate subfield size with cognitive performance on our tasks. In the MCI group, we will explore the relationship between subfield volume, cognitive test performance, and deterioration in clinical condition over a year.
Results: Preliminary data (currently on 16 participants: 2 AD; 4 MCI; 9 HC) have revealed subfield size differences between subject groups. Patients with AD perform less accurately on tasks of hippocampal-dependent memory, and MCI patient performance and reaction times also differ from healthy controls. With further testing, we hope to delineate how subfield-specific atrophy corresponds with changes in cognitive function, and to characterise how this progresses over the time course of the disease. Conclusion: Novel sequences on an MRI scanner, such as those en route to clinical use, can be used to delineate hippocampal subfields in patients with and without dementia. Preliminary data suggest that such subfield analysis, perhaps in combination with cognitive tasks, may be an early marker of AD.
Keywords: Alzheimer's disease, dementia, memory, cognition, hippocampus
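The planned correlation between subfield volume and cognitive performance can be sketched as a simple Pearson correlation. The CA1 volumes and memory scores below are invented illustrative numbers, not the study's data, and the subfield and task names are assumptions.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical CA1 volumes (mm^3) paired with verbal-memory scores:
ca1_volume = [620, 580, 540, 610, 500, 470, 650, 520]
memory_score = [28, 26, 22, 27, 20, 18, 30, 21]
r = pearson_r(ca1_volume, memory_score)  # positive r: smaller CA1, worse memory
```

In the actual study, such a coefficient would be computed per subfield and per task, and tracked in the MCI group across the two sessions a year apart.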
1362 An Investigation on Opportunities and Obstacles on Implementation of Building Information Modelling for Pre-fabrication in Small and Medium Sized Construction Companies in Germany: A Practical Approach
Authors: Nijanthan Mohan, Rolf Gross, Fabian Theis
Abstract:
The conventional methods used in the construction industry often result in significant rework, since most decisions are taken onsite under the pressure of project deadlines and also due to improper information flow, which results in ineffective coordination. However, today’s architecture, engineering, and construction (AEC) stakeholders demand faster and more accurate deliverables, efficient buildings, and smart processes, which turns out to be a tall order. Hence, the building information modelling (BIM) concept was developed as a solution to fulfill the above-mentioned necessities. Even though BIM is successfully implemented in much of the world, it is still in the early stages in Germany, since stakeholders are sceptical of its reliability and efficiency. Due to the huge capital requirement, small and medium-sized construction companies are still reluctant to implement the BIM workflow in their projects. The purpose of this paper is to analyse the opportunities and obstacles to implementing BIM for prefabrication. Among all the advantages of BIM, prefabrication is chosen for this paper because it plays a vital role in creating an impact on both the time and cost factors of a construction project. The positive impact of prefabrication can be explicitly observed by the project stakeholders and participants, which enables a breakthrough of the scepticism factor among small-scale construction companies. The analysis consists of the development of a process workflow for implementing prefabrication in building construction, followed by a practical approach, which was executed with two case studies. The first case study represents on-site prefabrication, and the second was done for off-site prefabrication.
The first case study was planned to give workers at the site first-hand experience with the BIM model, so that they could make full use of the created BIM model, which is a better representation than the traditional 2D plan. The main aim of the first case study was to build confidence in the implementation of BIM models, which was followed by the execution of off-site prefabrication in the second case study. Based on the case studies, a cost and time analysis was made, and it is inferred that implementing BIM for prefabrication can reduce construction time, ensure minimal or no waste, improve accuracy, and reduce problem-solving at the construction site. It is also observed that this process requires more planning time and better communication and coordination between different disciplines such as mechanical, electrical, plumbing, and architecture, which was the major obstacle to successful implementation. This paper was written from the perspective of small and medium-sized mechanical contracting companies in the private building sector in Germany.
Keywords: building information modelling, construction wastes, pre-fabrication, small and medium sized company
1361 Covid Medical Imaging Trial: Utilising Artificial Intelligence to Identify Changes on Chest X-Ray of COVID
Authors: Leonard Tiong, Sonit Singh, Kevin Ho Shon, Sarah Lewis
Abstract:
Investigation into the use of artificial intelligence in radiology continues to develop at a rapid rate. During the coronavirus pandemic, the combination of an exponential increase in chest X-rays and unpredictable staff shortages placed a huge strain on the department's workload. The World Health Organisation estimates that two-thirds of the global population does not have access to diagnostic radiology. Therefore, there could be demand for a program that detects acute changes in imaging compatible with infection to assist with screening. We generated a convolutional neural network and tested its efficacy in recognizing changes compatible with coronavirus infection. Following ethics approval, a deidentified set of 77 normal and 77 abnormal chest X-rays in patients with confirmed coronavirus infection was used to generate an algorithm that could train, validate, and then test itself. DICOM and PNG image formats were selected because they are lossless. The model was trained with 100 images (50 positive, 50 negative), validated against 28 samples (14 positive, 14 negative), and tested against 26 samples (13 positive, 13 negative). The initial training involved teaching the convolutional neural network what constituted a normal study and which changes on the X-rays were compatible with coronavirus infection. The weightings were then modified, and the model was executed again. The training samples were in batch sizes of 8 and underwent 25 epochs of training. The results trended towards an 85.71% true positive/true negative detection rate and an area under the curve trending towards 0.95, indicating approximately 95% accuracy in detecting changes on chest X-rays compatible with coronavirus infection. Study limitations include access to only a small dataset and no specificity in the diagnosis.
Following a discussion with our programmer, there are areas where modifications to the weighting of the algorithm can be made in order to improve the detection rates. Given the high detection rate of the program and the potential ease of implementation, it could assist staff who are not trained in radiology in detecting otherwise subtle changes that might not be appreciated on imaging. Limitations include the lack of a differential diagnosis and of application of the appropriate clinical history, although this may be less of a problem in day-to-day clinical practice. It is nonetheless our belief that implementing this program and widening its scope to detect multiple pathologies, such as lung masses, will greatly assist both the radiology department and our colleagues by increasing workflow and detection rates.
Keywords: artificial intelligence, COVID, neural network, machine learning
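The reported evaluation metrics can be illustrated with a minimal sketch that computes accuracy at a fixed threshold and the area under the ROC curve via the rank-sum (Mann-Whitney) formulation. The labels and scores below are toy values, not the trial's outputs.

```python
def auc_from_scores(labels, scores):
    """AUC as the probability that a randomly chosen positive case
    receives a higher score than a randomly chosen negative case."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 * (p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def accuracy(labels, scores, threshold=0.5):
    """Fraction of cases classified correctly at a fixed decision threshold."""
    preds = [1 if s >= threshold else 0 for s in scores]
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

# Toy hold-out set: 1 = changes compatible with infection, 0 = normal study
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.6, 0.2, 0.1]  # model confidence per chest x-ray
auc = auc_from_scores(labels, scores)
acc = accuracy(labels, scores)
```

Note that AUC and threshold accuracy are distinct quantities, which is why the abstract reports both a detection rate and an area under the curve.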
1360 Performance Assessment of Horizontal Axis Tidal Turbine with Variable Length Blades
Authors: Farhana Arzu, Roslan Hashim
Abstract:
Renewable energy is the only alternative source of energy able to meet current energy demand while preserving a healthy environment and supporting future growth, which is considered essential for sustainable development. Marine renewable energy is one of the major means to meet this demand, and turbines (both horizontal and vertical axis) play a vital role in the extraction of tidal energy. The influence of swept area on the performance improvement of tidal turbines is a vital factor to study for reducing the relatively high power generation cost in the marine industry. This study concentrates on the performance investigation of the variable-length-blade tidal turbine concept, which has already been proved an efficient way to improve energy extraction in the wind industry. The variable blade length concept utilizes the idea of increasing swept area through blade extension when the tidal stream velocity falls below the rated condition, to maximize energy capture, while the blade retracts above the rated condition. A three-bladed variable-length-blade horizontal-axis tidal turbine was modelled by modifying a standard fixed-length-blade turbine. A numerical investigation based on classical blade element momentum theory has been carried out using the QBlade software to predict performance. The results obtained from QBlade were compared with available published results and found to be in very good agreement. Three major performance parameters (i.e., thrust, moment, and power coefficients) and the power output for different blade extensions were studied and compared with a standard fixed-bladed baseline turbine at certain operational conditions. Substantial improvement in the performance coefficient is observed with the increase in swept area of the turbine rotor. Power generation is found to increase greatly when operating at below-rated tidal stream velocity, reducing the associated cost per unit of electric power generated.
Keywords: variable length blade, performance, tidal turbine, power generation
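The role of swept area can be made concrete with the standard kinetic-power relation P = ½ρAv³Cp: extending the blades grows the swept area A with the square of the rotor radius. The radii, stream velocity, and power coefficient below are illustrative assumptions, not values from the paper.

```python
import math

RHO_SEAWATER = 1025.0  # kg/m^3, typical seawater density

def rotor_power(radius_m, velocity_ms, cp):
    """Hydrodynamic power captured by a rotor: P = 0.5 * rho * A * v^3 * Cp,
    with swept area A = pi * R^2."""
    area = math.pi * radius_m ** 2
    return 0.5 * RHO_SEAWATER * area * velocity_ms ** 3 * cp

# Below-rated tidal stream: compare the baseline rotor with extended blades
base = rotor_power(radius_m=8.0, velocity_ms=1.5, cp=0.40)
extended = rotor_power(radius_m=10.0, velocity_ms=1.5, cp=0.40)
gain = extended / base  # area scales with R^2: (10/8)^2 = 1.5625
```

At the same velocity and Cp, the power ratio depends only on the radius ratio squared, which is why blade extension at below-rated flow pays off so directly.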
1359 Toxicological and Histopathological Studies on the Effect of Tartrazine in Male Albino Rats
Authors: F. Alaa Ali, S. A. Sherein Abdelgayed, S. Osama. EL-Tawil, M. Adel Bakeer
Abstract:
Tartrazine is an organic azo dye food additive widely used in foods, drugs, and cosmetics. The present study aimed to investigate the toxic effects of tartrazine on kidney and liver biomarkers, in addition to investigating oxidative stress and histopathological changes in the liver and kidneys, in 30 male rats. Tartrazine was orally administered daily at a dose of 200 mg/kg bw (1/10 LD50) for sixty days. Serum and tissue samples were collected at the end of the experiment to investigate the underlying mechanism of tartrazine through assessment of oxidative stress markers (glutathione (GSH), superoxide dismutase (SOD), and malondialdehyde (MDA)) and biochemical markers (alanine aminotransferase (ALT), aspartate aminotransferase (AST), total protein, and urea). Liver and kidney tissues were collected and preserved in 10% formalin for histopathological examination. The obtained values were statistically analyzed by one-way analysis of variance (ANOVA) followed by a multiple comparison test. Biochemical analysis revealed that tartrazine induced a significant increase in serum ALT, AST, total protein, and urea levels compared to the control group. Tartrazine caused a significant decrease in liver GSH and SOD values compared to the control group, and an increase in liver MDA compared to the control group. Histopathology of the liver showed diffuse vacuolar degeneration in the hepatic parenchyma, and the portal area showed severe changes in the hepatoportal blood vessels and the bile ducts. The kidneys showed degenerated tubules at the cortex, together with mononuclear leucocyte inflammatory cell infiltration, and perivascular edema with inflammatory cell infiltration surrounding the congested and hyalinized vascular walls of blood vessels. The present study indicates that subchronic exposure to tartrazine has a toxic effect on the liver and kidneys, together with induction of oxidative stress through the formation of free radicals.
Therefore, people should avoid the hazards of consuming tartrazine.
Keywords: albino rats, tartrazine, toxicity, pathology
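The one-way ANOVA comparison described above can be sketched as follows. The serum ALT values are invented for illustration (the study's actual data are not reproduced), and SciPy's `f_oneway` stands in for whatever statistics package was used.

```python
from scipy import stats

# Hypothetical serum ALT values (U/L) for six rats per group
control = [42.0, 45.5, 40.2, 44.1, 43.3, 41.8]
tartrazine = [61.4, 58.9, 63.2, 60.7, 59.5, 62.1]

# One-way ANOVA: H0 is that both group means are equal
f_stat, p_value = stats.f_oneway(control, tartrazine)
significant = p_value < 0.05  # reject H0 at the 5% level
```

With more than two groups (e.g. several dose levels), a significant F statistic would then be followed by the multiple comparison test the abstract mentions to locate which pairs differ.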
1358 Cross-Validation of the Data Obtained for ω-6 Linoleic and ω-3 α-Linolenic Acids Concentration of Hemp Oil Using Jackknife and Bootstrap Resampling
Authors: Vibha Devi, Shabina Khanam
Abstract:
Hemp (Cannabis sativa) possesses a rich content of ω-6 linoleic and ω-3 α-linolenic essential fatty acids in the ratio of 3:1, a rare and highly desired ratio that enhances the quality of hemp oil. These components are beneficial for cell development and body growth, strengthen the immune system, possess anti-inflammatory action, lower the risk of heart problems owing to their anti-clotting property, and are a remedy for arthritis and various disorders. The present study employs a supercritical fluid extraction (SFE) approach on hemp seed at various conditions of the parameters temperature (40-80) °C, pressure (200-350) bar, flow rate (5-15) g/min, particle size (0.430-1.015) mm, and amount of co-solvent (0-10) % of solvent flow rate, through a central composite design (CCD). The CCD suggested 32 sets of experiments, which were carried out. As the SFE process includes a large number of variables, the present study recommends the application of resampling techniques for cross-validation of the obtained data. Cross-validation refits the model on each resample to obtain information regarding the error, variability, deviation, etc. Bootstrap and jackknife are the most popular resampling techniques, which create a large number of datasets by resampling from the original dataset and analyze them to check the validity of the obtained data. Jackknife resampling is based on eliminating one observation from the original sample of size N without replacement. For jackknife resampling, the sample size is therefore 31 (eliminating one observation), and the procedure is repeated 32 times. Bootstrap is the frequently used statistical approach for estimating the sampling distribution of an estimator by resampling with replacement from the original sample. For bootstrap resampling, the sample size is 32, and the procedure was repeated 100 times. The estimands for these resampling techniques are the mean, standard deviation, variation coefficient, and standard error of the mean.
For the ω-6 linoleic acid concentration, the mean value was approximately 58.5% for both resampling methods, which is the average (central value) of the sample means of all data points. Similarly, for the ω-3 α-linolenic acid concentration, the mean was observed to be 22.5% through both resampling methods. Variance exhibits the spread of the data about its mean: a greater variance indicates a larger range of output data, which is 18 for ω-6 linoleic acid (ranging from 48.85 to 63.66%) and 6 for ω-3 α-linolenic acid (ranging from 16.71 to 26.2%). Further, the low standard deviation (approximately 1%), low standard error of the mean (< 0.8), and low variation coefficient (< 0.2) reflect the accuracy of the sample for prediction. All the estimator values of variation coefficient, standard deviation, and standard error of the mean are found within the 95% confidence interval.
Keywords: resampling, supercritical fluid extraction, hemp oil, cross-validation
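The two resampling schemes can be sketched in a few lines: jackknife leaves out one observation per resample (n resamples of size n-1, no replacement), while bootstrap draws n observations with replacement per resample. The concentration values below are mock numbers near the reported ω-6 mean, and only the mean is shown as estimand.

```python
import random
import statistics

def jackknife_means(data):
    """Leave-one-out resamples: n resamples of size n-1, without replacement."""
    return [statistics.mean(data[:i] + data[i + 1:]) for i in range(len(data))]

def bootstrap_means(data, n_boot=100, seed=0):
    """n_boot resamples of size n drawn with replacement from the sample."""
    rng = random.Random(seed)
    n = len(data)
    return [statistics.mean(rng.choices(data, k=n)) for _ in range(n_boot)]

# Mock ω-6 concentrations (%) standing in for the 32 experimental runs
data = [58.1, 59.3, 57.8, 60.2, 58.9, 57.5, 59.0, 58.4]
jk = jackknife_means(data)
bs = bootstrap_means(data)
center = statistics.mean(data)  # resampled means should cluster around this
```

The spread of `jk` and `bs` around `center` is what yields the resampled standard deviation, variation coefficient, and standard error reported in the study.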
Procedia PDF Downloads 139
1357 Assessment of Water Quality of Euphrates River at Babylon Governorate for Drinking, Irrigation and General Use, Using Water Quality Index (Canadian Version) (CCME WQI)
Authors: Amer Obaid Saud
Abstract:
Water quality index (WQI) is considered an effective tool for categorizing water resources by quality and suitability for different uses. The Canadian version of the water quality index (CCME WQI), which is based on comparing water quality parameters to regulatory standards and gives a single value for the water quality of a source, was applied in this study to assess the water quality of the Euphrates River in Iraq at Babylon Governorate, south of Baghdad, and to determine its suitability for the aquatic environment (GWQI), drinking water (PWSI) and irrigation (IWQI). Five stations were selected on the river in Babylon (Euphrates River/AL-Musiab, Hindia barrage, two stations at Hilla city, and a fifth station at Al-Hshmeya north of Hilla). Fifteen water samples were collected every month from August 2013 to July 2014 at the study sites and analyzed for physico-chemical parameters (temperature, pH, electrical conductivity, total dissolved solids (TDS), total suspended solids (TSS), total alkalinity, total hardness, calcium and magnesium concentrations) and nutrients (nitrite, nitrate, phosphate); the concentrations of some heavy metals (Fe, Pb, Zn, Cu, Mn and Cd) in water were also studied, and measurements were compared to benchmarks such as guidelines and objectives to assess changes in water quality. Applying the CCME WQI, the irrigation water quality index (IWQI) of the Euphrates River was highest at site one during the second seasonal period (83, Good), while the lowest value was at the second station during the fourth seasonal period (66, Fair). For the potable water supply index (PWSI), the highest value was at the fifth site during the second period (68, Fair), while the lowest was at the second site during the first seasonal period (42, Poor). The highest value of the general water quality index (GWQI) was at site five during the second seasonal period (74, Fair), and the lowest was at the second site during the first seasonal period (48, Marginal). 
It was observed that the main causes of deterioration in water quality were unprotected river sites, high anthropogenic activities, and the direct discharge of industrial effluent.
Keywords: Babylon governorate, Canadian version, water quality, Euphrates river
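The CCME WQI combines three factors (scope F1, frequency F2, and amplitude F3) into a single 0-100 score. A minimal sketch, assuming for simplicity that every objective is an upper limit (some real guidelines, e.g. a pH range, are two-sided):

```python
import math

def ccme_wqi(tests, objectives):
    """CCME WQI from measured values and guideline maxima.
    tests: dict variable -> list of measured values
    objectives: dict variable -> guideline upper limit
    """
    # F1 (scope): % of variables that fail their objective at least once
    failed_vars = [v for v in tests if any(x > objectives[v] for x in tests[v])]
    f1 = 100 * len(failed_vars) / len(tests)
    # F2 (frequency): % of individual tests that fail
    n_tests = sum(len(vals) for vals in tests.values())
    n_failed = sum(1 for v, vals in tests.items() for x in vals if x > objectives[v])
    f2 = 100 * n_failed / n_tests
    # F3 (amplitude): normalized sum of excursions beyond the objectives
    excursions = [x / objectives[v] - 1
                  for v, vals in tests.items() for x in vals if x > objectives[v]]
    nse = sum(excursions) / n_tests
    f3 = nse / (0.01 * nse + 0.01)
    return 100 - math.sqrt(f1**2 + f2**2 + f3**2) / 1.732
```

The resulting score maps to the CCME categories Excellent (95-100), Good (80-94), Fair (65-79), Marginal (45-64) and Poor (0-44), which matches labels such as "(83, Good)" and "(42, Poor)" reported in the abstract.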
Procedia PDF Downloads 397
1356 Carbonaceous Monolithic Multi-Channel Denuders as a Gas-Particle Partitioning Tool for the Occupational Sampling of Aerosols from Semi-Volatile Organic Compounds
Authors: Vesta Kohlmeier, George C. Dragan, Juergen Orasche, Juergen Schnelle-Kreis, Dietmar Breuer, Ralf Zimmermann
Abstract:
Aerosols from hazardous semi-volatile organic compounds (SVOC) may occur in workplace air and can be found simultaneously in the particle and gas phases. For health risk assessment, it is necessary to collect particles and gases separately. This can be achieved by using a denuder for gas phase collection, combined with a filter and an adsorber for particle collection. The study focused on the suitability of carbonaceous monolithic multi-channel denuders, so-called NovaCarb™ denuders (MastCarbon International Ltd., Guilford, UK), for achieving gas-particle separation. Particle transmission efficiency experiments were performed with polystyrene latex (PSL) particles (size range 0.51-3 µm), while the time-dependent gas phase collection efficiency was analysed for polar and nonpolar SVOC (mass concentrations 7-10 mg/m³) over 2 h at 5 or 10 l/min. The experimental gas phase collection efficiency was also compared with theoretical predictions. For n-hexadecane (C16), the gas phase collection efficiency was at most 91 % for one denuder and at most 98 % for two denuders, while for diethylene glycol (DEG), a maximal gas phase collection efficiency of 93 % for one denuder and 97 % for two denuders was observed. At 5 l/min, higher gas phase collection efficiencies were achieved than at 10 l/min. The deviations between the theoretical and experimental gas phase collection efficiencies were up to 5 % for C16 and 23 % for DEG. Since the theoretical efficiency depends on the geometric shape and length of the denuder, the flow rate, and the diffusion coefficients of the tested substances, the theoretical values define an upper limit which could be reached. Regarding particle transmission through the denuders, the use of one denuder showed transmission efficiencies around 98 % for 1-3 µm particle diameters; the use of three denuders resulted in transmission efficiencies of 93-97 % for the same particle sizes. 
In summary, NovaCarb™ denuders are well suited for sampling aerosols of polar and nonpolar substances with particle diameters ≤ 3 µm at flow rates of 5 l/min or lower. These properties and their compact size make them suitable for use in personal aerosol samplers. This work is supported by the German Social Accident Insurance (DGUV), research contract FP371.
Keywords: gas phase collection efficiency, particle transmission, personal aerosol sampler, SVOC
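Theoretical gas phase collection efficiencies of the kind compared against above are commonly computed from the Gormley-Kennedy solution for diffusional deposition in laminar flow through a cylindrical tube. A hedged sketch (the monolith's channels are not perfectly cylindrical, and the channel count and dimensions used below are illustrative assumptions, not the NovaCarb specifications):

```python
import math

def gormley_kennedy_penetration(D, L, Q):
    """Fraction of gas penetrating a cylindrical channel under laminar flow.
    D: diffusion coefficient [m^2/s], L: channel length [m], Q: flow per channel [m^3/s].
    """
    mu = math.pi * D * L / Q  # dimensionless deposition parameter
    if mu < 0.02:  # short-tube series expansion
        return 1 - 2.56 * mu**(2 / 3) + 1.2 * mu + 0.177 * mu**(4 / 3)
    return (0.819 * math.exp(-3.657 * mu)
            + 0.097 * math.exp(-22.3 * mu)
            + 0.032 * math.exp(-57.0 * mu))

def collection_efficiency(D, L, Q_total, n_channels):
    """Efficiency of a monolith: total flow splits equally over parallel channels."""
    return 1 - gormley_kennedy_penetration(D, L, Q_total / n_channels)
```

Because the penetration rises with flow rate, this model reproduces the qualitative finding above that 5 l/min yields higher collection efficiencies than 10 l/min.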
Procedia PDF Downloads 175
1355 Determinants of Post-Psychotic Depression in Schizophrenia Patients in ACSH and Mekelle Hospital, Tigray, Ethiopia, 2019
Authors: Ashenafi Ayele, Shewit Haftu, Tesfalem Araya
Abstract:
Background: “Post-psychotic depression”, “post-schizophrenic depression”, and “secondary depression” have all been used to describe the occurrence of depressive symptoms during the chronic phase of schizophrenia. Post-psychotic depression is the most common cause of death by suicide in schizophrenia patients. The overall lifetime risk for patients with schizophrenia is 50% for suicide attempts and 9-13% for completed suicide, and the condition is also associated with poor prognosis and poor quality of life. Objective: To assess the determinants of post-psychotic depression in schizophrenia patients in ACSH and Mekelle General Hospital, Tigray, Ethiopia, 2019. Methods: An institution-based unmatched case-control study was conducted among 69 cases and 138 controls, with a case-to-control ratio of 1:2. The sample size was calculated using Epi Info 3.1 to assess the determinant factors of post-psychotic depression in schizophrenia patients. The cases were schizophrenia patients who had been diagnosed at least one year earlier and had been stable for two months, and the controls were any patients diagnosed with schizophrenia. Study subjects were selected using a consecutive sampling technique. The Calgary Depression Scale for Schizophrenia, a self-administered questionnaire, was used. Before the interview, each client’s capacity to give the intended information was assessed using the University of California, San Diego Brief Assessment of Capacity to Consent (UBACC). Bivariate and multiple logistic regression analyses were performed to determine associations between the independent and dependent variables. Significant independent predictors were declared at a 95% confidence interval and a P-value of less than 0.05. 
Result: Females were affected by post-psychotic depression (AOR = 2.01, 95% CI: 1.003-4.012, P = 0.049). Patients with a mild form of positive symptoms of schizophrenia were affected by post-psychotic depression (AOR = 4.05, 95% CI: 1.888-8.78, P = 0.001). Patients with a minimal form of negative symptoms of schizophrenia were affected by post-psychotic depression (AOR = 4.23, 95% CI: 1.081-17.092, P = 0.038). Conclusion: In this study, sex (female) and the presence of positive and negative symptoms of schizophrenia were significantly associated with post-psychotic depression. It is recommended that post-psychotic depression be assessed in every schizophrenia patient to decrease the severity of illness and to improve patients’ quality of life.
Keywords: determinants, post-psychotic depression, Mekelle city
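Adjusted odds ratios like those reported above come from multiple logistic regression; the crude (unadjusted) version for a single binary factor can be computed directly from a 2x2 table with a Wald-type confidence interval. A sketch with hypothetical counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a 95% Wald CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts: female cases/controls vs. male cases/controls
or_, lower, upper = odds_ratio_ci(40, 29, 55, 83)
```

An interval whose lower bound only just exceeds 1 (e.g. 1.003 for sex above) corresponds to a P-value just under 0.05, which is why such results sit at the edge of significance.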
Procedia PDF Downloads 120
1354 Neonatal Subcutaneous Fat Necrosis with Severe Hypercalcemia: Case Report
Authors: Atitallah Sofien, Bouyahia Olfa, Krifi Farah, Missaoui Nada, Ben Rabeh Rania, Yahyaoui Salem, Mazigh Sonia, Boukthir Samir
Abstract:
Introduction: Subcutaneous fat necrosis of the newborn (SCFN) is a rare acute hypodermatitis characterized by skin lesions in the form of infiltrated, hard plaques and subcutaneous nodules with a purplish-red color, occurring between the first and sixth week of life. SCFN is generally a benign condition that regresses spontaneously without sequelae, but it can be complicated by severe hypercalcemia. Methodology: This is a retrospective case report of neonatal subcutaneous fat necrosis complicated by severe hypercalcemia and nephrocalcinosis. Results: The case is a female newborn with a family history of a hypothyroid mother on Levothyrox, born to non-consanguineous parents from a well-monitored pregnancy. The newborn was delivered by cesarean section at 39 weeks of gestation due to severe preeclampsia. She was admitted to the neonatal intensive care unit at 1 hour of life for the management of grade 1 perinatal asphyxia and immediate neonatal respiratory distress of a transient nature. Hospitalization was complicated by a healthcare-associated infection requiring intravenous antibiotics for ten days, with a good clinical and biological response. On the 20th day of life, she developed skin lesions in the form of indurated purplish-red nodules on the back and both arms, and SCFN was suspected. A calcium level test returned a result of 3 mmol/L; the rest of the phosphocalcic assessment was normal, with early signs of nephrocalcinosis observed on renal ultrasound. The diagnosis of SCFN complicated by nephrocalcinosis associated with severe hypercalcemia was made, and the condition improved with intravenous hydration and corticosteroid therapy. Conclusion: SCFN is a rare and generally benign hypodermatitis of the newborn with an etiology that is still poorly understood. Despite its benign nature, SCFN can be complicated by hypercalcemia, which can sometimes be life-threatening. 
Therefore, it is important to conduct a thorough skin examination of newborns, especially those with risk factors, in order to detect and correct any potential hypercalcemia.
Keywords: subcutaneous fat necrosis, newborn, hypercalcemia, nephrocalcinosis
Procedia PDF Downloads 56
1353 Explore and Reduce the Performance Gap between Building Modelling Simulations and the Real World: Case Study
Authors: B. Salehi, D. Andrews, I. Chaer, A. Gillich, A. Chalk, D. Bush
Abstract:
With the rapid increase in building energy consumption in recent years, driven by population growth and expanding economies, energy savings in buildings have become more critical than ever. One of the key factors in keeping energy consumption controlled and at a minimum is utilising building energy modelling at the very early stages of design, so building modelling and simulation is a growing discipline. During the design phase of construction, modelling software can be used to estimate a building’s projected energy consumption as well as its performance. The growth in the use of building modelling software packages opens the door to improvements both in design and in the modelling itself, for example through building information modelling-based software packages that promote conventional building energy modelling into the digital building design process. To identify the most effective implementation tools, research projects should include elements of real-world experiments and not rely solely on theoretical and simulated approaches. A review of related studies shows that most are based on modelling and simulation alone, which may be due to the more expensive and time-consuming nature of studies based on real-time data. Taking into account the recent rise of building energy modelling software packages and the increasing number of studies utilising these methods, the accuracy and reliability of these packages have become ever more critical. Limited accuracy manifests as the Energy Performance Gap: the discrepancy between predicted energy savings and realised actual savings, especially after buildings implement energy-efficient technologies. Many different software packages are available, either free or in commercial versions. 
In this study, IES VE (Integrated Environmental Solutions Virtual Environment) is used, as it is a common building energy modelling and simulation software package in the UK. This paper describes a study that compares real-time results with those of a virtual model to illustrate this gap. The subject of the study is a north-west (345°) facing, naturally ventilated conservatory within a domestic building in London, monitored during summer to capture real-time data; these results are then compared to the corresponding virtual results from IES VE. In this project, the effect of an incorrect blind position on overheating is studied, providing new evidence of the Performance Gap. Furthermore, the challenges of representing solar shading products as inputs in IES VE are considered.
Keywords: building energy modelling and simulation, integrated environmental solutions virtual environment, IES VE, performance gap, real time data, solar shading products
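Quantifying the performance gap itself reduces to a simple comparison between the modelled and monitored series. A minimal sketch with hypothetical temperature readings (these are not the study's monitored data, and the sign convention is an assumption):

```python
def performance_gap(predicted, measured):
    """Relative gap between a modelled and a measured value,
    positive when the model under-predicts the measurement."""
    return (measured - predicted) / predicted

# Hypothetical hourly conservatory air temperatures (°C):
# simulated output vs. monitored readings
predicted = [24.1, 25.0, 26.3, 27.8]
measured = [25.0, 26.4, 28.1, 30.2]
gaps = [performance_gap(p, m) for p, m in zip(predicted, measured)]
```

In this toy case, every gap is positive, i.e. the model consistently under-predicts overheating, which is the kind of systematic discrepancy the study's real-time monitoring is designed to expose.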
Procedia PDF Downloads 137