Search results for: nonprofit organizations-national data maturity index (NDI)
23610 Use of Statistical Correlations for the Estimation of Shear Wave Velocity from Standard Penetration Test-N-Values: Case Study of Algiers Area
Authors: Soumia Merat, Lynda Djerbal, Ramdane Bahar, Mohammed Amin Benbouras
Abstract:
Along with shear wave velocity, many soil parameters are associated with the standard penetration test (SPT) as a dynamic in situ experiment. SPT-N data and geophysical data often do not exist for the same area, so statistical analysis of the correlation between these parameters is an alternative way to estimate Vₛ conveniently, without additional investigations or data acquisition. Shear wave velocity is a basic engineering tool required to define the dynamic properties of soils. In many instances, engineers opt for empirical correlations between shear wave velocity (Vₛ) and reliable static field test data, such as standard penetration test (SPT) N-values or cone penetration test (CPT) values, to estimate shear wave velocity or dynamic soil parameters. The relation between Vₛ and SPT-N values for the Algiers area is predicted using the collected data and is also compared with previously suggested formulas for Vₛ determination by measuring the Root Mean Square Error (RMSE) of each model. The Algiers area is situated in a high seismic zone (Zone III [RPA 2003: règlement parasismique algérien]), so the study is important for this region. The principal aim of this paper is to compare the field measurements of the down-hole test with the empirical models, to show which of the proposed formulas is applicable for predicting shear wave velocity values.
Keywords: empirical models, RMSE, shear wave velocity, standard penetration test
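The model comparison described here boils down to scoring each empirical Vs-SPT correlation by its RMSE against down-hole measurements. A minimal Python sketch, where the power-law coefficients and the paired N/Vs data are purely illustrative stand-ins (the paper's actual correlations and Algiers measurements are not reproduced in the abstract):

```python
import math

def rmse(predicted, observed):
    """Root mean square error between model predictions and field measurements."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed))

# Hypothetical power-law correlations Vs = a * N^b (coefficients are illustrative only)
models = {
    "model_A": lambda n: 80.0 * n ** 0.33,
    "model_B": lambda n: 97.0 * n ** 0.31,
}

# Illustrative paired SPT-N values and down-hole Vs measurements (m/s)
spt_n = [10, 20, 30, 40]
vs_measured = [180.0, 230.0, 270.0, 300.0]

scores = {name: rmse([f(n) for n in spt_n], vs_measured) for name, f in models.items()}
best_model = min(scores, key=scores.get)  # lowest RMSE wins
```

Each candidate formula is evaluated on the same measurement set, and the formula with the lowest RMSE is the one recommended for the region.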
Procedia PDF Downloads 338
23609 Evaluation of Triage Performance: Nurse Practice and Problem Classifications
Authors: Atefeh Abdollahi, Maryam Bahreini, Babak Choobi Anzali, Fatemeh Rasooli
Abstract:
Introduction: Triage is a central part of the organization of care in the emergency department (ED). It is used to describe the sorting of patients by treatment priority in the ED. Accurate triage of injured patients has reduced fatalities and improved resource usage. In addition, nurses' knowledge and skill are important factors in triage decision-making. The ability to define an appropriate triage level and the need for intervention is crucial for safe and effective emergency care. Methods: This is a prospective cross-sectional study of emergency nurses working in four public university hospitals. Five triage workshops were conducted, one every three months, for emergency nurses, based on a standard Emergency Severity Index (ESI) IV triage slide set approved by the Iranian Ministry of Health. The items most influential on triage performance were discussed through brainstorming in the workshops and were then peer reviewed by an expert panel of five emergency physicians and two head registered nurses. The factors that might distract nurses' attention from proper decisions included patients' past medical diseases, the natural pitfalls of triage, and system failure. After permission had been obtained, emergency nurses participated in the study and were given the structured questionnaire. Data were analysed with SPSS 21.0. Results: 92 emergency nurses enrolled in the study. 30% of nurses reported a past history of chronic disease as the most influential confounding factor in ascertaining the triage level; other important factors were a history of prior admission and past histories of myocardial infarction and heart failure, at 20, 17 and 11%, respectively. Regarding difficulties in triage practice, 54.3% reported that discussion with patients and family members was difficult, and 8.7% declared that it is hard to stay in a single triage room all day. Among the participants, 45.7 and 26.1% evaluated the triage workshops as moderately and highly effective, respectively. 56.5% reported overcrowding as the most important system-based difficulty. Nurses were mainly doubtful when differentiating between triage levels 2 and 3 according to the ESI IV system. No significant correlation was found between the nurses' work record in triage and their uncertainty in determining the triage level or the reported difficulties. Conclusion: The work record of nurses appeared to have little effect on triage problems and issues. To correct the deficits, training workshops should be carried out, followed by continuous refresher training and supportive supervision.
Keywords: assessment, education, nurse, triage
Procedia PDF Downloads 233
23608 A New Authenticable Steganographic Method via the Use of Numeric Data on Public Websites
Authors: Che-Wei Lee, Bay-Erl Lai
Abstract:
A new steganographic method using numeric data on public websites, with self-authentication capability, is proposed. The proposed technique transforms a secret message into partial shares by Shamir's (k, n)-threshold secret sharing scheme with n = k + 1. The generated k+1 partial shares are then embedded into selected numeric items on a website as if they were part of the website's numeric content. Afterward, a receiver links to the website and extracts every combination of k shares among the k+1 from the stego numeric content to compute k+1 copies of the secret; the value consistency of the computed k+1 copies is taken as evidence of whether the extracted message is authentic, attaining the goal of self-authentication of the extracted secret message. Experimental results and discussions are provided to show the feasibility and effectiveness of the proposed method.
Keywords: steganography, data hiding, secret authentication, secret sharing
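The self-authentication step can be sketched directly from the description: split the secret with a degree-(k-1) polynomial into n = k + 1 shares, then reconstruct from every k-subset and demand that all k + 1 candidate secrets agree. A Python sketch over a prime field (the field size and the (k, n) parameters here are illustrative choices, not the paper's):

```python
import random

P = 2**31 - 1  # a Mersenne prime; all share arithmetic is done modulo P

def make_shares(secret, k):
    """Split `secret` into n = k + 1 shares with threshold k (Shamir's scheme)."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, k + 2)]  # k + 1 shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from any k shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # modular inverse via Fermat
    return secret

def extract_and_authenticate(shares):
    """Compute k + 1 candidate secrets, one per k-subset; all must agree."""
    candidates = [reconstruct(shares[:i] + shares[i + 1:]) for i in range(len(shares))]
    return candidates[0], all(c == candidates[0] for c in candidates)
```

Tampering with any embedded share breaks the consistency of the k + 1 reconstructions, which is exactly the authentication signal the method relies on.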
Procedia PDF Downloads 243
23607 A Novel Approach to Design of EDDR Architecture for High Speed Motion Estimation Testing Applications
Authors: T. Gangadhararao, K. Krishna Kishore
Abstract:
Motion estimation (ME) plays a critical role in a video coder, so testing such a module is of priority concern. Focusing on the testing of ME in a video coding system, this work presents an error detection and data recovery (EDDR) design, based on the residue-and-quotient (RQ) code, to embed into ME for video coding testing applications. An error in the processing elements (PEs), the key components of an ME, can be detected and recovered effectively by using the proposed EDDR design. The proposed EDDR design for ME testing can detect errors and recover data with an acceptable area overhead and timing penalty.
Keywords: area overhead, data recovery, error detection, motion estimation, reliability, residue-and-quotient (RQ) code
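The RQ principle behind such a design can be illustrated in a few lines: an operand is encoded as a (quotient, residue) pair with respect to a fixed modulus, an error is detected by re-deriving the pair, and an intact pair lets the data be recovered. A software sketch of the idea in Python (the modulus of 64 is an arbitrary illustrative choice, not the paper's hardware parameter):

```python
def rq_encode(value, m=64):
    """Residue-and-quotient (RQ) encoding: value = quotient * m + residue."""
    return value // m, value % m

def rq_check(value, quotient, residue, m=64):
    """Detect a processing error by re-deriving the RQ pair and comparing."""
    return (value // m, value % m) == (quotient, residue)

def rq_recover(quotient, residue, m=64):
    """Recover the original data from an intact RQ pair."""
    return quotient * m + residue
```

A mismatch between the recomputed and stored pair flags a faulty PE output, and the stored pair then reproduces the correct datum.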
Procedia PDF Downloads 432
23606 3 Phase Induction Motor Control Using Single Phase Input and GSM
Authors: Pooja S. Billade, Sanjay S. Chopade
Abstract:
This paper focuses on the design of three-phase induction motor control using single-phase input and GSM. The controller used in this work is a wireless speed controller based on GSM, a technique that proves to be very efficient and reliable in applications. The most common principle is the constant V/Hz principle, which requires that the magnitude and frequency of the voltage applied to the stator of a motor maintain a constant ratio. By doing this, the magnitude of the magnetic field in the stator is kept at an approximately constant level throughout the operating range, and thus the maximum constant torque-producing capability is maintained. The energy that a switching power converter delivers to a motor is controlled by pulse width modulated (PWM) signals applied to the gates of the power transistors in an H-bridge configuration. PWM signals are pulse trains with fixed frequency and magnitude and variable pulse width. When a PWM signal is applied to the gate of a power transistor, it causes the turn-on and turn-off intervals of the transistor to change from one PWM period to the next.
Keywords: PIC, GSM (global system for mobile), LCD (liquid crystal display), IM (induction motor)
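The constant V/Hz rule maps each commanded frequency to a stator voltage that keeps the ratio fixed (clamped at the rated voltage, commonly with a small low-speed boost), and the PWM duty cycle follows from the commanded voltage. A numerical sketch in Python; the 400 V / 50 Hz rating, the boost and the DC-link value are illustrative assumptions, not taken from the paper:

```python
def vhz_setpoint(freq_hz, rated_v=400.0, rated_hz=50.0, boost_v=8.0):
    """Constant V/Hz control: stator voltage scales linearly with frequency,
    clamped at rated voltage; a small boost offsets stator resistance at low speed."""
    v = boost_v + (rated_v - boost_v) * freq_hz / rated_hz
    return min(v, rated_v)

def pwm_duty(v_command, v_dc=565.0):
    """Fraction of the PWM period the H-bridge must stay on to synthesize v_command."""
    return min(v_command / v_dc, 1.0)
```

Above rated frequency the voltage clamps at its rated value, so the motor enters the field-weakening region where constant torque can no longer be maintained.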
Procedia PDF Downloads 448
23605 Splenic Artery Aneurysms: A Rare, Insidious Cause of Abdominal Pain
Authors: Christopher Oyediran, Nicola Ubayasiri, Christopher Gough
Abstract:
Splenic artery aneurysms are often clinically occult and are occasionally identified incidentally on imaging. The pathogenesis of aneurysms is complex, but certain factors are thought to contribute to their development. Given the potentially fatal complications of rupture, a high index of suspicion is required to make an early diagnosis. We present the case of a 36-year-old female with a history of endometriosis and multiple sclerosis who presented to the Emergency Department with sudden-onset epigastric pain and collapse. On arrival, she was pale and clammy with profound tachycardia and hypotension. An ultrasound performed in the resuscitation department revealed abdominal free fluid. She was resuscitated with blood and transferred for emergency laparotomy, which revealed massive haemoperitoneum from the spleen. She underwent emergency splenectomy, and inspection of the spleen revealed a splenic artery aneurysm. She received our massive transfusion protocol, followed by a short stay on the ITU, made a good post-operative recovery, and was discharged home a week later.
Keywords: aneurysm, human chorionic gonadotrophin (hCG), resuscitation, laparotomy
Procedia PDF Downloads 431
23604 An Effective Route to Control of the Safety of Accessing and Storing Data in the Cloud-Based Data Base
Authors: Omid Khodabakhshi, Amir Rozdel
Abstract:
Cloud computing security research faces a number of challenges because the data center holds complex private information and is constantly exposed to the risk of information disclosure through hacker attacks or internal adversaries. Accordingly, the security of virtual machines in the cloud computing infrastructure layer is very important. So far, there are many software solutions for improving security in virtual machines, but software alone is not enough to solve the security problems. The purpose of this article is to examine the challenges and security requirements for accessing and storing data in an insecure cloud environment. In other words, this article proposes a structure for running highly isolated, security-sensitive code using secure computing hardware in virtual environments. It also allows remote validation of code together with its inputs and outputs. These security features are provided even in situations where the BIOS, the operating system, and even the hypervisor are compromised. To achieve these goals, we use the hardware support provided by the new Intel and AMD processors, as well as the TPM security chip. In conclusion, the use of these technologies ultimately creates a dynamic root of trust and reduces the trusted computing base (TCB) to the security-sensitive code.
Keywords: code, cloud computing, security, virtual machines
Procedia PDF Downloads 191
23603 Conceptual Perimeter Model for Estimating Building Envelope Quantities
Authors: Ka C. Lam, Oluwafunmibi S. Idowu
Abstract:
Building girth is important in building economics and is widely used in the take-off of quantities for various cost items. The literature suggests that the use of conceptual quantities can improve the accuracy of cost models, and the girth or perimeter of a building can be used to estimate conceptual quantities. Hence, the current paper aims to model the perimeter-area function of building shapes for use at the conceptual design stage. A detailed literature review of existing building shape indexes was carried out. An empirical approach was used to study the relationship between the area and the shortest side of a four-sided orthogonal polygon, and a mathematical approach was then used to establish the observed relationships. The empirical results obtained were in agreement with the mathematical model developed. A new equation termed the “conceptual perimeter equation” is proposed. The equation can be used to estimate building envelope quantities such as external wall area, external finishing area and scaffolding area before sketch or detailed drawings are prepared.
Keywords: building envelope, building shape index, conceptual quantities, cost modelling, girth
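For the simplest shape covered by the study, a four-sided orthogonal (rectangular) plan, the perimeter-area relationship can be written from the floor area A and the shortest side s as P = 2(s + A/s), and envelope quantities follow by multiplying the girth by the facade height. A sketch in Python; note the paper's own "conceptual perimeter equation" is not given in the abstract, so this is only the rectangular special case:

```python
def conceptual_perimeter(area, shortest_side):
    """Perimeter of a four-sided orthogonal (rectangular) plan expressed
    through its floor area A and shortest side s: P = 2 * (s + A / s)."""
    return 2.0 * (shortest_side + area / shortest_side)

def external_wall_area(area, shortest_side, storey_height, storeys=1):
    """Conceptual envelope quantity: girth times total facade height."""
    return conceptual_perimeter(area, shortest_side) * storey_height * storeys
```

For a fixed area, the perimeter is minimized by a square plan and grows as the plan becomes more elongated, which is why the shortest side is a useful shape descriptor.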
Procedia PDF Downloads 343
23602 Identifying the Factors Affecting the Success of Energy Saving in the Municipality of Tehran
Authors: Rojin Bana Derakhshan, Abbas Toloie
Abstract:
To optimize and improve energy efficiency in buildings, the key elements of successful energy-consumption optimization must be identified before any action is taken. Principal component analysis, one of the most valuable tools of linear algebra, is helpful here, since simple non-parametric methods become confusing. An energy management system was therefore implemented according to the international energy management standard ISO 50001:2011, and all energy parameters in the building were measured through an energy audit. In this paper, data mining is used to determine the key elements influencing energy saving in buildings. The approach is based on statistical data mining techniques, using a feature selection method and fuzzy logic to convert the data from massive to compressed form and to reinforce the selected features. In addition, the contribution of each energy-consuming element to the overall energy dissipation, in percent, is identified as a separate norm using the results of the energy audit, after measurement of all energy-consuming parameters and identification of the variables. Accordingly, the energy saving solutions are divided into three categories: low-, medium- and high-expense solutions.
Keywords: energy saving, key elements of success, optimization of energy consumption, data mining
Procedia PDF Downloads 468
23601 Analyzing the Evolution of Adverse Events in Pharmacovigilance: A Data-Driven Approach
Authors: Kwaku Damoah
Abstract:
This study presents a comprehensive data-driven analysis to understand the evolution of adverse events (AEs) in pharmacovigilance. Utilizing data from the FDA Adverse Event Reporting System (FAERS), we employed three analytical methods: rank-based, frequency-based, and percentage change analyses. These methods assessed temporal trends and patterns in AE reporting, focusing on various drug-active ingredients and patient demographics. Our findings reveal significant trends in AE occurrences, with both increasing and decreasing patterns from 2000 to 2023. This research highlights the importance of continuous monitoring and advanced analysis in pharmacovigilance, offering valuable insights for healthcare professionals and policymakers to enhance drug safety.
Keywords: event analysis, FDA adverse event reporting system, pharmacovigilance, temporal trend analysis
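Two of the three analyses named above are simple to state concretely: a frequency-based ranking of AE counts, and the year-over-year percentage change of a reporting series. A Python sketch on invented counts (FAERS data itself is not reproduced here):

```python
def pct_change(series):
    """Year-over-year percentage change of adverse-event counts."""
    return [100.0 * (b - a) / a for a, b in zip(series, series[1:])]

def rank_events(counts_by_event):
    """Rank-based analysis: order adverse events by report frequency, descending."""
    return sorted(counts_by_event, key=counts_by_event.get, reverse=True)

# Illustrative yearly counts for one hypothetical drug-event pair
yearly = [120, 150, 135]
changes = pct_change(yearly)  # rising then falling reporting

# Illustrative event frequencies for one hypothetical active ingredient
ranking = rank_events({"nausea": 410, "headache": 520, "rash": 95})
```

Tracking how an event's rank and percentage change move across years is what surfaces the increasing and decreasing reporting patterns the study describes.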
Procedia PDF Downloads 48
23600 Comparative Study Between Continuous Versus Pulsed Ultrasound in Knee Osteoarthritis
Authors: Karim Mohamed Fawzy Ghuiba, Alaa Aldeen Abd Al Hakeem Balbaa, Shams Elbaz
Abstract:
Objectives: To compare the effects of continuous and pulsed ultrasound on pain and function in patients with knee osteoarthritis. Design: Randomized, single-blinded study. Participants: 6 patients with knee osteoarthritis, mean age 53.66±3.61 years, Altman grade II or III. Interventions: Subjects were randomly assigned to two groups; Group A received continuous ultrasound and Group B received pulsed ultrasound. Outcome measures: The effects of pulsed and continuous ultrasound were evaluated by pain threshold, assessed by visual analogue scale (VAS) scores, and function, assessed by Western Ontario and McMaster Universities osteoarthritis index (WOMAC) scores. Results: There was no significant decrease in VAS and WOMAC scores in patients treated with pulsed or continuous ultrasound, and there were no significant differences between the two groups. Conclusion: There is no difference between the effects of pulsed and continuous ultrasound on pain relief or functional outcome in patients with knee osteoarthritis.
Keywords: knee osteoarthritis, pulsed ultrasound, ultrasound therapy, continuous ultrasound
Procedia PDF Downloads 285
23599 Effect of Dietary Cellulose Levels on the Growth Parameters of Nile Tilapia Oreochromis Niloticus Fingerlings
Authors: Keri Alhadi Ighwela, Aziz Bin Ahmad, A. B. Abol-Munafi
Abstract:
Three purified diets were formulated using fish meal, soya bean, wheat flour, palm oil, minerals and maltose. The carbohydrate in the diets was increased from 5 to 15% by changing the cellulose content, to study the effect of dietary carbohydrate level on the growth parameters of Nile tilapia Oreochromis niloticus. The protein and lipid contents were kept constant in all the diets. The results showed that weight gain, protein efficiency ratio, net protein utilisation and hepatosomatic index of fish fed the diet containing 15% cellulose were the lowest among all groups. In addition, the fish fed the diet containing 5% cellulose had the best specific growth rate and food conversion ratio, while the dietary cellulose level had no effect on condition factor or survival rate. These results indicate that Nile tilapia fingerlings are able to utilize dietary cellulose for optimum growth provided it does not exceed 10% of their feed.
Keywords: dietary cellulose, growth parameters, oreochromis niloticus, purified diets
Procedia PDF Downloads 511
23598 Microbial Quality of Beef and Mutton in Bauchi Metropolis
Authors: Abdullahi Mohammed
Abstract:
The microbial quality of beef and mutton sold in four major markets of the Bauchi metropolis was assessed in order to assist in ascertaining safety. Shops were selected from the 'Muda Lawal', 'Yelwa', 'Wunti', and 'Gwallameji' markets. The total bacterial count was used as the index of quality. A total of thirty-two (32) samples were collected in two successive visits. The samples were packed and labelled in sterile polythene bags for transportation to the laboratory. Microbial analysis was carried out immediately upon arrival under aseptic conditions, with the aerobic plate count used to determine the microbial load. Results showed that beef and mutton from Gwallameji had the highest bacterial counts, 9.065 × 10⁵ cfu/ml for beef and 8.325 × 10⁵ cfu/ml for mutton, followed by Wunti market (6.95 × 10⁵ beef and 4.838 × 10⁵ mutton) and Muda Lawal (4.86 × 10⁵ cfu/ml beef and 5.998 × 10⁵ cfu/ml mutton). Yelwa had 5.175 × 10⁵ and 5.30 × 10⁵ for beef and mutton, respectively. Bacterial species isolated from the samples were Escherichia coli, Salmonella spp., Streptococcus spp. and Staphylococcus spp. However, results obtained from all markets showed that there was no significant difference between beef and mutton in terms of microbial quality.
Keywords: beef, mutton, salmonella, sterile
Procedia PDF Downloads 460
23597 Genetic Diversity Analysis in Triticum Aestivum Using Microsatellite Markers
Authors: Prachi Sharma, Mukesh Kumar Rana
Abstract:
In the present study, simple sequence repeat (SSR) markers were used to analyse the genetic diversity of 37 genotypes of Triticum aestivum. The DNA was extracted using the CTAB method and quantified using a fluorimeter. The annealing temperatures for 27 primer pairs were standardized using gradient PCR, out of which 16 primer pairs gave satisfactory amplification at temperatures ranging from 50 to 62 °C. Of these 16 polymorphic SSR markers, only 10 primer pairs were used in the study, generating 34 reproducible amplicons among the 37 genotypes, of which 30 were polymorphic. Primer pairs Xgwm533, Xgwm160, Xgwm408, Xgwm120, Xgwm186 and Xgwm261 produced the maximum percentage of polymorphic bands (100%); the primers yielded an average of 3.4 bands each. The genetic relationship was determined using the Jaccard pairwise similarity coefficient and UPGMA cluster analysis with the NTSYSpc 2 software. The values of the similarity index range from 0 to 1; the observed similarity coefficients ranged from 0.13 to 0.97. The minimum genetic similarity (0.13) was observed between VL 804 and HPW 288, meaning they are only 13% similar. A larger set of available SSR markers would be useful for supporting the genetic diversity analysis in the above wheat genotypes.
Keywords: wheat, genetic diversity, microsatellite, polymorphism
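The Jaccard pairwise similarity behind the cluster analysis is computed band-by-band from the 0/1 amplicon matrix: shared bands divided by bands present in either genotype. A Python sketch with invented profiles (the real 34-amplicon matrix is not given in the abstract); the resulting similarity matrix is what UPGMA clustering would consume:

```python
def jaccard(a, b):
    """Jaccard similarity for binary band profiles (1 = band present)."""
    both = sum(1 for x, y in zip(a, b) if x and y)
    either = sum(1 for x, y in zip(a, b) if x or y)
    return both / either if either else 0.0

# Illustrative 0/1 amplicon profiles for three hypothetical genotypes
profiles = {
    "G1": [1, 1, 0, 1, 0],
    "G2": [1, 1, 0, 0, 0],
    "G3": [0, 0, 1, 0, 1],
}

# Upper-triangular pairwise similarity matrix
similarity = {
    (i, j): jaccard(profiles[i], profiles[j])
    for i in profiles for j in profiles if i < j
}
```

A coefficient of 0.13, as between VL 804 and HPW 288, means only 13% of the bands scored in either genotype are shared by both.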
Procedia PDF Downloads 614
23596 Agglomerative Hierarchical Clustering Using the Tθ Family of Similarity Measures
Authors: Salima Kouici, Abdelkader Khelladi
Abstract:
In this work, we begin with a presentation of the Tθ family of usual similarity measures for multidimensional binary data. Subsequently, some properties of these measures are proposed. Finally, the impact of the use of different inter-element measures on the results of agglomerative hierarchical clustering methods is studied.
Keywords: binary data, similarity measure, Tθ measures, agglomerative hierarchical clustering
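Similarity measures of this kind are functions of the four cell counts of the 2×2 contingency table of two binary vectors. The abstract does not spell out the Tθ parametrization, so as a stand-in the sketch below computes those counts and two classical measures of the same family (Jaccard and simple matching):

```python
def contingency(x, y):
    """Counts (a, b, c, d) for two binary vectors:
    a = 1/1 matches, b = 1/0, c = 0/1, d = 0/0 matches."""
    a = sum(1 for u, v in zip(x, y) if u == 1 and v == 1)
    b = sum(1 for u, v in zip(x, y) if u == 1 and v == 0)
    c = sum(1 for u, v in zip(x, y) if u == 0 and v == 1)
    d = sum(1 for u, v in zip(x, y) if u == 0 and v == 0)
    return a, b, c, d

def jaccard(x, y):
    """Ignores joint absences (d)."""
    a, b, c, _ = contingency(x, y)
    return a / (a + b + c)

def simple_matching(x, y):
    """Counts joint absences (d) as agreement."""
    a, b, c, d = contingency(x, y)
    return (a + d) / (a + b + c + d)
```

Whether a measure counts joint absences as agreement is exactly the kind of choice that changes the inter-element distances and hence the agglomerative clustering result.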
Procedia PDF Downloads 482
23595 Climate Change and Its Effects on Terrestrial Insect Diversity in Mukuruthi National Park, Nilgiri Biosphere Reserve, Tamilnadu, India
Authors: M. Elanchezhian, C. Gunasekaran, A. Agnes Deepa, M. Salahudeen
Abstract:
In recent years, climate change has become one of the most significant emerging threats facing biodiversity, both animal and plant species. Elevated carbon dioxide and ozone concentrations, extreme temperatures, changes in rainfall patterns, and insect-plant interactions are the main factors that affect biodiversity. The present study examines climate change and its effects on terrestrial insect diversity in Mukuruthi National Park, a protected area of the Western Ghats in India. Sampling was done seasonally at three areas using pitfall traps over the period from January to December 2013. The statistical analysis was carried out with the Shannon-Wiener diversity index (H). A significant seasonal variation pattern was detected for total insect diversity at the different study areas. In total, nine orders of insects were recorded. The diversity and abundance of terrestrial insects show marked differences between the natural forest, shola forest and grassland areas.
Keywords: biodiversity, climate change, mukuruthi national park, terrestrial invertebrates
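The Shannon-Wiener index used for the seasonal comparisons is H = -Σ pᵢ ln pᵢ, where pᵢ is the proportion of trapped individuals belonging to order i. A minimal Python sketch on made-up abundance counts:

```python
import math

def shannon_wiener(counts):
    """Shannon-Wiener diversity index H = -sum(p_i * ln p_i) over taxon counts."""
    total = sum(counts)
    return -sum((n / total) * math.log(n / total) for n in counts if n > 0)

# Hypothetical per-order abundances for two habitats (values are illustrative)
h_forest = shannon_wiener([40, 35, 30, 25, 20, 15, 10, 8, 5])   # fairly even community
h_grassland = shannon_wiener([120, 10, 5, 3, 2, 1, 1, 1, 1])    # dominated by one order
```

H rises with both the number of orders present and the evenness of individuals among them, which is why an evenly spread community scores higher than one dominated by a single order.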
Procedia PDF Downloads 516
23594 High Resolution Sandstone Connectivity Modelling: Implications for Outcrop Geological and Its Analog Studies
Authors: Numair Ahmed Siddiqui, Abdul Hadi bin Abd Rahman, Chow Weng Sum, Wan Ismail Wan Yousif, Asif Zameer, Joel Ben-Awal
Abstract:
Advances in data capture from outcrop studies have made possible the acquisition of high-resolution digital data, offering improved and economical reservoir modelling methods. Terrestrial laser scanning utilizing LiDAR (light detection and ranging) provides a new way to build outcrop-based reservoir models, which supply a crucial piece of information for understanding heterogeneities in sandstone facies through high-resolution images and data sets. This study presents the detailed application of an outcrop-based sandstone facies connectivity model, combining information gathered from traditional fieldwork with detailed digital point-cloud data from LiDAR to develop an intermediate, small-scale reservoir sandstone facies model of the Miocene Sandakan Formation, Sabah, East Malaysia. The software RiSCAN PRO (v1.8.0) was used for digital data collection and post-processing, with an accuracy of 0.01 m and a point acquisition rate of up to 10,000 points per second. We provide an accurate and descriptive workflow to triangulate point clouds of different sets of sandstone facies with well-marked top and bottom boundaries, in conjunction with field sedimentology. This provides a highly accurate qualitative sandstone facies connectivity model, which is a challenge to obtain from subsurface datasets (i.e., seismic and well data). Finally, by applying this workflow, we can build an outcrop-based static connectivity model, which can serve as an analogue for subsurface reservoir studies.
Keywords: LiDAR, outcrop, high resolution, sandstone facies, connectivity model
Procedia PDF Downloads 226
23593 Meat Qualities and Death on Arrival (DOA) of Broiler Chickens Transported in a Brazilian Tropical Conditions
Authors: Arlan S. Freitas, Leila M. Carvalho, Adriana L. Soares, Arnoud Neto, Marta S. Madruga, Elza I. Ida, Massami Shimokomaki
Abstract:
The objective of this work was to evaluate the influence of the microclimatic profile of broiler transport trucks under commercial conditions on breast meat quality and DOA (death on arrival) in a tropical Brazilian region, the Northeast, where the year is routinely divided into dry and wet seasons, the temperature remains fairly constant, and the relative humidity changes accordingly. Three loads of 4,100 forty-seven-day-old broilers were monitored from farm to slaughterhouse over a distance of 4.3 km, in the mornings of rainy days in October 2015. The profile of the environmental variables inside the container truck throughout the journey was obtained by installing thermo-anemometers at 6 different locations to monitor the heat index (HI), air velocity (AV), temperature (T), and relative humidity (RH). Meat quality was evaluated by determining the occurrence of PSE (pale, soft, exudative) meat and DFD (dark, firm, dry) meat. The percentage of birds DOA per loaded truck was determined by counting the dead broilers during the hanging step at the slaughtering plant. The analysis of variance was performed using statistical software (Statistica 8 for Windows, Statsoft 2007, Tulsa, OK, USA). The Tukey significance test (P<0.05) was applied to compare means from the microenvironmental data, PSE, DFD and DOA. Fillet samples were collected at 24 h post mortem for pH and colour (L*, a* and b*) determination through the CIELAB system. Results showed the occurrence of 2.98% PSE, 0.66% DFD and only 0.016% DOA, and overall the most uncomfortable container location was the front lower position of the truck, presenting 6.25% PSE. DFD values of 2.0% were obtained from birds located at the central and lower rear locations. These values were unexpected in comparison to results obtained in our laboratories in previous experiments carried out in the southern states of the country, as the results reported herein were lower in every aspect. A reasonable explanation would be the shorter distance, the wet conditions throughout the roughly 15-20 min journeys, and the lower T and RH values observed; samples taken from the rear location accordingly yielded higher DFD values. These facts mean the animals were not under heat-stress conditions but in fact under cold-stress conditions, as suggested by the DFD results in association with the low number of DOA.
Keywords: cold stress, DFD, microclimatic profile, PSE
Procedia PDF Downloads 235
23592 Alveolar Ridge Preservation in Post-extraction Sockets Using Concentrated Growth Factors: A Split-Mouth, Randomized, Controlled Clinical Trial
Authors: Sadam Elayah
Abstract:
Background: One of the most critical competencies in advanced dentistry is alveolar ridge preservation after exodontia. The aim of this clinical trial was to assess the impact of autologous concentrated growth factor (CGF) as a socket-filling material and its ridge preservation properties following lower third molar extraction. Materials and Methods: A total of 60 sides of 30 participants who had completely symmetrical bilateral impacted lower third molars were enrolled. The short-term outcome variables were wound healing, swelling and pain, clinically assessed at different time intervals (1st, 3rd and 7th days). The long-term outcome variables were bone height and width, bone density and socket surface area in the coronal section. Cone beam computed tomography images were obtained immediately after surgery and three months after surgery as the temporal measures. Randomization was achieved with opaque, sealed envelopes. Follow-up data were compared to baseline using paired and unpaired t-tests. Results: The wound healing index was significantly better on the test sides (P=0.001). Regarding facial swelling, the test sides had significantly lower values than the control sides, particularly on the 1st (1.01±.57 vs 1.55±.56) and 3rd days (1.42±0.8 vs 2.63±1.2) postoperatively; nonetheless, the swelling disappeared by the 7th day on both sides. The visual analog scale pain scores showed no statistically significant difference between the sides on the 1st day; meanwhile, the pain scores were significantly lower on the test sides compared with the control sides on the 3rd (P=0.001) and 7th days (P˂0.001) postoperatively. Regarding long-term outcomes, the CGF sites had higher values in height and width when compared to the control sites (buccal wall 32.9±3.5 vs 29.4±4.3 mm, lingual wall 25.4±3.5 vs 23.1±4 mm, and alveolar bone width 21.07±1.55 vs 19.53±1.90 mm, respectively). Bone density showed significantly higher values at the CGF sites than at the control sites (coronal half 200±127.3 vs -84.1±121.3, apical half 406.5±103 vs 64.2±158.6, respectively). There was also a significant difference between the sites in the reduction of periodontal pockets. Conclusion: CGF application following surgical extraction provides an easy, low-cost, and efficient option for alveolar ridge preservation. Thus, dentists may be encouraged to use CGF during dental extractions, particularly when alveolar ridge preservation is required.
Keywords: platelet, extraction, impacted teeth, alveolar ridge, regeneration, CGF
Procedia PDF Downloads 67
23591 Relationship of Sleep Duration with Obesity and Dietary Intake
Authors: Seyed Ahmad Hosseini, Makan Cheraghpour, Saeed Shirali, Roya Rafie, Matin Ghanavati, Arezoo Amjadi, Meysam Alipour
Abstract:
Background: There is a mutual relationship between sleep duration and obesity. We studied the relationship of sleep duration with obesity and dietary intake. Methods: This cross-sectional study was conducted on 444 male students at Ahvaz Jundishapur University of Medical Sciences. Dietary intake was analyzed with a food frequency questionnaire (FFQ), and anthropometric indices were analyzed. Participants were asked about their sleep duration and were categorized into three groups according to their responses (less than six hours, between six and eight hours, and more than eight hours). Results: Macronutrient, micronutrient, and antioxidant intake did not show significant differences between the three groups. Moreover, we did not observe any significant differences in anthropometric indices (weight, body mass index, waist circumference, and percentage body fat). Conclusions: Our results show no significant relationship between sleep duration, nutrition pattern, and obesity. Further study is recommended.
Keywords: sleep duration, obesity, dietary intake, cross-sectional
Procedia PDF Downloads 342
23590 Statistical Inferences for GQARCH-Itô-Jumps Model Based on the Realized Range Volatility
Authors: Fu Jinyu, Lin Jinguan
Abstract:
This paper introduces a novel approach that unifies two types of models: the continuous-time jump-diffusion used to model high-frequency data, and the discrete-time GQARCH employed to model low-frequency financial data, by embedding the discrete GQARCH structure with jumps in the instantaneous volatility process. This model is named the “GQARCH-Itô-Jumps model.” We adopt realized range-based threshold estimation for the high-frequency financial data rather than realized return-based volatility estimators, which entail the loss of intra-day information on the price movement. Meanwhile, a quasi-likelihood function for the low-frequency GQARCH structure with jumps is developed for the parametric estimation. The asymptotic theory is established for the proposed estimators, mainly in the case of finite-activity jumps. Moreover, simulation studies are implemented to check the finite sample performance of the proposed methodology. Specifically, it is demonstrated how our proposed approaches can be practically used on some financial data.
Keywords: Itô process, GQARCH, leverage effects, threshold, realized range-based volatility estimator, quasi-maximum likelihood estimate
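The realized range-based idea is concrete enough to sketch: over each intra-day interval, square the log high-low range, sum, and scale by 1/(4 ln 2) (the Parkinson scaling); a threshold variant additionally discards intervals whose squared range is too large to be purely diffusive. The Python sketch below is a simplified illustration of that estimator only, not the paper's full GQARCH-Itô-Jumps machinery, and the threshold rule is a stylized stand-in:

```python
import math

def realized_range_volatility(highs, lows):
    """Realized range-based variance over intra-day intervals:
    sum of squared log high/low ranges, scaled by 1 / (4 ln 2)."""
    return sum(math.log(h / l) ** 2 for h, l in zip(highs, lows)) / (4.0 * math.log(2.0))

def thresholded_rrv(highs, lows, threshold):
    """Threshold variant: intervals whose squared log range exceeds `threshold`
    are discarded as jump-contaminated (a simplified sketch of the idea)."""
    terms = [math.log(h / l) ** 2 for h, l in zip(highs, lows)]
    return sum(t for t in terms if t <= threshold) / (4.0 * math.log(2.0))
```

Because the range uses the interval's extremes rather than only its endpoints, it retains intra-day price-path information that close-to-close return estimators discard.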
Procedia PDF Downloads 158
23589 Enhancing Disaster Resilience: Advanced Natural Hazard Assessment and Monitoring
Authors: Mariza Kaskara, Stella Girtsou, Maria Prodromou, Alexia Tsouni, Christodoulos Mettas, Stavroula Alatza, Kyriaki Fotiou, Marios Tzouvaras, Charalampos Kontoes, Diofantos Hadjimitsis
Abstract:
Natural hazard assessment and monitoring are crucial in managing the risks associated with fires, floods, and geohazards, particularly in regions prone to these natural disasters, such as Greece and Cyprus. Recent technological advancements, developed by the BEYOND Center of Excellence of the National Observatory of Athens, have been successfully applied in Greece and are now set to be transferred to Cyprus. The implementation of these advanced technologies in Greece has significantly improved the country's ability to respond to these natural hazards. For wildfire risk assessment, a scalar wildfire occurrence risk index is created from the predictions of machine learning models. Predicting fire danger is crucial for the sustainable management of forest fires, as it provides essential information for designing effective prevention measures and facilitates response planning for potential fire incidents. A reliable forecast of fire danger is a key component of integrated forest fire management and is heavily influenced by various factors that affect fire ignition and spread. The fire risk model is validated using sensitivity and specificity metrics. For flood risk assessment, a multi-faceted approach is employed, including the application of remote sensing techniques, the collection and processing of data from the most recent population and building census, technical studies and field visits, and hydrological and hydraulic simulations. All input data are used to create precise flood hazard maps for various flooding scenarios, together with detailed flood vulnerability and flood exposure maps, which are finally combined into the flood risk map. Critical points are identified and mitigation measures are proposed for the worst-case scenario: refuge areas are defined and escape routes are designed. Flood risk maps can help raise awareness and save lives.
Validation is carried out through historical flood events using remote sensing data and records from the civil protection authorities. For geohazard monitoring (e.g., landslides, subsidence), Synthetic Aperture Radar (SAR) and optical satellite imagery are combined with geomorphological and meteorological data and other contributing factors for landslides and ground deformation. To monitor critical infrastructure, including dams, advanced InSAR methodologies are used to identify surface movements over time. Monitoring these hazards provides valuable information for understanding the underlying processes and could lead to early warning systems that protect people and infrastructure. Validation is carried out through both geotechnical expert evaluations and visual inspections. The success of these systems in Greece has paved the way for their transfer to Cyprus to enhance its capabilities in natural hazard assessment and monitoring. The transfer is being made through capacity-building activities, fostering continuous collaboration between Greek and Cypriot experts. Apart from the knowledge transfer, small demonstration actions are implemented to showcase the effectiveness of these technologies in real-world scenarios. In conclusion, the transfer of advanced natural hazard assessment technologies from Greece to Cyprus represents a significant step forward in enhancing the region's resilience to disasters. The EXCELSIOR project funds knowledge exchange, demonstration actions, and capacity-building activities and is committed to empowering Cyprus with the tools and expertise to effectively manage and mitigate the risks associated with these natural hazards. Acknowledgement: The authors acknowledge the 'EXCELSIOR' (ERATOSTHENES: Excellence Research Centre for Earth Surveillance and Space-Based Monitoring of the Environment) H2020 Widespread Teaming project.
Keywords: earth observation, monitoring, natural hazards, remote sensing
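The sensitivity/specificity validation mentioned for the fire-danger model reduces to ratios over the confusion matrix; a minimal sketch (the labels and predictions below are illustrative):

```python
def sensitivity_specificity(y_true, y_pred):
    """Validation metrics of a binary fire-occurrence model:
    sensitivity = TP / (TP + FN), specificity = TN / (TN + FP).
    Labels: 1 = fire occurred, 0 = no fire."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)
```

Reporting both metrics matters here because fire days are rare, so plain accuracy would reward a model that never predicts fire.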
Procedia PDF Downloads 38
23588 Performance Analysis of Geophysical Database Referenced Navigation: The Combination of Gravity Gradient and Terrain Using Extended Kalman Filter
Authors: Jisun Lee, Jay Hyoun Kwon
Abstract:
As an alternative way to compensate for INS (inertial navigation system) errors in non-GNSS (Global Navigation Satellite System) environments, geophysical database referenced navigation is being studied. In this study, gravity gradient and terrain data were combined to complement the weakness of a single geophysical data source and to improve the stability of the positioning. The main process for compensating the INS error using a geophysical database was constructed on the basis of the EKF (Extended Kalman Filter). In detail, two types of combination methods, centralized and decentralized filters, were applied to examine the pros and cons of each algorithm and to find more robust results. The performance of each navigation algorithm was evaluated in simulation by supposing that the aircraft flies with a precise geophysical DB and sensors along nine different trajectories. In particular, the results were compared with those from navigation referenced to a single geophysical database to check the improvement due to the combination of heterogeneous geophysical databases. It was found that overall navigation performance was improved, but not all trajectories produced better navigation results when gravity gradient was combined with terrain data. It was also found that the centralized filter generally showed more stable results. This is because the weight allocation for the decentralized filter could not be optimized due to the local inconsistency of the geophysical data. In the future, switching between geophysical data sources or combining different navigation algorithms will be necessary to obtain more robust navigation results.
Keywords: Extended Kalman Filter, geophysical database referenced navigation, gravity gradient, terrain
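At the heart of geophysical database referenced navigation is the EKF measurement update: the value read from the geophysical map at the INS-predicted position is compared with the sensed value, and the innovation corrects the state. A scalar sketch (the real filter is multivariate and all values here are illustrative):

```python
def ekf_update(x, P, z, h, H, R):
    """One scalar EKF measurement update.
    x: predicted state (e.g., position error), P: its variance,
    z: sensed geophysical value, h: map value predicted at x,
    H: measurement Jacobian (local map gradient), R: sensor noise."""
    y = z - h               # innovation
    S = H * P * H + R       # innovation variance
    K = P * H / S           # Kalman gain
    x_new = x + K * y       # corrected state
    P_new = (1.0 - K * H) * P
    return x_new, P_new
```

In a centralized filter, the gravity-gradient and terrain measurements would be stacked into one update; in a decentralized filter, each sensor runs its own update and the results are fused with weights, which is where the weight-allocation difficulty noted above arises.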
Procedia PDF Downloads 349
23587 An Application of Remote Sensing for Modeling Local Warming Trend
Authors: Khan R. Rahaman, Quazi K. Hassan
Abstract:
Global changes in climate, environment, economies, populations, governments, institutions, and cultures converge in localities. Changes at a local scale, in turn, contribute to global changes as well as being affected by them. Our hypothesis is built on the consideration that temperature varies at the local level (termed local warming) relative to the predictions of models at the regional and/or global scale. To date, the bulk of the research relating local places to global climate change has been top-down, from the global toward the local, concentrating on methods of impact analysis that take as a starting point climate change scenarios derived from global models, even though these have little regional or local specificity. Thus, our focus is to understand such trends over southern Alberta, which will enable decision makers, scientists, the research community, and local people to adapt their policies to local temperature variations and to act accordingly. The specific objectives of this study are: (i) to understand the local warming (temperature in particular) trend in the context of temperature normals during the period 1961-2010 at point locations using meteorological data; (ii) to validate the data using specific yearly data; and (iii) to delineate the spatial extent of local warming trends and understand the influential factors, so that local governments can adapt accordingly. Existing data provide evidence of such changes, and future research will focus on validating this hypothesis using remotely sensed data (i.e., the MODIS product by NASA).
Keywords: local warming, climate change, urban area, Alberta, Canada
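A point-location warming trend of the kind described in objective (i) is commonly quantified as an ordinary least-squares slope over the station's temperature series; a minimal sketch on illustrative data:

```python
def linear_trend(years, temps):
    """Ordinary least-squares slope (degrees C per year) of a station
    temperature series; a simple way to quantify a local warming
    trend against a climate normal. Input data are illustrative."""
    n = len(years)
    mx = sum(years) / n
    my = sum(temps) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, temps))
    den = sum((x - mx) ** 2 for x in years)
    return num / den
```

Comparing such station-level slopes with regional model projections is one way to test whether the local warming signal departs from the regional prediction.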
Procedia PDF Downloads 339
23586 The Systems Biology Verification Endeavor: Harness the Power of the Crowd to Address Computational and Biological Challenges
Authors: Stephanie Boue, Nicolas Sierro, Julia Hoeng, Manuel C. Peitsch
Abstract:
Systems biology relies on large numbers of data points and sophisticated methods to extract biologically meaningful signal and mechanistic understanding. For example, analyses of transcriptomics and proteomics data make it possible to gain insights into the molecular differences in tissues exposed to diverse stimuli or test items. Whereas the interpretation of endpoints that specifically measure a mechanism is relatively straightforward, the interpretation of big data is more complex and benefits from comparing results obtained with diverse analysis methods. The sbv IMPROVER project was created to implement solutions for verifying systems biology data, methods, and conclusions. Computational challenges that leverage the wisdom of the crowd allow benchmarking of methods for specific tasks, such as signature extraction and/or sample classification. Four challenges have already been conducted successfully and confirmed that the aggregation of predictions often leads to better results than individual predictions and that methods perform best in specific contexts. Whenever the scientific question of interest lacks a gold standard but may greatly benefit from the scientific community coming together to discuss approaches and results, datathons are set up. The inaugural sbv IMPROVER datathon was held in Singapore on 23-24 September 2016. It allowed bioinformaticians and data scientists to consolidate their ideas and work on the most promising methods as teams, after having initially reflected on the problem on their own. The outcome is a set of visualization and analysis methods that will be shared with the scientific community via the Garuda platform, an open connectivity platform that provides a framework to navigate through different applications, databases, and services in biology and medicine.
We will present the results obtained when analyzing data with our network-based method and introduce a datathon that will take place in Japan to encourage analysis of the same datasets with other methods, allowing conclusions to be consolidated.
Keywords: big data interpretation, datathon, systems toxicology, verification
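The finding that aggregated predictions often beat individual ones can be sketched as simple per-sample score averaging across teams (the scores below are illustrative class probabilities, not challenge data):

```python
def aggregate_predictions(prediction_sets):
    """Aggregate crowd predictions by averaging each sample's score
    across teams. prediction_sets is a list of per-team score lists,
    one score per sample."""
    n_teams = len(prediction_sets)
    return [sum(scores) / n_teams for scores in zip(*prediction_sets)]
```

Averaging cancels uncorrelated errors across teams, which is one mechanism behind the wisdom-of-the-crowd effect the challenges observed.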
Procedia PDF Downloads 278
23585 Scalable Learning of Tree-Based Models on Sparsely Representable Data
Authors: Fares Hedayatit, Arnauld Joly, Panagiotis Papadimitriou
Abstract:
Many machine learning tasks, such as text annotation, usually require training over very big datasets, e.g., millions of web documents, that can be represented in a sparse input space. State-of-the-art tree-based ensemble algorithms cannot scale to such datasets, since they include operations whose running time is a function of the input-space size rather than of the number of non-zero input elements. In this paper, we propose an efficient splitting algorithm that leverages input sparsity within decision tree methods. Our algorithm improves training time on sparse datasets by more than two orders of magnitude, and it has been incorporated in the current version of scikit-learn, the most popular open-source Python machine learning library.
Keywords: big data, sparsely representable data, tree-based models, scalable learning
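The core idea, making split search depend on the non-zero entries of a feature rather than on the full input-space size, can be sketched as follows. This is an illustration of the principle, not the scikit-learn implementation:

```python
def candidate_thresholds(nnz_values):
    """Candidate split thresholds for one sparse feature column, built
    only from its non-zero values (plus 0.0, where all the implicit
    entries sit). Runs in O(k log k) for k non-zeros, independent of
    the number of samples or of the input-space dimension."""
    vals = sorted(set(nnz_values) | {0.0})
    # Midpoints between consecutive distinct values.
    return [(a + b) / 2.0 for a, b in zip(vals, vals[1:])]
```

For a text dataset where each document touches only a handful of the millions of vocabulary features, scanning non-zeros instead of the dense column is exactly where the reported two-orders-of-magnitude speedup comes from.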
Procedia PDF Downloads 263
23584 A Numerical Study on the Flow in a Pipe with Perforated Plates
Authors: Myeong Hee Jeong, Man Young Kim
Abstract:
The use of perforated plates and tubes is common in applications such as vehicle exhaust silencers, attenuators in air-moving ducts, and duct linings in jet engines. Perforated-plate flow conditioners designed to improve flow distribution upstream of an orifice-plate flow meter typically have 50-60% free area, but these generally employ a non-uniform distribution of holes of several sizes to encourage the formation of a fully developed pipe-flow velocity distribution. In this study, therefore, numerical investigations of the flow characteristics with various perforated plates were performed and compared to the case without a perforated plate. Three different models were adopted: a flat perforated plate, a perforated plate convex toward the inlet, and a perforated plate convex toward the outlet. Simulation results show that the pressure drops with and without perforated plates are similar. However, the differently shaped perforated plates influence the velocity contours, the flow uniformity index, and the location at which the flow becomes fully developed. These results can serve as a practical guide to the best design of a pipe with a perforated plate.
Keywords: perforated plate, flow uniformity, pipe turbulent flow, CFD (Computational Fluid Dynamics)
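The flow uniformity index used to compare the plate shapes is commonly defined from the deviation of local velocities from the cross-section mean; a sketch of one standard equal-area form (the paper may use an area-weighted variant):

```python
def uniformity_index(velocities):
    """Flow uniformity index over equal-area cells of a cross section:
    gamma = 1 - sum(|u_i - u_mean|) / (2 * n * u_mean).
    gamma = 1 for perfectly uniform flow and decreases as the
    velocity profile becomes less even."""
    n = len(velocities)
    mean = sum(velocities) / n
    return 1.0 - sum(abs(u - mean) for u in velocities) / (2.0 * n * mean)
```

Evaluated on velocities sampled at cross sections downstream of each plate, the index lets the flat and convex designs be ranked on a single scalar.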
Procedia PDF Downloads 691
23583 Addressing Undernourishment of Pupils in a Depressed Community through Feeding Program and Vitamin Supplementation
Authors: Alma M. Corpuz
Abstract:
This study evaluated a supplemental feeding program for 59 undernourished pupils at an elementary school located in one of the depressed communities of Tarlac City, Philippines, in SY 2013-2014. Pupils were fed for one month with a heavy breakfast and afternoon snacks and were given vitamins daily. Findings revealed that most of the pupils had regained normal Body Mass Indices (BMIs) by the routine weighing at the school opening. In addition, the academic performance of the pupils in the 4th quarter, after the feeding program, was higher than in the 3rd quarter. The researchers recommend that school extension programs prioritize activities addressing malnutrition among pupils to help them perform well academically. Feeding programs must include heavy meal plans like the one implemented in this project, as well as milk and vitamins, to ensure significant improvement in nutrition. It is also important that fecalysis and deworming be performed before the feeding program and that proper handwashing be integrated into the feeding activity.
Keywords: wasted, severely wasted, body mass index, supplemental feeding
Procedia PDF Downloads 277
23582 The Influence of Beta Shape Parameters in Project Planning
Authors: Alexios Kotsakis, Stefanos Katsavounis, Dimitra Alexiou
Abstract:
Networks can be used to represent project planning problems, with nodes for activities and arcs indicating precedence relationships between them. For fixed activity durations, a simple algorithm calculates the time required to complete the project and identifies the activities that comprise the critical path. The Program Evaluation and Review Technique (PERT) generalizes this model by incorporating uncertainty, allowing activity durations to be random variables, though it nevertheless produces a relatively crude solution to planning problems. In this paper, based on findings in the relevant literature, which strongly suggest that a Beta distribution can be employed to model earthmoving activities, we use Monte Carlo simulation to estimate the project completion time distribution and to measure the influence of skewness, an element inherent in the activities of modern technical projects. We also extract the activity criticality index, with the ultimate goal of producing more accurate planning estimates.
Keywords: beta distribution, PERT, Monte Carlo simulation, skewness, project completion time distribution
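The Monte Carlo approach can be sketched for a toy serial project with Beta-distributed activity durations. The shape parameters below are illustrative, chosen to show skewness in both directions, and are not the paper's calibrated values:

```python
import random

def simulate_completion(n_runs=10000, seed=42):
    """Monte Carlo estimate of the mean completion time of a toy
    three-activity serial project. Each activity duration is a Beta
    variate rescaled to [min, max]; alpha != beta gives a skewed
    duration, which is the effect the study measures."""
    rng = random.Random(seed)
    # (alpha, beta, min_duration, max_duration) per activity
    activities = [(2.0, 5.0, 4.0, 10.0),   # right-skewed
                  (5.0, 2.0, 2.0, 8.0),    # left-skewed
                  (4.0, 4.0, 3.0, 6.0)]    # symmetric
    totals = []
    for _ in range(n_runs):
        t = 0.0
        for a, b, lo, hi in activities:
            t += lo + (hi - lo) * rng.betavariate(a, b)
        totals.append(t)
    return sum(totals) / n_runs
```

In a real network, each run would instead evaluate the longest path through the precedence graph, and counting how often each activity lies on that path gives the activity criticality index.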
Procedia PDF Downloads 149
23581 Ways for Improving Citation of the Cyrillic Publications
Authors: Victoria Y. Garnova, Vladimir G. Merzlikin, Denis G. Yakovlev, Andrei A. Amelenkov, Sergey V. Khudyakov
Abstract:
The novelty of studies submitted to Russian publications is assessed by citation analysis in order to identify scientific research with a high degree of innovation. This may form the basis of recommendations on subjects for new joint projects between the Russian Federation and the EU. Despite the modest ratings of Russian publications (or even their absence), current information technologies ensure open access to the websites of these journals, making it possible for interested foreign investors to carry out their own selective rapid assessments of advanced developments in Russia. The foreign literature cited in Russian journals can itself become an object of study for determining the innovative attractiveness of scientific research against a specific forward-looking background abroad. The authors introduce: (1) a linguistic impact factor Li-f of journals, describing the share of publications in the majority language; (2) a linguistic citation index Lact characterizing the significance of scientific research, and a linguistic top index Ltop for evaluating the spectral width of citation of foreign journals.
Keywords: citation analysis, linguistic citation indexes, linguistic impact factor, innovative projects
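A minimal sketch of how the proposed linguistic impact factor Li-f might be computed, assuming it is simply the share of a journal's articles written in the majority language (the paper's exact definition may differ):

```python
def linguistic_impact_factor(publication_langs, majority_lang="ru"):
    """Sketch of the proposed linguistic impact factor Li-f: the
    fraction of a journal's publications written in the majority
    language. Language codes below are illustrative."""
    n = len(publication_langs)
    return sum(1 for lang in publication_langs if lang == majority_lang) / n
```

The companion indices Lact and Ltop would analogously be built from citation counts broken down by the language of the cited journals.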
Procedia PDF Downloads 317