Search results for: sisks classification
396 Multiscale Simulation of Absolute Permeability in Carbonate Samples Using 3D X-Ray Micro Computed Tomography Images Textures
Authors: M. S. Jouini, A. Al-Sumaiti, M. Tembely, K. Rahimov
Abstract:
Characterizing rock properties of carbonate reservoirs is highly challenging because of rock heterogeneities revealed at several length scales. In the last two decades, the Digital Rock Physics (DRP) approach was implemented successfully in sandstone reservoirs in order to understand rock property behaviour at the pore scale. This approach uses 3D X-ray microtomography images to characterize the pore network and to simulate rock properties from these images. Even though DRP is able to predict realistic rock property results in sandstone reservoirs, it still suffers from the lack of a clear workflow in carbonate rocks. The main challenge is the integration of properties simulated at different scales in order to obtain the effective rock property of core plugs. In this paper, we propose several approaches to characterize absolute permeability in carbonate core plug samples using a multiscale numerical simulation workflow. In this study, we propose a procedure to simulate porosity and absolute permeability of a carbonate rock sample using textures of micro-computed tomography images. First, we discretize the X-ray micro-CT image into a regular grid. Then, we use a textural parametric model to classify each cell of the grid using supervised classification. The main parameters are first- and second-order statistics such as mean, variance, range and autocorrelations computed from sub-bands obtained after wavelet decomposition. Furthermore, we fill the permeability property in each cell using two strategies based on numerical simulation values obtained locally on subsets. Finally, we simulate the effective permeability numerically using a Darcy's law simulator. The results obtained for the studied carbonate sample show good agreement with the experimental property.
Keywords: multiscale modeling, permeability, texture, micro-tomography images
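As an illustration of the wavelet-texture step this abstract describes, the sketch below extracts first- and second-order sub-band statistics for each grid cell and feeds them to a supervised classifier. It is a minimal sketch, not the authors' implementation: the cell size, the 'db2' wavelet, the simple lag-1 autocorrelation and the random-forest classifier are illustrative assumptions.

```python
import numpy as np
import pywt
from sklearn.ensemble import RandomForestClassifier

def texture_features(cell, wavelet="db2", level=2):
    """First/second-order statistics of wavelet sub-bands for one grid cell."""
    coeffs = pywt.wavedec2(cell, wavelet=wavelet, level=level)
    # coeffs[0] is the approximation band, the rest are (cH, cV, cD) tuples
    bands = [coeffs[0]] + [b for detail in coeffs[1:] for b in detail]
    feats = []
    for band in bands:
        band = np.asarray(band, dtype=float)
        feats += [band.mean(), band.var(), band.max() - band.min()]
        # lag-1 autocorrelation of the flattened band as a simple second-order statistic
        flat = band.ravel() - band.mean()
        denom = (flat ** 2).sum()
        feats.append((flat[:-1] * flat[1:]).sum() / denom if denom else 0.0)
    return np.array(feats)

def grid_features(image, cell=64):
    """Discretize a 2D micro-CT slice into a regular grid and extract features per cell."""
    rows = []
    for i in range(0, image.shape[0] - cell + 1, cell):
        for j in range(0, image.shape[1] - cell + 1, cell):
            rows.append(texture_features(image[i:i + cell, j:j + cell]))
    return np.vstack(rows)

# Supervised classification of cells into texture classes (labels assumed to come
# from local pore-scale simulations on training subsets):
# clf = RandomForestClassifier(n_estimators=200).fit(grid_features(train_slice), labels)
# cell_classes = clf.predict(grid_features(new_slice))
```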
Procedia PDF Downloads 183
395 Real-World Prevalence of Musculoskeletal Disorders in Nigeria
Authors: F. Fatoye, C. E. Mbada, T. Gebrye, A. O. Ogunsola, C. Fatoye, O. Oyewole
Abstract:
Musculoskeletal disorders (MSDs) are a major cause of pain and disability, and they are likely to become an ever greater, and largely unnecessary, economic and public health burden. Thus, reliable prevalence figures are important for both clinicians and policy-makers to plan health care needs for those affected by the disease. This study estimated the hospital-based real-world prevalence of MSDs in Nigeria. A review of medical charts for adult patients attending the Physiotherapy Outpatient Clinic at the Obafemi Awolowo University Teaching Hospitals Complex, Osun State, Nigeria between 2009 and 2018 was carried out to identify common MSDs including low back pain (LBP), cervical spondylosis (CSD), post-immobilization stiffness (PIS), sprain, osteoarthritis (OA), and other conditions. The occupational class of the patients was determined using the International Labour Organization (ILO) classification. Data were analysed using descriptive statistics of frequency and percentages. Overall, the medical charts of 3,340 patients were reviewed within the span of ten years (2009 to 2018). The majority of the patients (62.8%) were in the middle class, and the remaining were in the low class (25.1%) and high class (10.5%) categories. An overall prevalence of 47.35% of MSDs was found within the span of ten years. Of this, the prevalence of LBP, CSD, PIS, sprain, OA, and other conditions was 21.6%, 10%, 18.9%, 2%, 6.3%, and 41.3%, respectively. The highest (14.2%) and lowest (10.5%) prevalence of MSDs were recorded in 2012 and 2018, respectively. The prevalence of MSDs is considerably high among Nigerian patients attending an outpatient physiotherapy clinic. The high prevalence of MSDs underscores the need for clinicians and decision makers to put in place appropriate strategies to reduce the prevalence of these conditions. In addition, they should plan and evaluate healthcare services to improve the health outcomes of patients with MSDs. Further studies are required to determine the economic burden of the condition and examine the clinical and cost-effectiveness of physiotherapy interventions for patients with MSDs.
Keywords: musculoskeletal disorders, Nigeria, prevalence, real world
Procedia PDF Downloads 172
394 Understanding Student Engagement through Sentiment Analytics of Response Times to Electronically Shared Feedback
Authors: Yaxin Bi, Peter Nicholl
Abstract:
The rapid advancement of information and communication technologies (ICT) is profoundly influencing every aspect of Higher Education. It has transformed traditional teaching, learning, assessment and feedback into a new era of Digital Education. This also introduces many challenges in capturing and understanding student engagement with their studies in Higher Education. The School of Computing at Ulster University has developed a Feedback And Notification (FAN) online tool that has been used to send students links to personalized feedback on their submitted assessments and to record students’ frequency of review of the shared feedback as well as the speed of collection. The feedback that the students initially receive is via a personal email directing them to the feedback via a URL link that maps to the feedback created by the academic marker. This feedback is typically a Word or PDF report including comments and the final mark for the work submitted approximately three weeks before. When the student clicks on the link, the student’s personal feedback is viewable in the browser and they can view the contents. The FAN tool provides the academic marker with a report that includes when and how often a student viewed the feedback via the link. This paper presents an investigation into student engagement through analyzing the interaction timestamps and frequency of review by the student. We have proposed an approach to modeling interaction timestamps and use sentiment classification techniques to analyze the data collected over the last five years for a set of modules. The data studied spans a number of final-year and second-year modules in the School of Computing. The paper presents the details of the quantitative analysis methods and further describes students' interactions with the feedback over time on each module studied. We have projected the students into different groups of engagement based on sentiment analysis results and then provide a suggestion of early targeted intervention for the set of students seen to be under-performing via our proposed model.
Keywords: feedback, engagement, interaction modelling, sentiment analysis
Procedia PDF Downloads 103
393 Report of Soundings in Tappeh Shahrestan in Order to Determine Its Field and Propose Privacy, Documenting and Systematic Review of Geophysical Studies
Authors: Reza Mehrafarin, Nafiseh Mirshekari, Mahyar Mehrafarin
Abstract:
Twenty-five kilometers southeast of Zabul (the center of Sistan, in the east of Iran), a large hill can be seen. This hill, which is located next to the bend of the Sistan River, is known as Tappeh Shahrestan. The length of Tappeh Shahrestan is 1350 meters, its width is 360 meters, and its height is 20 meters, covering in total 48 hectares. According to Iranian historical texts and Sassanid Pahlavi traditions, the capital of Sistan province during the Sassanid period was Ram Shahrestan. The city was abandoned because the nearby river dried up, and another capital, called Zarang, was then built in Sistan. But due to the long passage of time since the destruction of the city, its real location was forgotten, and some archaeologists have suggested different areas as the main location of Ram Shahrestan. In 2018, the first archaeological field activities took place on and around the hill in order to answer this question: was Tappeh Shahrestan the same as Ram Shahrestan, the capital of Sistan, during the Sassanid period? The field activities of the first season included the following: 1- preparation of the hill's topography and planimetric map; 2- archaeogeophysical studies; 3- methodical archaeological survey; 4- determining the range of the hill by soundings; 5- documentation of the hill; 6- classification, typology, and comparison of pottery. The results of the archaeological field activities in the first phase at Tappeh Shahrestan showed that this ancient site was indeed the city of Ram Shahrestan, the capital of Sistan, during the Sassanid period. Settlement in this city began in the third century BC, and the city was abandoned at the end of the third century AD. The most important factors in the creation of the city were the abundant water of the Sistan River and its convenient location, and the most important reason for its abandonment was the Sistan River, whose water completely dried up.
Keywords: archaeological surveys, archaeological soundings, ram shahrestan, sistan, tappeh shahrestan
Procedia PDF Downloads 110
392 COVID-19 and Heart Failure Outcomes: Readmission Insights from the 2020 United States National Readmission Database
Authors: Induja R. Nimma, Anand Reddy Maligireddy, Artur Schneider, Melissa Lyle
Abstract:
Background: Although heart failure is one of the most common causes of hospitalization in adult patients, there is limited knowledge of outcomes following initial hospitalization for COVID-19 with heart failure (HFC-19). We felt it pertinent to analyze 30-day readmission causes and outcomes among patients with HFC-19 in the United States using real-world big data from the National Readmission Database. Objective: The aim is to describe the rate and causes of readmissions and the morbidity of heart failure with coinciding COVID-19 (HFC-19) in the United States, using the 2020 National Readmission Database (NRD). Methods: A descriptive, retrospective study was conducted on the 2020 NRD, a nationally representative sample of all US hospitalizations. Adult (>18 years) inpatient admissions with COVID-19 with HF and readmissions within 30 days were selected based on International Classification of Diseases, Tenth Revision, Procedure Codes. Results: In 2020, 260,372 adult patients were hospitalized with COVID-19 and HF. The median age was 74 (IQR: 64-83), and 47% were female. The median length of stay was 7 (4-13) days, and the total cost of stay was 62,025 (31,956 – 130,670) United States dollars. Among the index hospital admissions, 61,527 (23.6%) patients died, and 22,794 (11.5%) were readmitted within 30 days. The median age of patients readmitted within 30 days was 73 (63-82), 45% were female, and 1,962 (16%) died. The most common principal diagnoses for readmission in these patients were COVID-19 (34.8%), sepsis (16.5%), HF (7.1%), AKI (2.2%), respiratory failure with hypoxia (1.7%), and pneumonia (1%). Conclusion: The rate of readmission in patients with heart failure exacerbations is increasing yearly. COVID-19 was observed to be the most common principal diagnosis in patients readmitted within 30 days. Complicated hypertension, chronic pulmonary disease, complicated diabetes, renal failure, alcohol use, drug use, and peripheral vascular disorders are risk factors associated with readmission. Familiarity with the most common causes of and predictors for readmission helps guide the development of initiatives to minimize adverse outcomes and the cost of medical care.
Keywords: Covid-19, heart failure, national readmission database, readmission outcomes
Procedia PDF Downloads 79
391 The Relationship between Representational Conflicts, Generalization, and Encoding Requirements in an Instance Memory Network
Authors: Mathew Wakefield, Matthew Mitchell, Lisa Wise, Christopher McCarthy
Abstract:
The properties of memory representations in artificial neural networks have cognitive implications. Distributed representations that encode instances as a pattern of activity across layers of nodes afford memory compression and enforce the selection of a single point in instance space. These encoding schemes also appear to distort the representational space, while trading off the ability to validate that input information is within the bounds of past experience. In contrast, a localist representation, which encodes some meaningful information into individual nodes in a network layer, affords less memory compression while retaining the integrity of the representational space. This allows the validity of an input to be determined. The validity (or familiarity) of the input, along with the capacity of a localist representation for multiple instance selections, affords a memory-sampling approach that dynamically balances the bias-variance trade-off. When the input is familiar, bias may be high by referring only to the most similar instances in memory. When the input is less familiar, variance can be increased by referring to more instances that capture a broader range of features. Using this approach in a localist instance memory network, an experiment demonstrates a relationship between representational conflict, generalization performance, and memorization demand. Relatively small sampling ranges produce the best performance on a classic machine learning dataset of visual objects. Combining memory validity with conflict detection produces a reliable confidence judgement that can separate responses with high and low error rates. Confidence can also be used to signal the need for supervisory input. Using this judgement, the need for supervised learning as well as memory encoding can be substantially reduced with only a trivial detriment to classification performance.
Keywords: artificial neural networks, representation, memory, conflict monitoring, confidence
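The familiarity-driven sampling and confidence mechanism described in this abstract can be sketched roughly as below. This is one reading of the idea, not the authors' model: the Euclidean distance, the familiarity threshold, the two sample sizes and the confidence formula are all illustrative assumptions.

```python
import numpy as np

def classify_with_instance_memory(x, memory_X, memory_y,
                                  familiar_k=3, unfamiliar_k=25,
                                  familiarity_threshold=1.0):
    """Localist instance memory: every stored instance keeps its own node.

    The sampling range adapts to input familiarity (bias-variance trade-off):
    familiar inputs are matched against only the closest instances (high bias),
    unfamiliar inputs draw on a broader set of instances (higher variance).
    """
    dists = np.linalg.norm(memory_X - x, axis=1)
    familiarity = dists.min()                      # validity of the input
    k = familiar_k if familiarity <= familiarity_threshold else unfamiliar_k
    nearest = np.argsort(dists)[:k]
    votes = np.bincount(memory_y[nearest], minlength=memory_y.max() + 1)
    prediction = votes.argmax()
    # Conflict monitoring: how divided the sampled instances are.
    conflict = 1.0 - votes.max() / votes.sum()
    # Confidence combines memory validity (small distance) with low conflict;
    # low confidence could signal the need for supervisory input.
    confidence = (1.0 - conflict) / (1.0 + familiarity)
    return prediction, confidence
```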
Procedia PDF Downloads 127
390 Investigating Informal Vending Practices and Social Encounters along Commercial Streets in Cairo, Egypt
Authors: Dalya M. Hassan
Abstract:
Marketplaces and commercial streets represent some of the most used and lively urban public spaces. Not only do they provide an outlet for commercial exchange, but they also facilitate social and recreational encounters. Such encounters can be influenced by both formal and informal vending activities. This paper explores and documents forms of informal vending practices and how they relate to social patterns that occur along the sidewalks of commercial streets in Cairo. A qualitative single case study approach of the ‘Midan El Gami’ marketplace in Heliopolis, Cairo is adopted. The methodology applied includes direct and walk-by observations for two main commercial streets in the marketplace. Four zoomed-in activity maps are also produced for three sidewalk segments that displayed varying vending and social features. Main findings include a documentation and classification of types of informal vending practices as well as a documentation of vendors’ distribution patterns in the urban space. Informal vending activities mainly included informal street vendors and shop spillovers, either as product or seating spillovers. Results indicated that staying and lingering activities were more prevalent on sidewalks that had certain physical features, such as diversity of shops, shaded areas, open frontages, and product or seating spillovers. Moreover, differences in social activity patterns were noted between sidewalks with street vendors and sidewalks with spillovers. While the first displayed more buying, selling, and people-watching activities, the latter displayed more social relations and bonds amongst traders’ communities and café patrons. Ultimately, this paper provides documentation which suggests that informal vending can have a positive influence on creating a lively commercial street and on the resulting patterns of use of the sidewalk space. The results can provide a basis for further investigations and analysis concerning this topic. This could aid in better accommodating informal vending activities within the design of future commercial streets.
Keywords: commercial streets, informal vending practices, sidewalks, social encounters
Procedia PDF Downloads 163
389 Using Machine Learning to Classify Different Body Parts and Determine Healthiness
Authors: Zachary Pan
Abstract:
Our general mission is to solve the problem of classifying images into different body part types and deciding whether each of them is healthy or not. However, for now, we will determine healthiness for only one-sixth of the body parts, specifically the chest, by detecting pneumonia in X-ray scans of those chest images. With this type of AI, doctors can use it as a second opinion when they are taking CT or X-ray scans of their patients. Another advantage of using this machine learning classifier is that it has no human weaknesses like fatigue. The overall approach to this problem is to split it into two parts: first, classify the image, then determine if it is healthy. In order to classify the image into a specific body part class, the body parts dataset must be split into test and training sets. We can then use many models, like neural networks or logistic regression models, and fit them using the training set. Now, using the test set, we can obtain a realistic estimate of the accuracy the models will have on images in the real world, since these testing images have never been seen by the models before. In order to increase this testing accuracy, we can also apply more sophisticated algorithms to the models, like multiplicative weight update. For the second part of the problem, to determine if the body part is healthy, we can have another dataset consisting of healthy and non-healthy images of the specific body part and once again split it into test and training sets. We then use another neural network to train on those training set images and use the testing set to measure its accuracy. We do this process only for the chest images. A major conclusion reached is that convolutional neural networks are the most reliable and accurate at image classification. In classifying the images, the logistic regression model, the neural network, the neural network with multiplicative weight update, the neural network with the black box algorithm, and the convolutional neural network achieved 96.83 percent, 97.33 percent, 97.83 percent, 96.67 percent, and 98.83 percent accuracy, respectively. On the other hand, the overall accuracy of the model that determines whether the images are healthy or not is around 78.37 percent.
Keywords: body part, healthcare, machine learning, neural networks
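A compact convolutional classifier of the kind this abstract reports as most accurate might look like the sketch below. The layer sizes, the 1-channel 128×128 input resolution and the six body-part classes are assumptions made for illustration, not the authors' architecture.

```python
import torch
import torch.nn as nn

class BodyPartCNN(nn.Module):
    """Small CNN for grayscale scan classification (body part, or healthy vs. pneumonia)."""
    def __init__(self, n_classes=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 128), nn.ReLU(),
            nn.Linear(128, n_classes),
        )

    def forward(self, x):          # x: (batch, 1, 128, 128)
        return self.classifier(self.features(x))

# Stage 1: classify the body part; stage 2: a second model (n_classes=2)
# trained only on chest X-rays decides healthy vs. pneumonia.
model = BodyPartCNN(n_classes=6)
logits = model(torch.randn(4, 1, 128, 128))   # -> shape (4, 6)
```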
Procedia PDF Downloads 103
388 Supply, Trade-offs, and Synergies Estimation for Regulating Ecosystem Services of a Local Forest
Authors: Jang-Hwan Jo
Abstract:
The supply management of ecosystem services of local forests is an essential issue as it is linked to the ecological welfare of local residents. This study aims to estimate the supply, trade-offs, and synergies of local forest regulating ecosystem services using a land cover classification map (LCCM) and a forest types map (FTM). Rigorous literature reviews and Expert Delphi analysis were conducted using the detailed variables of 1:5,000 LCCM and FTM. Land-use scoring method and Getis-Ord Gi* Analysis were utilized on detailed variables to propose a method for estimating supply, trade-offs, and synergies of the local forest regulating ecosystem services. The analysis revealed that the rank order (1st to 5th) of supply of regulating ecosystem services was Erosion prevention, Air quality regulation, Heat island mitigation, Water quality regulation, and Carbon storage. When analyzing the correlation between defined services of the entire city, almost all services showed a synergistic effect. However, when analyzing locally, trade-off effects (Heat island mitigation – Air quality regulation, Water quality regulation – Air quality regulation) appeared in the eastern and northwestern forest areas. This suggests the need to consider not only the synergy and trade-offs of the entire forest between specific ecosystem services but also the synergy and trade-offs of local areas in managing the regulating ecosystem services of local forests. The study result can provide primary data for the stakeholders to determine the initial conditions of the planning stage when discussing the establishment of policies related to the adjustment of the supply of regulating ecosystem services of the forests with limited access. Moreover, the study result can also help refine the estimation of the supply of the regulating ecosystem services with the availability of other forms of data.
Keywords: ecosystem service, getis ord gi* analysis, land use scoring method, regional forest, regulating service, synergies, trade-offs
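The Getis-Ord Gi* hot-spot step used in this abstract can be illustrated with a small manual computation over land-use scores. This is a generic sketch, not the study's code: the binary neighbour matrix (with the focal unit included, as Gi* requires) and the idea of correlating two services' Gi* surfaces to read local synergies or trade-offs are illustrative assumptions.

```python
import numpy as np

def getis_ord_gi_star(scores, W):
    """Getis-Ord Gi* z-scores for each spatial unit.

    `scores` is a 1D array of land-use scores for one regulating service and
    W[i, j] = 1 when units i and j are neighbours (the diagonal is set to 1
    because Gi* includes the focal unit). Large positive z-scores mark supply
    hot spots, large negative z-scores mark cold spots.
    """
    x = np.asarray(scores, dtype=float)
    n = x.size
    W = W.astype(float).copy()
    np.fill_diagonal(W, 1.0)
    x_bar = x.mean()
    s = np.sqrt((x ** 2).mean() - x_bar ** 2)
    wx = W @ x                       # weighted local sums
    w_sum = W.sum(axis=1)
    w_sq_sum = (W ** 2).sum(axis=1)
    num = wx - x_bar * w_sum
    den = s * np.sqrt((n * w_sq_sum - w_sum ** 2) / (n - 1))
    return num / den

# Trade-offs/synergies between two services can then be read from the
# correlation of their Gi* surfaces, either locally or over the whole forest:
# corr = np.corrcoef(gi_heat_island, gi_air_quality)[0, 1]
```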
Procedia PDF Downloads 87
387 Intrastromal Donor Limbal Segments Implantation as a Surgical Treatment of Progressive Keratoconus: Clinical and Functional Results
Authors: Mikhail Panes, Sergei Pozniak, Nikolai Pozniak
Abstract:
Purpose: To evaluate the effectiveness of intrastromal donor limbal segment implantation for the treatment of progressive keratoconus, considering the main characteristics of corneal endothelial cells. Setting: Outpatient ophthalmic clinic. Methods: Twenty patients (20 eyes) with progressive keratoconus of grade II-III on the Amsler classification were recruited. The worse eye was treated with the transplantation of donor limbal segments into the recipient corneal stroma, while the fellow eye was left untreated as a control of functional and morphological changes. Furthermore, twenty patients (20 eyes) without progressive keratoconus were used as a control of corneal endothelial cell changes. All patients underwent a complete ocular examination including uncorrected and corrected distance visual acuity (UDVA, CDVA), slit lamp examination, fundus examination, corneal topography and pachymetry, auto-keratometry, anterior segment optical coherence tomography and corneal endothelial specular microscopy. Results: After two years, a statistically significant improvement in UDVA and CDVA (on average, two lines for UDVA and three to four lines for CDVA) was noted. In addition, corneal astigmatism decreased from 5.82 ± 2.64 to 1.92 ± 1.4 D. Moreover, there were no statistically significant changes in mean spherical equivalent, keratometry and pachymetry indicators. It should be noted that after two years there were no significant changes in the number and form of corneal endothelial cells, which can be regarded as process stabilization. In untreated control eyes, there was a general trend towards worsening of UDVA, CDVA and corneal thickness, while corneal astigmatism increased. Conclusion: Intrastromal donor segment implantation is a safe technique for keratoconus treatment and an efficient procedure to stabilize and improve progressive keratoconus.
Keywords: corneal endothelial cells, intrastromal donor limbal segments, progressive keratoconus, surgical treatment of keratoconus
Procedia PDF Downloads 281
386 Green Accounting and Firm Performance: A Bibliometric Literature Review
Authors: Francesca di Donato, Sara Trucco
Abstract:
Green accounting is a growing topic of interest. Indeed, nowadays, most firms affect the environment; therefore, companies are seeking the best way to disclose environmental information. Furthermore, companies are increasingly committed to improving the environment, and the topic is gaining more importance for the public, governments, and policymakers. Green accounting is a type of accounting that considers environmental costs and their impact on the financial performance of firms. Thus, the motivation of the current research is to investigate the state-of-the-art literature on the relationship between green accounting and firm performance since the birth of the topic of green accounting and to identify gaps in the literature that represent fruitful terrain for future research. In doing so, this study provides a bibliometric literature review of existing evidence related to the link between green accounting and firm performance since 2000. The search, based on the most relevant databases for scientific journals (Scopus, Emerald, Web of Science, Google Scholar, and Econlit), returned 1,917 scientific articles. The articles were manually reviewed in order to identify only the relevant studies in the field by excluding articles with titles and abstracts out of scope. The final sample was composed of 107 articles. A content analysis was carried out on the final sample of articles, and a classification system was proposed. Findings show the most relevant environmental costs and issues considered in previous studies and how green accounting may be linked to the financial and non-financial performance of a firm. The study also offers suggestions for future research in this domain. This study has several practical implications. Indeed, the topic of green accounting may be applied to different sectors and different types of companies. Therefore, this study may help managers to better understand the most relevant environmental information to disclose and how environmental issues may be managed to improve the performance of firms. Moreover, the bibliometric literature review may be of interest to those stakeholders who are interested in the historical evolution of the topic.
Keywords: bibliometric literature review, firm performance, green accounting, literature review
Procedia PDF Downloads 69
385 Harmonization of Financial Information Systems in Latin America in Light of International Public Sector Accounting Standards Using the Herfindahl-Hirschman Index
Authors: Laura Sour
Abstract:
Government accounting is an essential instrument of transparency and accountability in public administration, which connects internal management with the implementation of policies and their evaluation by third parties through the construction of indicators of the cost of government. Several countries have adopted the International Public Sector Accounting Standards (IPSAS) as part of their modernization strategy. This paper evaluates the quantity and harmonization of the financial information published in the financial statements of 12 Latin American countries based on what is established in IPSAS 1, 2 and 17. For this purpose, seven types of financial statements published during the period from 2015 to 2019 are analyzed. Based on this information, it is possible to describe the evolution of government financial publication and to carry out a detailed analysis of the items that have been most transparent in these countries. Finally, the level of harmonization of the financial statements is studied using the Herfindahl-Hirschman index (IHH) to determine the degree of comparability of the information. To date, the results indicate that the public sector has increased the quantity and harmonization of the financial information published during the study period, but in a heterogeneous way: from the data collected, it has been found that the financial statement published with the greatest frequency and quantity is the income statement (classification of expenses by nature). On the other hand, the most complete reports were published by Costa Rica (2017 to 2019) and Mexico (2016 to 2018), periods during which these countries complied with 92.9 percent of the items analyzed. Although 2017 and 2018 are the years in which the most financial statements were reported, it is important to mention that Mexico is the country that has published the most financial information throughout the entire study period. The use of the IHH is expected to provide accurate information on the quality with which countries have adopted IPSAS within their government accounting systems to promote transparency and accountability in the continent.
Keywords: accounting and auditing, government policy and regulation, harmonization, public sector accounting and audits, IPSAS
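The harmonization measure itself can be illustrated with a short computation. This is a generic sketch of the Herfindahl-Hirschman index applied to accounting-practice shares, not the paper's exact specification: it assumes each country's choice among alternative presentation practices for an item is tallied, with the index equal to the sum of squared shares (1 means full harmonization; 1/k is the minimum when k alternative practices are used evenly).

```python
from collections import Counter

def herfindahl_hirschman(practices):
    """IHH for one financial-statement item across countries.

    `practices` lists, for each country, which presentation practice it used
    (e.g. expense classification by nature vs. by function). The index is the
    sum of squared shares of each practice.
    """
    counts = Counter(practices)
    total = sum(counts.values())
    return sum((c / total) ** 2 for c in counts.values())

# Hypothetical example for the income-statement item across 12 countries:
print(herfindahl_hirschman(["by nature"] * 9 + ["by function"] * 3))  # 0.625
```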
Procedia PDF Downloads 90
384 Comparative Analysis of Change in Vegetation in Four Districts of Punjab through Satellite Imagery, Land Use Statistics and Machine Learning
Authors: Mirza Waseem Abbas, Syed Danish Raza
Abstract:
For many countries, agriculture is still the major force driving the economy and a critically important socioeconomic sector, despite exceptional industrial development across the globe. In countries like Pakistan, this sector is considered the backbone of the economy, and most economic decision making revolves around agricultural outputs and data. Timely and accurate facts and figures about this vital sector hold immense significance and have serious implications for the long-term development of the economy. Therefore, any significant improvements in the statistics and other forms of data regarding the agriculture sector are considered important by all policymakers. This is especially true for decision making aimed at the betterment of crops and the agriculture sector in general. Provincial and federal agricultural departments collect data for all cash and non-cash crops and the sector, in general, every year. Traditional data collection for such a large sector, being time-consuming, prone to human error and labor-intensive, is slowly but gradually being replaced by remote sensing techniques. For this study, remotely sensed data were used for change detection (machine learning, supervised and unsupervised classification) to assess the increase or decrease in the area under agriculture over the last fifteen years due to urbanization. Detailed Landsat images for the selected agricultural districts were acquired for the year 2000 and compared to images of the same areas acquired for the year 2016. Observed differences, validated through detailed analysis of the areas, show that there was a considerable decrease in vegetation during the last fifteen years in four major agricultural districts of the Punjab province due to urbanization (housing societies).
Keywords: change detection, area estimation, machine learning, urbanization, remote sensing
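The supervised change-detection step described here can be sketched as follows: classify each Landsat date pixel-by-pixel, then difference the resulting class maps. This is a minimal illustration, not the study's workflow: the random-forest classifier, the class labels and the 30 m (0.09 ha) pixel-area conversion are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def classify_scene(bands, clf):
    """bands: (n_bands, height, width) reflectance stack -> per-pixel class map."""
    n_bands, h, w = bands.shape
    X = bands.reshape(n_bands, -1).T          # one row of band values per pixel
    return clf.predict(X).reshape(h, w)

# Train on labelled pixels (e.g. digitized polygons for vegetation, built-up,
# bare soil, water): X_train (n_pixels, n_bands), y_train (class ids).
# clf = RandomForestClassifier(n_estimators=300).fit(X_train, y_train)

def vegetation_change(map_2000, map_2016, vegetation_class=1, pixel_area_ha=0.09):
    """Area converted out of / into vegetation between the two dates (30 m Landsat pixels)."""
    lost = (map_2000 == vegetation_class) & (map_2016 != vegetation_class)
    gained = (map_2000 != vegetation_class) & (map_2016 == vegetation_class)
    return lost.sum() * pixel_area_ha, gained.sum() * pixel_area_ha
```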
Procedia PDF Downloads 249
383 Legal Judgment Prediction through Indictments via Data Visualization in Chinese
Authors: Kuo-Chun Chien, Chia-Hui Chang, Ren-Der Sun
Abstract:
Legal Judgment Prediction (LJP) is a subtask of legal AI. Its main purpose is to use the facts of a case to predict the judgment result. In Taiwan's criminal procedure, when prosecutors complete the investigation of a case, they decide whether to prosecute the suspect and which article of criminal law should be applied based on the facts and evidence of the case. In this study, we collected 305,240 indictments from the public inquiry system of the procuratorate of the Ministry of Justice, covering 169 charges and 317 articles from 21 laws. We take the crime facts in the indictments as the main input and jointly learn the prediction model for law source, article, and charge simultaneously based on the pre-trained BERT model. For single-article cases where the frequencies of the charge and article are greater than 50, the prediction performance for law sources, articles, and charges reaches 97.66, 92.22, and 60.52 macro-F1, respectively. To understand the large performance gap between articles and charges, we used a bipartite graph to visualize the relationship between articles and charges and found that the reason for the poor prediction performance was actually the wording precision of charge names. Some charges use the simplest words, while others may include the perpetrator or the result to make the charge more specific. For example, Article 284 of the Criminal Law may be indicted as “negligent injury”, "negligent death”, "business injury", "driving business injury", or "non-driving business injury". As another example, Article 10 of the Drug Hazard Control Regulations can be charged as “Drug Control Regulations” or “Drug Hazard Control Regulations”. In order to solve the above problems and more accurately predict the article and charge, we plan to include the article content or charge names in the input and use the sentence-pair classification method for question-answer problems in the BERT model to improve performance. We will also consider a sequence-to-sequence approach to charge prediction.
Keywords: legal judgment prediction, deep learning, natural language processing, BERT, data visualization
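The sentence-pair formulation proposed at the end of this abstract could be prototyped roughly as below: pair the crime facts with a candidate charge name and let a BERT classifier score whether that charge applies. This is a sketch under assumptions, not the authors' released code; the `bert-base-chinese` checkpoint, the binary relevance labels and the example strings are placeholders.

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertForSequenceClassification.from_pretrained("bert-base-chinese",
                                                      num_labels=2)

facts = "被告人於深夜駕車因過失撞傷被害人"    # crime facts taken from the indictment (example)
candidate_charge = "過失傷害"                 # e.g. "negligent injury"

# Sentence-pair encoding: [CLS] facts [SEP] charge name [SEP]
inputs = tokenizer(facts, candidate_charge,
                   truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
applies = logits.softmax(dim=-1)[0, 1].item()  # P(charge applies | facts), after fine-tuning
```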
Procedia PDF Downloads 121
382 Pediatric Emergency Dental Visits at King Abdulaziz University Dental Hospital during the COVID-19 Lockdown: A Retrospective Study
Authors: Sara Alhabli, Eman Elashiry, Osama Felemban, Abdullah Almushayt, Faisal Dardeer, Ahmed Mohammad, Fajr Orri, Nada Bamashmous
Abstract:
Background: In December of 2019, the coronavirus (SARS-CoV-2) first appeared and quickly spread to become a worldwide pandemic. This study aimed to evaluate the prevalence and types of pediatric dental emergencies during the COVID-19 lockdown in Jeddah, Saudi Arabia, at the University Dental Hospital (UDH) of King Abdulaziz University (KAU), and to identify the management provided for these dental emergency visits. Materials and Methods: Data collection was done retrospectively from electronic dental records for children aged 0-18 who attended the UDH emergency clinic during the period from March 1st, 2020, to September 30th, 2020. An electronic form formulated specifically for this study was used to collect the required data from electronic patient records, including demographic data, emergency classification, management, and referrals. Results: A total of 3,146 patients were seen at the emergency clinics during this period, of whom 661 were children (21%). Types of emergency conditions included 0.8% emergency cases, 34% urgent, and 65.2% non-urgent conditions. Severe dental pain (73.1%) and abscesses (20%) were the most common urgent dental conditions. Most non-urgent conditions presented for initial or periodic visits, recalls, or routine radiographs (74%). Treatments rarely involved restorations, with 8% among urgent conditions and 5.4% among non-urgent conditions. Antibiotics were prescribed for only 6.9% of urgent conditions. Conclusions: The largest group of children presenting at the emergency dental clinics were children with non-urgent conditions. Teledentistry can be a solution to avoid large numbers of non-urgent patients presenting to emergency clinics. Additionally, dental care for non-urgent conditions during the pandemic should focus more on procedures with less aerosol generation.
Keywords: COVID-19 pandemic, dental emergencies, oral health, pediatric dentistry, children
Procedia PDF Downloads 97
381 An Inventory Management Model to Manage the Stock Level for Irregular Demand Items
Authors: Riccardo Patriarca, Giulio Di Gravio, Francesco Costantino, Massimo Tronci
Abstract:
An accurate inventory management policy plays a crucial role in several high-availability sectors. In these sectors, due to the high cost of spares and backorders, an (S-1, S) replenishment policy is necessary for high-availability items. The policy enables the shipment of a substitute item any time the inventory size decreases by one. This policy can be modelled following the Multi-Echelon Technique for Recoverable Item Control (METRIC). METRIC is a system-based technique that allows defining the optimum stock level in a multi-echelon network, adopting measures in line with the decision-maker’s perspective. METRIC defines an availability-cost function with inventory costs and required service levels, using as inputs data about the demand trend, the supplying and maintenance characteristics of the network and the budget/availability constraints. The traditional METRIC relies on the hypothesis that a Poisson distribution represents well the demand distribution in the case of items with a low failure rate. However, in this research, we explore the effects of using a Poisson distribution to model the demand of low-failure-rate items characterized by an irregular demand trend. This characteristic of the demand is not included in the traditional METRIC formulation, leading to the need to revise it. Using the CV (Coefficient of Variation) and ADI (Average inter-Demand Interval) classification, we define the inherent flaws of the Poisson-based METRIC for irregular-demand items and define an innovative ad hoc distribution which can better fit the irregular demands. This distribution allows defining proper stock levels to reduce stocking and backorder costs due to the high irregularity of the demand trend. A case study in the aviation domain clarifies the benefits of this innovative METRIC approach.
Keywords: METRIC, inventory management, irregular demand, spare parts
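The CV/ADI screening mentioned in this abstract can be sketched briefly. The squared-CV and ADI cut-offs (0.49 and 1.32) follow the commonly used Syntetos-Boylan scheme and are an assumption here, since the abstract does not list the thresholds it applies.

```python
import numpy as np

def demand_pattern(demand, cv2_cut=0.49, adi_cut=1.32):
    """Classify a spare-part demand series as smooth / erratic / intermittent / lumpy.

    `demand` holds the demand per period (zeros allowed). ADI is the average
    interval between non-zero demands; CV^2 is the squared coefficient of
    variation of the non-zero demand sizes.
    """
    demand = np.asarray(demand, dtype=float)
    nonzero = demand[demand > 0]
    adi = len(demand) / len(nonzero)
    cv2 = (nonzero.std() / nonzero.mean()) ** 2
    if adi <= adi_cut:
        return "smooth" if cv2 <= cv2_cut else "erratic"
    return "intermittent" if cv2 <= cv2_cut else "lumpy"

# Items falling in the intermittent/lumpy quadrants are the irregular-demand
# cases for which the Poisson assumption of the traditional METRIC is weakest.
print(demand_pattern([0, 0, 3, 0, 0, 0, 12, 0, 1, 0, 0, 7]))  # -> "lumpy"
```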
Procedia PDF Downloads 347
380 Fuzzy Logic Classification Approach for Exponential Data Set in Health Care System for Prediction of Future Data
Authors: Manish Pandey, Gurinderjit Kaur, Meenu Talwar, Sachin Chauhan, Jagbir Gill
Abstract:
Health-care management systems are of great interest because they provide straightforward and fast management of all aspects relating to a patient, not necessarily medical. Moreover, there are more and more cases of pathologies in which diagnosis and treatment can only be carried out using medical imaging techniques. With an ever-increasing prevalence, medical images are directly acquired in, or converted into, digital form for storage as well as subsequent retrieval and processing. Data mining is the process of extracting information from large data sets using algorithms and techniques drawn from the fields of statistics, machine learning and database management systems. Forecasting is a prediction of what will occur in the future, and it is an uncertain process. Owing to this uncertainty, the accuracy of a forecast is as important as the outcome predicted from the independent variables. Forecast control should be used to establish whether the accuracy of the forecast is within satisfactory limits. Fuzzy regression methods have commonly been used to develop consumer preference models that correlate engineering characteristics with consumer preferences regarding a new product; these consumer preference models provide a platform whereby product developers can decide on the engineering characteristics in order to satisfy consumer preferences before developing the product. Recent analysis shows that such fuzzy regression methods are commonly used to model customer preferences. We propose testing the strength of the exponential regression model against the simple regression model.
Keywords: health-care management systems, fuzzy regression, data mining, forecasting, fuzzy membership function
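The comparison of an exponential regression model against a simple regression baseline, as proposed at the end of this abstract, could be prototyped as below. This is only an illustrative sketch: the synthetic monthly series, the least-squares fitting routine and the RMSE comparison are assumptions, not the paper's fuzzy-regression formulation.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import linregress

# Monthly counts of some health-care indicator (illustrative data only).
t = np.arange(12, dtype=float)
y = np.array([12, 14, 17, 21, 26, 31, 39, 47, 58, 70, 86, 105], dtype=float)

# Exponential model y = a * exp(b * t)
(a, b), _ = curve_fit(lambda t, a, b: a * np.exp(b * t), t, y, p0=(10.0, 0.1))
exp_pred = a * np.exp(b * t)

# Simple regression baseline
lin = linregress(t, y)
lin_pred = lin.intercept + lin.slope * t

def rmse(pred):
    return float(np.sqrt(np.mean((y - pred) ** 2)))

print("exponential RMSE:", rmse(exp_pred))   # noticeably lower on growth-type data
print("linear RMSE:     ", rmse(lin_pred))
```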
Procedia PDF Downloads 279
379 A Distributed Mobile Agent Based on Intrusion Detection System for MANET
Authors: Maad Kamal Al-Anni
Abstract:
This study concerns the use of an Artificial Neural Network, specifically a Multilayer Perceptron (MLP), for the classification and clustering of Mobile Ad hoc Network vulnerabilities. A mobile ad hoc network (MANET) is a set of ubiquitous intelligent internetworking devices able to sense their environment, organized as an autonomous system of mobile nodes connected via wireless links. Security is the most important concern in MANETs because of the easy penetration scenarios that occur in such auto-configuring networks. One of the powerful techniques used for inspecting network packets is the Intrusion Detection System (IDS); in this article, we show the effectiveness of artificial neural networks used as a machine learning method, along with a stochastic approach (information gain), to classify malicious behaviors in a simulated network with respect to different IDS techniques. The monitoring agent is responsible for the detection inference engine; the audit data are collected from the collecting agent by simulating node attacks, and the outputs are contrasted with the normal behaviors of the framework. Whenever there is any deviation from the ordinary behaviors, the monitoring agent considers the event an attack. In this article, we demonstrate a signature-based IDS approach in a MANET by implementing the back-propagation algorithm over an ensemble-based Traffic Table (TT), so that the signatures of malicious behaviors or undesirable activities can be significantly predicted and efficiently figured out by tuning the parametric set-up of the back-propagation algorithm. The experimental results empirically show its effectiveness, with a detection ratio of up to 98.6 percent. Performance metrics are also included, displayed with Xgraph, such as Packet Delivery Ratio (PDR), Throughput (TP), and Average Delay (AD).
Keywords: Intrusion Detection System (IDS), Mobile Adhoc Networks (MANET), Back Propagation Algorithm (BPA), Neural Networks (NN)
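The information-gain screening followed by a back-propagation-trained MLP described in this abstract can be sketched as below. The synthetic traffic-table features, the number of retained features and the network size are illustrative assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# X: rows of an ensemble Traffic Table (e.g. packet rates, drop counts,
# route-change counts per node/window); y: 0 = normal, 1 = malicious.
rng = np.random.default_rng(0)
X = rng.random((1000, 12))
y = (X[:, 3] + 0.5 * X[:, 7] + 0.1 * rng.random(1000) > 0.9).astype(int)

# Stochastic feature screening via information gain (mutual information).
gain = mutual_info_classif(X, y, random_state=0)
keep = np.argsort(gain)[::-1][:6]            # keep the 6 most informative features

X_tr, X_te, y_tr, y_te = train_test_split(X[:, keep], y, test_size=0.3,
                                          random_state=0)
# Multilayer perceptron trained with back-propagation.
ids = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000,
                    random_state=0).fit(X_tr, y_tr)
print("detection accuracy:", ids.score(X_te, y_te))
```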
Procedia PDF Downloads 194
378 Improve Student Performance Prediction Using Majority Vote Ensemble Model for Higher Education
Authors: Wade Ghribi, Abdelmoty M. Ahmed, Ahmed Said Badawy, Belgacem Bouallegue
Abstract:
In higher education institutions, the most pressing priority is to improve student performance and retention. Large volumes of student data are used in Educational Data Mining techniques to find new hidden information from students' learning behavior, particularly to uncover the early symptoms of at-risk students. On the other hand, data with noise, outliers, and irrelevant information may lead to incorrect conclusions. By identifying features of students' data that have the potential to improve performance prediction results, comparing and identifying the most appropriate ensemble learning technique after preprocessing the data, and optimizing the hyperparameters, this paper aims to develop a reliable student performance prediction model for higher education institutions. Data were gathered from two different systems: a student information system and an e-learning system for undergraduate students in the College of Computer Science of a Saudi Arabian state university. The cases of 4,413 students were used in this article. The process includes data collection, data integration, data preprocessing (such as cleaning, normalization, and transformation), feature selection, pattern extraction, and, finally, model optimization and assessment. Random Forest, Bagging, Stacking, Majority Vote, and two types of Boosting techniques, AdaBoost and XGBoost, are the ensemble learning approaches, whereas Decision Tree, Support Vector Machine, and Artificial Neural Network are the supervised learning techniques. Hyperparameters of the ensemble learning systems were fine-tuned to provide enhanced performance and optimal output. The findings imply that combining features of students' behavior from the e-learning and student information systems using Majority Vote produced better outcomes than the other ensemble techniques.
Keywords: educational data mining, student performance prediction, e-learning, classification, ensemble learning, higher education
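A minimal sketch of the majority-vote ensemble reported as best in this abstract is shown below, using the three base learners the abstract lists (decision tree, SVM, artificial neural network). The hyperparameters are placeholders, and the exact configuration may differ from the authors'.

```python
from sklearn.ensemble import VotingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X, y: preprocessed features merged from the student information system and
# the e-learning logs; y = pass/fail (or at-risk) label.
majority_vote = VotingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(max_depth=6)),
        ("svm", make_pipeline(StandardScaler(), SVC())),
        ("ann", make_pipeline(StandardScaler(),
                              MLPClassifier(hidden_layer_sizes=(32,),
                                            max_iter=1000))),
    ],
    voting="hard",                 # hard voting = plain majority vote
)
# majority_vote.fit(X_train, y_train)
# print(majority_vote.score(X_test, y_test))
```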
Procedia PDF Downloads 107
377 Guidelines to Designing Generic Protocol for Responding to Chemical, Biological, Radiological and Nuclear Incidents
Authors: Mohammad H. Yarmohammadian, Mehdi Nasr Isfahani, Elham Anbari
Abstract:
Introduction: Awareness of the use of chemical, biological, and nuclear agents in everyday industrial and non-industrial incidents has increased recently; the release of these materials can be accidental or intentional. Since hospitals are at the forefront of confronting Chemical, Biological, Radiological and Nuclear (CBRN) incidents, the goal of the present research was to provide a generic protocol for CBRN incidents through a comparative review of CBRN protocols and guidelines of different countries and a review of various books, handbooks and papers. Method: The integrative approach, or research synthesis, was adopted in this study. First, a simple narrative review of programs, books, handbooks, and papers about responses to CBRN incidents in different countries was carried out. Then the most important and functional information was discussed in the form of a generic protocol in focus group sessions and subsequently confirmed. Results: Findings indicated that most countries had various protocols, guidelines, and handbooks for hazardous materials or CBRN incidents. The final outcome of the research synthesis was a 50-page generic protocol whose main topics included an introduction, the definition and classification of CBRN agents, the four major phases of the incident and disaster management cycle, the hospital response management plan, equipment, and recommended supplies and antidotes for decontamination (radiological/nuclear, chemical, biological); each of these also had subtopics. Conclusion: In the majority of international protocols, guidelines, and handbooks, as well as international and Iranian books and papers, there is an emphasis on the importance of the incident command system, determining the safety degree of decontamination zones, maps of decontamination zones, the decontamination process, triage classifications, personal protective equipment, and supplies and antidotes for decontamination; these are the minimum requirements for such incidents and are also consistent with the provided generic protocol.
Keywords: hospital, CBRN, decontamination, generic protocol, CBRN incidents
Procedia PDF Downloads 295
376 Sound Analysis of Young Broilers Reared under Different Stocking Densities in Intensive Poultry Farming
Authors: Xiaoyang Zhao, Kaiying Wang
Abstract:
The choice of stocking density in poultry farming is a potential way of determining the welfare level of poultry. However, it is difficult to measure stocking densities in poultry farming because of many variables such as species, age and weight, feeding method, house structure and geographical location in different broiler houses. A method is proposed in this paper to measure the differences between young broilers reared under different stocking densities by sound analysis. Vocalisations of broilers were recorded and analysed under different stocking densities to identify the relationship between sounds and stocking densities. Recordings were made continuously for three-week-old chickens in order to evaluate the variation of sounds emitted by the animals at the beginning. The experimental trial was carried out in an indoor-reared broiler farm; the audio recording procedure lasted 5 days. Broilers were divided into 5 groups with stocking density treatments of 8/m², 10/m², 12/m² (96 birds/pen), 14/m² and 16/m²; all conditions, including ventilation and feed, were kept the same in every group except for stocking density. The recording and analysis of the chickens' sounds were made noninvasively. Sound recordings were manually analysed and labelled using sound analysis software (GoldWave Digital Audio Editor). After the sound acquisition process, Mel Frequency Cepstrum Coefficients (MFCC) were extracted from the sound data, and a Support Vector Machine (SVM) was used as an early detector and classifier. This preliminary study, conducted in an indoor-reared broiler farm, shows that this method can be used to classify the sounds of chickens under different densities economically (only a cheap microphone and recorder are needed), with a classification accuracy of 85.7%. This method can predict the optimum stocking density of broilers, complemented by animal welfare indicators, animal productivity indicators and so on.
Keywords: broiler, stocking density, poultry farming, sound monitoring, Mel Frequency Cepstrum Coefficients (MFCC), Support Vector Machine (SVM)
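The MFCC-plus-SVM pipeline described in this abstract can be sketched as below. It is a minimal illustration, not the study's code: the 13 coefficients, the mean/standard-deviation pooling per clip and the RBF kernel are assumptions.

```python
import numpy as np
import librosa
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def mfcc_features(wav_path, n_mfcc=13):
    """Mean and standard deviation of MFCCs over one labelled vocalisation clip."""
    y, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# clips: list of (wav_path, stocking_density_label) built from the manual
# labelling step described above.
# X = np.vstack([mfcc_features(path) for path, _ in clips])
# y = np.array([label for _, label in clips])
# clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y)
# clf.predict([mfcc_features("new_clip.wav")])
```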
Procedia PDF Downloads 161
375 Identification of Blood Biomarkers Unveiling Early Alzheimer's Disease Diagnosis Through Single-Cell RNA Sequencing Data and Autoencoders
Authors: Hediyeh Talebi, Shokoofeh Ghiam, Changiz Eslahchi
Abstract:
Traditionally, Alzheimer’s disease research has focused on genes with significant fold changes, potentially neglecting subtle but biologically important alterations. Our study introduces an integrative approach that highlights genes crucial to underlying biological processes, regardless of their fold change magnitude. Single-cell RNA-seq data related to peripheral blood mononuclear cells (PBMCs) in Alzheimer's disease were extracted from the Gene Expression Omnibus (GEO). After quality control, normalization, scaling, batch effect correction, and clustering, differentially expressed genes (DEGs) were identified with adjusted p-values less than 0.05. These DEGs were categorized based on cell type, resulting in four datasets, each corresponding to a distinct cell type. To distinguish between cells from healthy individuals and those with Alzheimer's, an adversarial autoencoder with a classifier was employed. This allowed for the separation of healthy and diseased samples. To identify the most influential genes in this classification, the weight matrices in the network, which include the encoder and classifier components, were multiplied, and the top 20 genes were examined. The analysis revealed that while some of these genes exhibit a high fold change, others do not. These genes, which may be overlooked by previous methods due to their low fold change, were shown to be significant in our study. The findings highlight the critical role of genes with subtle alterations in diagnosing Alzheimer's disease, a facet frequently overlooked by conventional methods. These genes demonstrate remarkable discriminatory power, underscoring the need to integrate biological relevance with statistical measures in gene prioritization. This integrative approach enhances our understanding of the molecular mechanisms in Alzheimer’s disease and provides a promising direction for identifying potential therapeutic targets.
Keywords: alzheimer's disease, single-cell RNA-seq, neural networks, blood biomarkers
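The weight-matrix multiplication used to rank influential genes can be sketched schematically as below. This is one plain reading of "the weight matrices were multiplied", not the authors' exact procedure: it chains the linear weights only (ignoring nonlinearities) and uses the absolute aggregate contribution as the importance score.

```python
import numpy as np

def rank_genes_by_network_weights(encoder_weights, classifier_weights, gene_names,
                                  top_k=20):
    """Rank input genes by their aggregate influence on the disease classifier.

    `encoder_weights` is a list of weight matrices from the input genes through
    the encoder layers, and `classifier_weights` the matrices of the classifier
    head. Multiplying them chains each gene's contribution through the network;
    the absolute value of the resulting (genes x classes) matrix gives an
    importance score that is independent of the gene's fold change.
    """
    combined = np.eye(len(gene_names))
    for W in list(encoder_weights) + list(classifier_weights):
        combined = combined @ W
    importance = np.abs(combined).sum(axis=1)
    order = np.argsort(importance)[::-1][:top_k]
    return [(gene_names[i], float(importance[i])) for i in order]
```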
Procedia PDF Downloads 66
374 Molecular Identification and Genotyping of Human Brucella Strains Isolated in Kuwait
Authors: Abu Salim Mustafa
Abstract:
Brucellosis is a zoonotic disease endemic in Kuwait. Human brucellosis can be caused by several Brucella species, with Brucella melitensis causing the most severe and Brucella abortus the least severe disease. Furthermore, relapses are common after successful chemotherapy of patients. The classical biochemical methods of culture and serology for the identification of Brucellae provide information about the species and serotypes only. However, to differentiate between relapse and reinfection and for epidemiological investigations, the identification of genotypes using molecular methods is essential. In this study, four molecular methods [16S rRNA gene sequencing, real-time PCR, enterobacterial repetitive intergenic consensus (ERIC)-PCR and multilocus variable-number tandem-repeat analysis (MLVA)-16] were evaluated for the identification and typing of 75 strains of Brucella isolated in Kuwait. The 16S rRNA gene sequencing suggested that all the strains were B. melitensis, and real-time PCR confirmed their species identity as B. melitensis. The ERIC-PCR band profiles produced a dendrogram of 75 branches, suggesting each strain to be of a unique type. The cluster classification, based on ~80% similarity, divided all the ERIC genotypes into two clusters, A and B. Cluster A consisted of 9 ERIC genotypes (A1-A9) corresponding to 9 individual strains. Cluster B comprised 13 ERIC genotypes (B1-B13), with B5 forming the largest cluster of 51 strains. MLVA-16 identified all isolates as B. melitensis and divided them into 71 MLVA types. The cluster analysis of MLVA types suggested that most of the strains in Kuwait originated from the East Mediterranean region, a few from the African group, and one new genotype closely matched with the West Mediterranean region. In conclusion, this work demonstrates that B. melitensis, the most pathogenic species of Brucella, is prevalent in Kuwait. Furthermore, MLVA-16 is the molecular method best able to identify Brucella species and genotypes as well as determine their origin in the global context. Supported by Kuwait University Research Sector grants MI04/15 and SRUL02/13.
Keywords: Brucella, ERIC-PCR, MLVA-16, RT-PCR, 16S rRNA gene sequencing
Procedia PDF Downloads 391
373 Evaluating Robustness of Conceptual Rainfall-runoff Models under Climate Variability in Northern Tunisia
Authors: H. Dakhlaoui, D. Ruelland, Y. Tramblay, Z. Bargaoui
Abstract:
To evaluate the impact of climate change on water resources at the catchment scale, not only are future climate projections necessary, but also robust rainfall-runoff models that remain fairly reliable under changing climate conditions. This study aims at assessing the robustness of three conceptual rainfall-runoff models (GR4J, HBV and IHACRES) on five basins in Northern Tunisia under long-term climate variability. Their robustness was evaluated according to a differential split-sample test based on a climate classification of the observation period with regard to precipitation and temperature conditions simultaneously. The studied catchments are situated in a region where climate change is likely to have significant impacts on runoff, and they already suffer from scarcity of water resources. They cover the main hydrographical basins of Northern Tunisia (High Medjerda, Zouaraâ, Ichkeul and Cap Bon), which produce the majority of surface water resources in Tunisia. The streamflow regime of the basins can be considered natural, since these basins are located upstream from storage dams and in areas where withdrawals are negligible. A 30-year common period (1970‒2000) was considered to capture a large spread of hydro-climatic conditions. Calibration was based on the Kling-Gupta Efficiency (KGE) criterion, while the evaluation of model transferability was performed according to the Nash-Sutcliffe efficiency criterion and the volume error. The three hydrological models were shown to behave similarly under climate variability. The models proved better able to simulate the runoff pattern when transferred toward wetter periods than when transferred to drier periods. The limits of transferability are beyond -20% of precipitation and +1.5 °C of temperature in comparison with the calibration period. The deterioration of model robustness could in part be explained by the climate dependency of some parameters.
Keywords: rainfall-runoff modelling, hydro-climate variability, model robustness, uncertainty, Tunisia
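For reference, the Kling-Gupta Efficiency used for calibration combines correlation, variability error and bias. A standard implementation of the original formula is sketched below; it is generic and not tied to any specific model output from the study.

```python
import numpy as np

def kling_gupta_efficiency(sim, obs):
    """KGE = 1 - sqrt((r - 1)^2 + (alpha - 1)^2 + (beta - 1)^2).

    r     : linear correlation between simulated and observed flows
    alpha : ratio of standard deviations (variability error)
    beta  : ratio of means (bias)
    KGE = 1 corresponds to a perfect simulation.
    """
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)
```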
Procedia PDF Downloads 292
372 Preparedness for Microbial Forensics Evidence Collection on Best Practice
Authors: Victor Ananth Paramananth, Rashid Muniginin, Mahaya Abd Rahman, Siti Afifah Ismail
Abstract:
Safety issues, scene protection, and appropriate evidence collection must be handled at any biocrime scene. In any bio-incident or biocrime event, there will be a scene, or multiple scenes, to be cordoned off for investigation. Evidence collection is critical in determining the type of microbe or toxin, its lethality, and its source. As a consequence, a proper sampling method is required from the start of the investigation. The most significant challenges for the crime scene officer are deciding where to obtain samples, the best sampling method, and the sample sizes needed. Since evidence at a crime scene could be in liquid, viscous, or powder form, crime scene officers have difficulty determining which tools to use for sampling. To maximize sample collection, appropriate tools for the sampling methods are necessary. This study aims to assist the crime scene officer in collecting liquid, viscous, and powder biological samples in sufficient quantity while preserving sample quality. In this research, observational tests on sample collection using liquid, viscous, and powder samples were performed under UV light to assess adequate quantity and sample quality. The density of the light emission varies with the method of collection and the sample type. The best tools for collecting sufficient amounts of liquid, viscous, and powdered samples can be identified by observing the UV light. Instead of active microorganisms, an invisible powder is used to assess sufficient sample collection during a crime scene investigation using various collection tools. The liquid, powdered and viscous samples collected using different tools were analyzed using Fourier transform infrared - attenuated total reflection (FTIR-ATR) spectroscopy. FTIR spectroscopy is commonly used for rapid discrimination, classification, and identification of intact microbial cells. The liquid, viscous and powdered samples collected using various tools were successfully observed using UV light. Furthermore, FTIR-ATR analysis showed that the collected samples are sufficient in quantity while preserving their quality.
Keywords: biological sample, crime scene, collection tool, UV light, forensic
Procedia PDF Downloads 195371 Intrusion Detection in SCADA Systems
Authors: Leandros A. Maglaras, Jianmin Jiang
Abstract:
The protection of national infrastructures from cyberattacks is one of the main issues for national and international security. The EU-funded Seventh Framework Programme (FP7) research project CockpitCI introduces intelligent intrusion detection, analysis and protection techniques for Critical Infrastructures (CI). The paradox is that CIs massively rely on the newest interconnected, and therefore vulnerable, Information and Communication Technology (ICT), whilst the control equipment is typically old legacy software and hardware. Such a combination of factors may lead to very dangerous situations, exposing systems to a wide variety of attacks. To overcome such threats, the CockpitCI project combines machine learning techniques with ICT technologies to produce advanced intrusion detection, analysis and reaction tools that provide intelligence to field equipment. This allows the field equipment to make local decisions in order to self-identify and self-react to abnormal situations introduced by cyberattacks. In this paper, an intrusion detection module capable of detecting malicious network traffic in a Supervisory Control and Data Acquisition (SCADA) system is presented. Malicious data in a SCADA system disrupt its correct functioning and tamper with its normal operation. The One-Class Support Vector Machine (OCSVM) is an intrusion detection mechanism that needs neither labeled training data nor any prior information about the kind of anomaly it is expected to detect. This feature makes it ideal for processing SCADA environment data and for automating SCADA performance monitoring. The OCSVM module developed is trained offline on network traces and detects anomalies in the system in real time. The module is part of an intrusion detection system (IDS) developed under the CockpitCI project and communicates with the other parts of the system by exchanging IDMEF messages that carry information about the source of the incident, the time, and a classification of the alarm.Keywords: cyber-security, SCADA systems, OCSVM, intrusion detection
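The one-class training scheme described above can be illustrated with a short sketch using scikit-learn's OneClassSVM; the traffic features, synthetic data, and parameter choices (nu, gamma) below are assumptions for illustration and are not the CockpitCI module itself.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

# Illustrative feature vectors extracted from SCADA network traces
# (e.g. packet rate, mean packet size, distinct destinations per time window).
rng = np.random.default_rng(0)
normal_traffic = rng.normal(loc=[100.0, 300.0, 5.0], scale=[10.0, 25.0, 1.0], size=(500, 3))
live_traffic = np.vstack([
    rng.normal(loc=[100.0, 300.0, 5.0], scale=[10.0, 25.0, 1.0], size=(20, 3)),  # normal windows
    rng.normal(loc=[400.0, 80.0, 60.0], scale=[30.0, 10.0, 5.0], size=(5, 3)),   # anomalous bursts
])

# Offline training on normal traces only: no labels and no attack examples are needed.
scaler = StandardScaler().fit(normal_traffic)
detector = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
detector.fit(scaler.transform(normal_traffic))

# Online detection: -1 flags an anomaly, +1 is considered normal.
flags = detector.predict(scaler.transform(live_traffic))
for i, flag in enumerate(flags):
    if flag == -1:
        print(f"window {i}: anomalous traffic -> raise IDMEF-style alert")
```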
Procedia PDF Downloads 552370 Arabic Lexicon Learning to Analyze Sentiment in Microblogs
Authors: Mahmoud B. Rokaya
Abstract:
The study of opinion mining and sentiment analysis covers the analysis of opinions, sentiments, evaluations, attitudes, and emotions. The rapid growth of social media, social networks, reviews, forum discussions, microblogs, and Twitter has led to a parallel growth in the field of sentiment analysis, which seeks effective tools for capturing trends in public opinion. There are two approaches in the field: lexicon-based and corpus-based methods. A lexicon-based method uses a sentiment lexicon that includes sentiment words and phrases with assigned numeric scores. These scores reveal whether sentiment phrases are positive or negative, their intensity, and/or their emotional orientations. Creating lexicons manually is hard, which brings the need for adaptive, automated methods of lexicon generation. The proposed method generates dynamic lexicons from the corpus, combining several approaches, and then classifies text using these lexicons. Tweets are classified into five classes rather than a binary positive/negative scheme. The sentiment classification problem is written as an optimization problem whose goal is to find the optimal sentiment lexicon. The solution was produced using mathematical programming approaches to find the best lexicon for classifying texts; a genetic algorithm was written to search for the optimal lexicon. A meta-level feature was then extracted based on the optimal lexicon. Experiments were conducted on several datasets. The results, in terms of accuracy, recall and F-measure, outperformed state-of-the-art methods from the literature on some of the datasets. The proposed sentiment lexicons also support a better understanding of the Arabic language, of the culture of Arab Twitter users, and of the sentiment orientation of words in different contexts.Keywords: social media, Twitter sentiment, sentiment analysis, lexicon, genetic algorithm, evolutionary computation
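The genetic search over candidate lexicons can be sketched roughly as follows; the toy English tokens, the five-class scoring rule, and the GA settings are illustrative assumptions rather than the paper's actual implementation.

```python
import random

random.seed(0)

# Toy labelled tweets: label in {-2, -1, 0, 1, 2} (very negative ... very positive).
# In the paper's setting these would be Arabic tweets; English tokens are used here for readability.
tweets = [
    (["great", "service", "love"], 2), (["good", "food"], 1),
    (["ok", "average"], 0), (["bad", "slow"], -1),
    (["terrible", "awful", "bad"], -2), (["love", "good", "great"], 2),
    (["slow", "average"], -1), (["ok", "good"], 1),
]
vocab = sorted({w for words, _ in tweets for w in words})

def classify(lexicon, words):
    """Predict one of five classes from the rounded mean word score, clipped to [-2, 2]."""
    scores = [lexicon[w] for w in words if w in lexicon]
    if not scores:
        return 0
    return max(-2, min(2, round(sum(scores) / len(scores))))

def fitness(lexicon):
    """Fraction of tweets assigned to the correct class."""
    return sum(classify(lexicon, words) == label for words, label in tweets) / len(tweets)

def random_lexicon():
    return {w: random.randint(-2, 2) for w in vocab}

def crossover(a, b):
    return {w: random.choice((a[w], b[w])) for w in vocab}

def mutate(lex, rate=0.1):
    return {w: (random.randint(-2, 2) if random.random() < rate else s) for w, s in lex.items()}

def evolve(pop_size=40, generations=60):
    population = [random_lexicon() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        elite = population[: pop_size // 4]                       # keep the best quarter
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        population = elite + children
    return max(population, key=fitness)

best = evolve()
print("fitness:", fitness(best))
print("learned lexicon:", best)
```

The learned word scores could then serve as the meta-level feature mentioned above, feeding a downstream classifier or being inspected directly to study word orientation in context.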
Procedia PDF Downloads 188369 The Systems Biology Verification Endeavor: Harness the Power of the Crowd to Address Computational and Biological Challenges
Authors: Stephanie Boue, Nicolas Sierro, Julia Hoeng, Manuel C. Peitsch
Abstract:
Systems biology relies on large numbers of data points and sophisticated methods to extract biologically meaningful signal and mechanistic understanding. For example, analyses of transcriptomics and proteomics data make it possible to gain insights into the molecular differences between tissues exposed to diverse stimuli or test items. Whereas the interpretation of endpoints that specifically measure a mechanism is relatively straightforward, the interpretation of big data is more complex and benefits from comparing results obtained with diverse analysis methods. The sbv IMPROVER project was created to implement solutions to verify systems biology data, methods, and conclusions. Computational challenges leveraging the wisdom of the crowd allow methods to be benchmarked for specific tasks, such as signature extraction and/or sample classification. Four challenges have already been successfully conducted and confirmed that the aggregation of predictions often leads to better results than individual predictions and that methods perform best in specific contexts. Whenever the scientific question of interest has no gold standard but would greatly benefit from the scientific community coming together to discuss approaches and results, datathons are set up. The inaugural sbv IMPROVER datathon was held in Singapore on 23-24 September 2016. It allowed bioinformaticians and data scientists to consolidate their ideas and work in teams on the most promising methods, after initially reflecting on the problem on their own. The outcome is a set of visualization and analysis methods that will be shared with the scientific community via the Garuda platform, an open connectivity platform that provides a framework to navigate through different applications, databases and services in biology and medicine. We will present the results obtained when analyzing data with our network-based method and introduce a datathon that will take place in Japan to encourage analysis of the same datasets with other methods and so allow conclusions to be consolidated.Keywords: big data interpretation, datathon, systems toxicology, verification
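The observation that aggregating predictions tends to beat individual predictions can be illustrated with a minimal sketch that averages class probabilities from several hypothetical submissions; the data and the simple averaging rule are assumptions for illustration and not the scoring scheme used in the sbv IMPROVER challenges.

```python
import numpy as np

# Illustrative class-probability predictions from three hypothetical teams,
# for four samples and two classes (e.g. "exposed" vs "control").
team_predictions = np.array([
    [[0.9, 0.1], [0.4, 0.6], [0.2, 0.8], [0.3, 0.7]],   # team A
    [[0.8, 0.2], [0.3, 0.7], [0.6, 0.4], [0.7, 0.3]],   # team B
    [[0.4, 0.6], [0.2, 0.8], [0.1, 0.9], [0.8, 0.2]],   # team C
])
true_labels = np.array([0, 1, 1, 0])

def accuracy(probs, labels):
    """Fraction of samples whose highest-probability class matches the true label."""
    return float((probs.argmax(axis=1) == labels).mean())

individual = [accuracy(p, true_labels) for p in team_predictions]
aggregated = accuracy(team_predictions.mean(axis=0), true_labels)  # average of probabilities
print("individual accuracies:", individual)   # each team makes one mistake
print("aggregated accuracy:", aggregated)     # the averaged prediction recovers all labels
```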
Procedia PDF Downloads 278368 The Use of Geographic Information System for Selecting Landfill Sites in Osogbo
Authors: Nureni Amoo, Sunday Aroge, Oluranti Akintola, Hakeem Olujide, Ibrahim Alabi
Abstract:
This study investigated the optimum landfill site in Osogbo in order to identify a suitable solid waste dumpsite for proper waste management in the capital city. Despite an increase in alternative techniques for disposing of waste, landfilling remains the primary means of waste disposal; changing attitudes in many parts of the world have been supported by changes in laws and policies regarding the environment and waste disposal. Selecting the most suitable landfill site can help avoid adverse ecological and socio-economic effects. Industrial and economic development, along with population growth in Osogbo town, generates a tremendous amount of solid waste within the region. Factors such as the scarcity of land, the lifespan of the landfill, and environmental considerations warrant that scientific and fundamental studies be carried out to determine the suitability of a landfill site. The analysis of spatial data and the consideration of regulations and accepted criteria are important elements of site selection. This paper presents a multi-criteria decision-making method that uses a geographic information system (GIS) integrated with the fuzzy logic multi-criteria decision-making (FMCDM) technique for landfill site suitability evaluation. Using the fuzzy logic method (classification of suitable areas on a 0 to 1 scale), the information layers derived from drainage, soil, land use/land cover, slope, and geology maps were superposed in the study. Based on the results obtained, five (5) potential sites suitable for the construction of a landfill are proposed; two of them belong to the most suitable zone, while the existing waste disposal site falls within the unsuitable zone.Keywords: fuzzy logic multi-criteria decision making, geographic information system, landfill, suitable site, waste disposal
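The fuzzy overlay idea, rescaling each layer to a 0 to 1 suitability score and then superposing the layers, can be sketched as follows; the two example layers, membership functions, and weights are illustrative assumptions rather than the study's actual criteria or data.

```python
import numpy as np

def linear_membership(values, worst, best):
    """Map a criterion raster to fuzzy suitability in [0, 1]; `worst` maps to 0, `best` maps to 1."""
    values = np.asarray(values, float)
    return np.clip((values - worst) / (best - worst), 0.0, 1.0)

# Illustrative 3x3 criterion rasters (real inputs would be GIS layers for the study area).
dist_to_drainage_m = np.array([[120, 400, 900], [60, 350, 800], [30, 200, 700]])
slope_percent = np.array([[2, 5, 12], [3, 8, 20], [1, 4, 15]])

# Fuzzy membership per criterion: farther from drainage and gentler slope are more suitable.
suit_drainage = linear_membership(dist_to_drainage_m, worst=0, best=1000)
suit_slope = 1.0 - linear_membership(slope_percent, worst=0, best=25)

# Weighted overlay of the fuzzified layers (weights here are assumed, not the study's).
weights = {"drainage": 0.6, "slope": 0.4}
suitability = weights["drainage"] * suit_drainage + weights["slope"] * suit_slope
print(np.round(suitability, 2))   # 1 = most suitable, 0 = unsuitable
```

In practice each additional layer (soil, land use/land cover, geology) would be fuzzified the same way and folded into the weighted sum before thresholding the result into suitability zones.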
Procedia PDF Downloads 142367 Medical versus Non-Medical Students' Opinions about Academic Stress Management Using Unconventional Therapies
Authors: Ramona-Niculina Jurcau, Ioana-Marieta Jurcau, Dong Hun Kwak, Nicolae-Alexandru Colceriu
Abstract:
Background: Stress management (SM) is a topic of great academic interest and, equally, a task to accomplish. In addition, the beneficial role of unconventional therapies (UCT) in stress modulation is recognized. Aims: The aim was to evaluate medical (MS) versus non-medical students' (NMS) opinions about academic stress management (ASM) using UCT. Methods: MS (n=103, third-year males and females) and NMS (n=112, males and females from humanities faculties, different years of study), outside their academic programme, voluntarily answered a questionnaire concerning: a) Classification of the four most important academic stress factors; b) The extent to which their daily life influences academic stress; c) The most important SM methods they know; d) Which of these methods they apply; e) The UCT they know or have heard about; f) Which of these they know to have stress modulation effects; g) Which of these UCT participants use or would like to use for modulating stress, and whether participants use UCT on their own initiative or following a specialist consultation in those therapies (SCT); h) Whether they had heard about the following UCT and what opinion they have (on a visual analogue scale) about their use (following SCT) for ASM: phytotherapy (PT), apitherapy (AT), homeopathy (H), ayurvedic medicine (AM), traditional Chinese medicine (TCM), music therapy (MT), color therapy (CT), forest therapy (FT). Results: The four most important academic stress factors, reported by MS more than by NMS, are: a busy schedule, the large amount of information taught, the high level of performance required, and reduced time for relaxing. The most important SM methods that MS and NMS know are, in order: listening to music, meeting friends, playing sport, hiking, sleep, regular breaks, seeing the positive side, and faith; NMS, more than MS, partially apply these to themselves. The UCT about which MS, and to a lesser extent NMS, have heard are phytotherapy, apitherapy, acupuncture, and reiki. Of these UCT, participants know some plants, bee products, and music to have stress modulation effects; they use or would like to use certain teas, honey, and music for ASM (the majority without SCT). Most MS and only some NMS had heard about PT, AT, TCM, and MT, and far fewer about H, AM, CT, and FT. NMS, more than MS, would use these UCT following SCT. Conclusions: 1) Academic stress is similarly reflected in MS and NMS opinions. 2) MS and NMS apply similar, but very few, UCT for stress modulation. 3) The information MS and NMS have about UCT and their application to ASM is limited. 4) It is remarkable that MS, and especially NMS, are open to using UCT for ASM following an SCT.Keywords: academic stress, stress management, stress modulation, medical students, non-medical students, unconventional therapies
Procedia PDF Downloads 356