Search results for: information technology enabled services
2430 Investigating a Deterrence Function for Work Trips for Perth Metropolitan Area
Authors: Ali Raouli, Amin Chegenizadeh, Hamid Nikraz
Abstract:
The Perth metropolitan area and its surrounding regions have been expanding rapidly in recent decades, and this growth is expected to continue in the years to come. With this rapid growth and the resulting increase in population, consideration should be given to strategic planning and modelling for the future expansion of Perth. The accurate estimation of projected traffic volumes has always been a major concern for transport modellers and planners. The development of a reliable strategic transport model depends significantly on the input data and on model parameters calibrated to reflect the existing situation. Trip distribution is the second step in four-step modelling (FSM) and is complex due to its behavioural nature. The gravity model is the most common method for trip distribution. The spatial separation between the origin and destination (OD) zones is reflected in the gravity model through deterrence functions, which provide an opportunity to include people’s behaviour in choosing their destinations based on the distance, time and cost of their journeys. Deterrence functions play an important role in the distribution of trips within a study area and simulate trip distances; they should therefore be calibrated for any particular strategic transport model to correctly reflect trip behaviour within the modelling area. This paper reviews the most common deterrence functions and proposes a calibrated deterrence function for work trips within the Perth Metropolitan Area, based on information obtained from the latest available household data and Perth and Region Travel Survey (PARTS) data. As part of this study, a four-step transport model has been developed for the Perth Metropolitan Area using EMME software to assist with the analysis and findings.
Keywords: deterrence function, four-step modelling, origin destination, transport model
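The trip-distribution step described above can be sketched in a few lines. The following is a minimal illustration, not the authors' calibrated EMME model: it distributes trips with a negative-exponential deterrence function f(c) = exp(-βc), one common form alongside power and combined (Tanner) functions, and balances the matrix to the origin and destination totals by Furness iteration. All variable names and parameter values are illustrative.

```python
import numpy as np

def gravity_distribution(origins, destinations, cost, beta=0.1, n_iter=100):
    """Doubly-constrained gravity model with a negative-exponential
    deterrence function f(c) = exp(-beta * c).

    origins      -- trip productions per origin zone, shape (n,)
    destinations -- trip attractions per destination zone, shape (m,)
    cost         -- generalised cost matrix, shape (n, m)
    Assumes origins.sum() == destinations.sum().
    """
    F = np.exp(-beta * cost)            # deterrence matrix
    r = np.ones_like(origins, dtype=float)
    for _ in range(n_iter):             # Furness (IPF) balancing
        s = destinations / (r @ F)      # column balancing factors
        r = origins / (F @ s)           # row balancing factors
    return r[:, None] * F * s[None, :]  # trip matrix T_ij
```

With equal attraction totals, the model sends more trips to the cheaper destination, which is exactly the behaviour the deterrence function encodes.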
Procedia PDF Downloads 168
2429 Possibility of Membrane Filtration to Treatment of Effluent from Digestate
Authors: Marcin Debowski, Marcin Zielinski, Magdalena Zielinska, Paulina Rusanowska
Abstract:
The problem of digestate management is one of the most important factors influencing the development and operation of biogas plants. Turbidity and bacterial contamination negatively affect the growth of algae, which can limit the use of the effluent in large-scale production of algal biomass. These problems can be overcome by cultivating algae species resistant to environmental factors, such as Chlorella sp. or Scenedesmus sp., or by reducing the load of organic compounds to prevent bacterial contamination. The effluent requires dilution and/or purification. One method of effluent treatment is the use of membrane technologies such as microfiltration (MF), ultrafiltration (UF), nanofiltration (NF) and reverse osmosis (RO), depending on the membrane pore size and the cut-off point. Membranes are a physical barrier to solids and particles larger than the pore size. MF membranes have the largest pores and are used to remove turbidity, suspensions, bacteria and some viruses. UF membranes also remove colour, odour and organic compounds of high molecular weight. In the treatment of wastewater or other waste streams, MF and UF can provide a sufficient degree of purification. NF membranes are used to remove natural organic matter from waters, water disinfection by-products and sulfates. RO membranes are applied to remove monovalent ions such as Na⁺ or K⁺. The UF permeate of the effluent was used as a medium for the cultivation of two microalgae: Chlorella sp. and Phaeodactylum tricornutum. Growth rates of Chlorella sp. and P. tricornutum were similar: 0.216 d⁻¹ and 0.200 d⁻¹ (Chlorella sp.) and 0.128 d⁻¹ and 0.126 d⁻¹ (P. tricornutum), on synthetic medium and on the UF permeate, respectively. The final biomass composition was also similar, regardless of the medium. Removal of nitrogen was 92% and 71% by Chlorella sp. and P. tricornutum, respectively. The fermentation effluents, after UF and dilution, were also used for the cultivation of the alga Scenedesmus sp., which is resistant to environmental conditions. The authors recommend the development of a biorefinery based on the production of algae for biogas production. There are examples of using a multi-stage membrane system to purify the liquid fraction of digestate. After initial UF, RO is used to remove ammonium nitrogen and COD. To obtain a permeate with an ammonium nitrogen concentration low enough to allow discharge into the environment, it was necessary to apply three-stage RO. The composition of the permeate after two-stage RO was: COD 50–60 mg/dm³, dry solids 0 mg/dm³, ammonium nitrogen 300–320 mg/dm³, total nitrogen 320–340 mg/dm³, total phosphorus 53 mg/dm³. The composition of the permeate after three-stage RO, however, was: COD < 5 mg/dm³, dry solids 0 mg/dm³, ammonium nitrogen 0 mg/dm³, total nitrogen 3.5 mg/dm³, total phosphorus < 0.05 mg/dm³. The last stage of RO might be replaced by an ion exchange process. A negative aspect of membrane filtration systems is that the permeate is about 50% of the introduced volume; the remainder is the retentate. Management of the retentate might involve recirculation to the biogas plant.
Keywords: digestate, membrane filtration, microalgae cultivation, Chlorella sp.
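The reported growth rates (e.g. 0.216 d⁻¹ for Chlorella sp. on synthetic medium versus 0.200 d⁻¹ on UF permeate) are specific growth rates of exponential growth. A small sketch of how measured biomass relates to these rates follows; the function names and numbers in the example are illustrative, not taken from the study:

```python
import math

def specific_growth_rate(x0, x1, dt_days):
    """Specific growth rate mu = ln(x1/x0) / dt  [d^-1]
    for exponentially growing biomass x0 -> x1 over dt_days."""
    return math.log(x1 / x0) / dt_days

def doubling_time(mu):
    """Biomass doubling time ln(2) / mu  [days]."""
    return math.log(2.0) / mu
```

At 0.216 d⁻¹ the culture doubles roughly every 3.2 days; the nearly equal rates on synthetic medium and on permeate are what supports using the permeate as a growth medium.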
Procedia PDF Downloads 352
2428 Studies on Climatic and Soil Site Suitability of Major Grapes-Growing Soils of Eastern and Southern Dry Zones of Karnataka
Authors: Harsha B. R., Anil Kumar K. S.
Abstract:
Climate and soils are the two most dynamic entities among the factors affecting the growth and productivity of grapes. Studying the climate prevailing over the years in a region provides sufficient information on the management practices to be carried out in vineyards. Evaluating the suitability of vineyard soils under different climatic conditions serves as a yardstick to analyse the performance of grapevines. This study was formulated to study the climate and evaluate the site suitability of soils in the vineyards of southern Karnataka, which has registered its superiority in quality wine production. Ten soil profiles were excavated for soil suitability evaluation, and six taluks were studied for climatic analysis. In almost all the regions studied, recharge starts at the end of May or in June, peaking in either September or October. Soil starts drying from mid-December in the taluks studied. Bangalore North (Rajanukunte) soils were highly suited for grape cultivation, with no or slight limitations. Bangalore North (GKVK Farm) was moderately suited, with slight to moderate limitations of slope and available nitrogen content. Moderate suitability was observed in the rest of the profiles studied in the Eastern dry zone, with slight to moderate limitations of organic carbon, available nitrogen or both. Magadi (Southern dry zone) soils were moderately suitable, with slight to moderate limitations of gravelliness, available nitrogen, organic carbon and exchangeable sodium percentage. Sustainable performance of vineyards in terms of yield can be achieved in these taluks by managing the constraints existing in the soils.
Keywords: climatic analysis, dry zone, water recharge, growing period, suitability, sustainability
Procedia PDF Downloads 124
2427 A Novel Heuristic for Analysis of Large Datasets by Selecting Wrapper-Based Features
Authors: Bushra Zafar, Usman Qamar
Abstract:
Large sample sizes and high dimensionality undermine the effectiveness of conventional data mining methodologies. Data mining techniques are important tools for extracting knowledge from a variety of databases; they provide supervised learning in the form of classification to design models that describe vital data classes, where the structure of the classifier is based on the class attribute. Classification efficiency and accuracy are often influenced to a great extent by noisy and undesirable features in real application data sets. The inherent nature of a data set greatly masks its quality analysis and leaves us with few practical approaches to use. To our knowledge, we present for the first time an approach for investigating the structure and quality of datasets by providing a targeted analysis of the localization of noisy and irrelevant features. Machine learning relies on feature selection as a pre-processing step, which allows a few features to be selected from a larger number as a subset, reducing the space according to a certain evaluation criterion. The primary objective of this study is to trim down the scope of a given data sample by searching for a small set of important features that may yield good classification performance. For this purpose, a heuristic for wrapper-based feature selection using a genetic algorithm is used, with an external classifier for discriminative feature selection. Features are selected based on their frequency of occurrence in the chosen chromosomes. Sample datasets have been used to demonstrate the proposed idea; the proposed method achieves an average accuracy of about 95% across different datasets. Experimental results illustrate that the proposed algorithm increases the accuracy of prediction of different diseases.
Keywords: data mining, genetic algorithm, KNN algorithms, wrapper-based feature selection
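The wrapper idea described above can be sketched briefly. This is a minimal illustration, not the authors' implementation: chromosomes are binary feature masks, fitness is supplied by an external classifier (here an arbitrary callable standing in for, e.g., KNN accuracy), and the final subset is chosen by each feature's frequency of occurrence in the surviving chromosomes. All parameter values are illustrative.

```python
import random

def ga_feature_selection(fitness_fn, n_features, pop_size=20,
                         n_gen=15, p_mut=0.1, seed=0):
    """Wrapper-based feature selection with a simple genetic algorithm.
    fitness_fn(mask) should return the external classifier's score for
    the feature subset encoded by the 0/1 mask."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(n_gen):
        # rank chromosomes by wrapper fitness (classifier score)
        pop.sort(key=fitness_fn, reverse=True)
        elite = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_features)       # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ (rng.random() < p_mut) for g in child]  # bit-flip mutation
            children.append(child)
        pop = elite + children
    # keep features occurring in more than half of the final chromosomes
    counts = [sum(c[j] for c in pop) for j in range(n_features)]
    return [j for j, n in enumerate(counts) if n > pop_size // 2]
```

In practice the fitness callable would train and score a KNN (or other) classifier restricted to the masked features; the toy fitness below merely rewards one informative feature.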
Procedia PDF Downloads 316
2426 Geochemical Studies of Mud Volcanoes Fluids According to Petroleum Potential of the Lower Kura Depression (Azerbaijan)
Authors: Ayten Bakhtiyar Khasayeva
Abstract:
The Lower Kura depression is a part of the South Caspian Basin (SCB), located between the folded regions of the Greater and Lesser Caucasus. The region is characterized by a thick sedimentary cover of 22 km (up to 30 km in the SCB), a high sedimentation rate and a low geothermal gradient (average value about 2 °C/100 m). Quaternary, Pliocene, Miocene and Oligocene deposits take part in the geological structure. Miocene and Oligocene deposits have been penetrated by prospecting and exploratory wells in the Kalamaddin and Garabagli areas. There are 25 mud volcanoes within the territory of the Lower Kura depression, which are a unique source of information about the hydrocarbon content at great depths. From the study of well data, solid erupted products and mud volcano fluids, and from the geological and thermal characteristics of the region, it was determined that the main phase of hydrocarbon generation (MK1-AK2) corresponds to a wide range of depths from 10 to 14 km, which corresponds to the Pliocene-Miocene sediments and to the 'oil and gas window' in the sense of R₀ ≈ 0.65–0.85%. Fluids of mud volcanoes comprise gas and water phases. The gas phase consists mainly of methane (99%), with heavy hydrocarbons (С2+ hydrocarbons), CO₂, N₂ and the inert components He and Ar. The content of С2+ hydrocarbons is elevated in the gases of mud volcanoes associated with oil deposits. The carbon isotopic composition of methane for the Lower Kura depression varies from -40 ‰ to -60 ‰. Waters of mud volcanoes are represented by all four genetic types; however, the most typical are waters of the HCN type. According to the Mg-Li geothermometer, the formation of mud waters corresponds to a temperature range from 20 °C to 140 °C (PC2). In the solid products emitted by mud volcanoes, 90 minerals and 30 trace elements have been identified. As a result of the geochemical investigation of thermobaric and geological conditions and of the oil and gas generation zone, the petroleum prospects of the Lower Kura depression are projected to depths greater than 10 km.
Keywords: geology, geochemistry, mud volcanoes, petroleum potential
Procedia PDF Downloads 366
2425 Structural Inequality and Precarious Workforce: The Role of Labor Laws in Destabilizing the Labor Force in Iran
Authors: Iman Shabanzadeh
Abstract:
Over the last three decades, the main demands of the Iranian workforce have focused on three areas: "the right to a decent wage", "the right to organize" and "the right to job security". In order to investigate and analyze this situation, the present study focuses on the component of job security. The purpose of the study is to identify which mechanisms in Iran's Labor Law have led to the destabilization and undermining of workers' job security. The research method is descriptive-analytical. To collect information, library and documentary sources on laws related to labor rights in Iran, as well as semi-structured interviews with experts, were used. In the data analysis stage, the qualitative content analysis method was used. Trend analysis of statistics on the labor force situation in Iran over the last three decades shows that the employment structure has faced an increase in the active population; in the last decade, however, a large part of this population has been active mainly in the service sector and in contract-free enterprises, so a smaller share of this employment has insurance coverage and a larger share is underemployed. In this regard, the results of this study show that four contexts constitute the main legal and executive mechanisms of labor instability in Iran: 1) temporaryization of the labor force through differing interpretations of the labor law, 2) labor adjustment in the public sector and the emergence of manpower contracting companies, 3) the withdrawal of labor law protection from workers in small workshops, and 4) the existence of numerous restrictions on the effective organization of workers. The theoretical conclusion of this article is that the main root of the challenges of the labor community and the destabilized workforce in Iran is the existence of structural inequalities in the field of labor security, traces of which can be seen in the legal provisions and executive regulations of this field.
Keywords: inequality, precariat, temporaryization, labor force, labor law
Procedia PDF Downloads 61
2424 Action Potential of Lateral Geniculate Neurons at Low Threshold Currents: Simulation Study
Authors: Faris Tarlochan, Siva Mahesh Tangutooru
Abstract:
The lateral geniculate nucleus (LGN) is the relay center in the visual pathway: it receives most of its input from retinal ganglion cells (RGC) and sends it on to the visual cortex. Low-threshold calcium currents (IT) at the membrane are a unique indicator for characterizing the firing functionality that LGN neurons gain from RGC input. Morphologies of the LGN neurons were developed according to LGN functional requirements, such as the functional mapping of RGC to LGN. In neurological disorders like glaucoma, the mapping between RGC and LGN is disconnected, and hence stimulating the LGN electrically using deep brain electrodes can restore LGN functionality. A computational model was developed to simulate LGN neurons with three predominant morphologies, each representing a different functional mapping of RGC to LGN. The firing of action potentials at the LGN neuron due to IT was characterized by varying the stimulation parameters, morphological parameters and orientation. A wide range of stimulation parameters (stimulus amplitude, duration and frequency) represents the various strengths of electrical stimulation, combined with different morphological parameters (soma size, dendrite size and structure). The orientation (0–180°) of the LGN neuron with respect to the stimulating electrode represents the angle at which extracellular deep brain stimulation of the LGN neuron is performed. A reduced dendrite structure, obtained with the Bush–Sejnowski algorithm, was used in the model to decrease computational time while conserving the input resistance and total surface area. The major finding is that an input potential of 0.4 V is required to produce an action potential in an LGN neuron placed at a distance of 100 µm from the electrode. From this study, it can be concluded that neuroprostheses under design would need to be capable of inducing at least 0.4 V to produce action potentials in the LGN.
Keywords: Lateral Geniculate Nucleus, visual cortex, finite element, glaucoma, neuroprostheses
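The headline figure above (0.4 V at 100 µm) concerns the extracellular potential the electrode imposes at the neuron. The paper's finite-element model is far richer, but the zeroth-order textbook relation for a monopolar point source in a homogeneous medium, V = ρI/(4πr), already shows how the required current scales with electrode-neuron distance. The resistivity value in this sketch is an assumed typical gray matter figure, not taken from the paper:

```python
import math

def extracellular_potential(i_amp, r_m, rho_ohm_m=3.0):
    """Extracellular potential [V] of a monopolar point-source electrode
    delivering current i_amp [A] at distance r_m [m] in a homogeneous
    medium of resistivity rho_ohm_m [ohm*m] (assumed value).
    V = rho * I / (4 * pi * r)"""
    return rho_ohm_m * i_amp / (4.0 * math.pi * r_m)
```

Doubling the electrode-neuron distance halves the imposed potential, so neurons farther from the electrode need proportionally more current to reach a given activation threshold.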
Procedia PDF Downloads 279
2423 Human-Machine Cooperation in Facial Comparison Based on Likelihood Scores
Authors: Lanchi Xie, Zhihui Li, Zhigang Li, Guiqiang Wang, Lei Xu, Yuwen Yan
Abstract:
Image-based facial features can be classified into category recognition features and individual recognition features. Current automated face recognition systems extract a feature vector of a specific dimension from a facial image according to their pre-trained neural network. However, to improve the efficiency of parameter calculation, such algorithms generally reduce image detail by pooling. This operation overlooks details of great concern to forensic experts. In our experiment, we adopted a variety of face recognition algorithms based on deep learning and compared a large number of naturally collected face images with known frontal ID photos of the same persons. Downscaling and manual handling were performed on the testing images. The results supported the view that facial recognition algorithms based on deep learning detect structural and morphological information and rarely focus on specific markers such as stains and moles. Overall performance, the distributions of genuine and impostor scores, and likelihood ratios were tested to evaluate the accuracy of the biometric systems and of the forensic experts. Experiments showed that the biometric systems were skilled in distinguishing category features, while forensic experts were better at discovering the individual features of human faces. In the proposed approach, fusion was performed at the score level. At the specified false accept rate, the framework achieved a lower false reject rate. This paper contributes to improving the interpretability of the objective method of facial comparison and provides a novel method for human-machine collaboration in this field.
Keywords: likelihood ratio, automated facial recognition, facial comparison, biometrics
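The score-level fusion and error rates mentioned above can be sketched as follows. The weighted-sum rule and the weight are assumptions for illustration, since the abstract does not state the fusion function used:

```python
def fuse_scores(machine_score, expert_score, w=0.5):
    """Weighted-sum score-level fusion of an automated system's score and
    a forensic expert's score (the rule and weight w are illustrative)."""
    return w * machine_score + (1.0 - w) * expert_score

def far_frr(genuine, impostor, threshold):
    """False accept rate (impostor scores accepted) and false reject rate
    (genuine scores rejected) for an accept-if-score>=threshold rule."""
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr
```

Sweeping the threshold trades FAR against FRR; the paper's claim is that at a fixed FAR the fused scores give a lower FRR than either source alone.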
Procedia PDF Downloads 130
2422 The Pioneering Model in Teaching Arabic as a Mother Tongue through Modern Innovative Strategies
Authors: Rima Abu Jaber Bransi, Rawya Jarjoura Burbara
Abstract:
This study deals with two pioneering approaches to teaching Arabic as a mother tongue: first, the computerization of literary and functional texts in the mother tongue; second, a pioneering model for teaching writing skills through computerization. The significance of the study lies in its treatment of a serious problem faced in the era of technology: the widening gap between pupils and their mother tongue. The innovation of the study is that it introduces modern methods and tools and a pioneering instructional model that turns mother tongue teaching into an effective, meaningful, interesting and motivating experience. In view of Arabic diglossia, standard Arabic versus spoken Arabic, which poses a serious problem for pupils in understanding unfamiliar words, and in order to bridge the gap between pupils and their mother tongue, we resorted to computerized techniques: we took texts from the pre-Islamic period (Jahiliyya), starting with the Mu'allaqa of Imru' al-Qais, together with other selected functional texts, and computerized them for teaching in an interesting way that saves time and effort, develops higher-order thinking strategies, expands literary taste among the pupils, and gives the text added values that neither the book, the blackboard, the teacher nor the worksheets provide. On the other hand, we have developed a pioneering computerized model that aims to develop the pupil's ability to think, to provide his imagination with the elements of growth, invention and connection, to motivate him to be creative, and to raise the level of his scores and scholastic achievements. The model consists of four basic stages of teaching, in the following order: 1. the preparatory stage, 2. the reading comprehension stage, 3. the writing stage, 4. the evaluation stage. Our lecture will give a detailed description of the model, with illustrations and samples from the units we built, highlighting aspects of the uniqueness and innovation specific to this model and the different integrated tools and techniques we developed. One of the most significant conclusions of this research is that teaching languages through new computerized strategies is very likely to move Arabic-speaking pupils out of the circle of passive reception into active and serious action and interaction. The study also argues that the computerized model of teaching can change the role of the pupil's mind from a short-term store of knowledge into a partner in producing knowledge and storing it coherently, preventing it from being forgotten and keeping it in memory for a long period of time. Consequently, the learners also become partners in evaluation by expressing their views, giving their notes and observations, and applying peer teaching and learning.
Keywords: classical poetry, computerization, diglossia, writing skill
Procedia PDF Downloads 225
2421 An Appraisal of Mitigation and Adaptation Measures under Paris Agreement 2015: Developing Nations' Pie
Authors: Olubisi Friday Oluduro
Abstract:
The Paris Agreement 2015, the result of negotiations under the United Nations Framework Convention on Climate Change (UNFCCC) after the expiration of the Kyoto Protocol, sets a long-term goal of limiting the increase in the global average temperature to well below 2 degrees Celsius above pre-industrial levels, and of pursuing efforts to limit this temperature increase to 1.5 degrees Celsius. An advancement on the erstwhile Kyoto Protocol, which set commitments for only a limited number of Parties to reduce their greenhouse gas (GHG) emissions, it includes the goal of increasing the ability to adapt to the adverse impacts of climate change and of making finance flows consistent with a pathway towards low GHG emissions. To achieve these goals, the Agreement requires all Parties to undertake efforts towards reaching a global peak of GHG emissions as soon as possible and towards achieving a balance between anthropogenic emissions by sources and removals by sinks in the second half of the twenty-first century. In addition to climate change mitigation, the Agreement aims at enhancing adaptive capacity, strengthening resilience and reducing vulnerability to climate change in different parts of the world. It acknowledges the importance of addressing loss and damage associated with the adverse effects of climate change. The Agreement also contains comprehensive provisions on support to be provided to developing countries, including finance, technology transfer and capacity building. To ensure that such support and actions are transparent, the Agreement contains a number of reporting provisions, requiring Parties to choose the efforts and measures that suit them best (Nationally Determined Contributions) and providing a mechanism for assessing progress and increasing global ambition over time through a regular global stocktake.
Despite the global reach of the Agreement, it has been fraught with manifold limitations that threaten its very capability to produce any meaningful result. Some of these obvious limitations were the very cause of the failure of its predecessor, the Kyoto Protocol, such as the non-participation of the United States and the non-payment of funds into the various coffers for strategic purposes, among others. These have left the developing countries, which are more vulnerable than the developed countries actually responsible for the climate change scourge, threatened even more. The paper examines the mitigation and adaptation measures under the Paris Agreement 2015, appraises the situation since the Agreement was concluded, ascertains whether the developing countries have been better or worse off since then, and examines why and how, while projecting a way forward in the present circumstances. It concludes with recommendations towards ameliorating the situation.
Keywords: mitigation, adaptation, climate change, Paris agreement 2015, framework
Procedia PDF Downloads 157
2420 Enhancing Higher Education Teaching and Learning Processes: Examining How Lecturer Evaluation Make a Difference
Authors: Daniel Asiamah Ameyaw
Abstract:
This research investigates how lecturer evaluation makes a difference in enhancing higher education teaching and learning processes. Two research questions guide the work: first, "What are the perspectives on the difference made by evaluating academic teachers in order to enhance higher education teaching and learning processes?" and second, "What are the implications of the findings for policy and practice?" Data for this research were collected mainly through interviews and partly through document review. Data analysis was conducted within the framework of grounded theory. The findings showed that at the individual lecturer level, lecturer evaluation provides continuous improvement of teaching strategies and serves as a source of data for research on teaching. At the individual student level, it enhances the students' learning process, serves as a source of information for course selection by students, and makes students feel recognised in the educational process. At the institutional level, lecturer evaluation is useful in personnel and management decision-making; it assures stakeholders of quality teaching and learning by setting up standards for lecturers, and it enables institutions to identify skill requirements and needs as a basis for organising workshops. Lecturer evaluation is useful at the national level in guaranteeing the competencies of graduates, who then provide the needed manpower of the nation. Besides, resource allocation to higher educational institutions is based largely on the quality of the programmes being run by the institution. The researcher concluded that the findings have implications for policy and practice; higher education managers are therefore expected to ensure that policy is implemented as planned by policy-makers so that the objectives can be successfully achieved.
Keywords: academic quality, higher education, lecturer evaluation, teaching and learning processes
Procedia PDF Downloads 143
2419 Evaluating the Water Balance of Sokoto Basement Complex to Address Water Security Challenges
Authors: Murtala Gada Abubakar, Aliyu T. Umar
Abstract:
A substantial part of Nigeria lies within the semi-arid areas of the world, underlain by basement complex (hard) rocks which are very poor in both the transmission and the storage of appreciable quantities of water. Recently, growing attention has been paid to the need to develop water resources in these areas, largely due to concerns about increasing droughts and the need to address water security challenges. While there is an ample body of knowledge that captures the hydrological behaviour of the sedimentary part, reported research that unambiguously illustrates the water distribution in the basement complex of the Sokoto basin remains sparse. The growing need to meet the water requirements of those living in this region necessitates accurate water balance estimations that can inform sustainable planning and development to address water security challenges for the area. To meet this task, a one-dimensional soil water balance model was developed and used to assess the state of water distribution within the Sokoto basin basement complex, using measured meteorological variables and information about the different landscapes within the complex. The model simulated soil water storage and the rates of input and output of water in response to climate, and to irrigation where applicable, using data from 2001 to 2010 inclusive. The results revealed areas within the Sokoto basin basement complex that are rich and deficient in groundwater resources. The high-potential areas identified include the fadama, the fractured rocks and the cultivated lands, while the low-potential areas are the sealed surfaces and non-fractured rocks. This study concludes that the modelling approach is a useful tool for assessing the hydrological behaviour and for better understanding the water resource availability within a basement complex.
Keywords: basement complex, hydrological processes, Sokoto Basin, water security
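A one-dimensional soil water balance of the kind described can be sketched as a single-bucket model. This is a generic illustration, not the authors' model; the field-capacity value and units (mm per day) are assumptions:

```python
def soil_water_balance(precip, pet, s_max=150.0, s0=0.0):
    """Daily single-bucket soil water balance (all values in mm).
    Storage fills with rain, empties by evapotranspiration (capped by the
    water actually available), and any excess over the capacity s_max
    becomes drainage/recharge."""
    s, storage, recharge = s0, [], []
    for p, e in zip(precip, pet):
        aet = min(e, s + p)            # actual ET limited by available water
        s = s + p - aet
        r = max(0.0, s - s_max)        # surplus drains as recharge
        s -= r
        storage.append(s)
        recharge.append(r)
    return storage, recharge
```

Recharge appears only once storage reaches capacity; run per landscape unit with its own parameters, such a model separates recharge-producing areas (e.g. fractured rock) from those that shed or seal out water.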
Procedia PDF Downloads 319
2418 Analysis of Relationship between Social Media Conversation and Mainstream Coverage to Mobilize Social Movement
Authors: Sakulsri Srisaracam
Abstract:
Social media has become an important source of information for the public and for the media profession. Some social issues raised on social media are picked up by journalists and reported on other platforms. This relationship between social media and mainstream media can sometimes drive public debate or stimulate social movements. The question to examine is in what situations social media conversations can raise awareness and stimulate change on public issues. This study addresses the communication patterns by which social media conversations drive covert issues into the mainstream media and lead to social advocacy movements. In methodological terms, the study findings are based on a content analysis of Facebook, Twitter, news websites and television media reports on three different case studies: saving Bryde's whale, protests against a government proposal to downsize the Office of Knowledge Management and Development in Thailand, and a dengue fever campaign. These case studies were chosen because they represent issues that most members of the public do not pay much attention to, yet social media conversations stimulated public debate and calls to action. This study found: 1) Collective social media conversations can stimulate public debate and encourage change at three levels: awareness, public debate, and action towards policy and social change. The level depends on the communication patterns of online users and on media coverage. 2) Patterns of communication have to be designed to combine social media conversations, online opinion leaders, mainstream media coverage, and calls to both online and offline action in order to motivate social change. These results suggest that social media is a powerful platform for collective communication and for setting the agenda on public issues for mainstream media. However, for social change to succeed, social media should be used to mobilize online movements to move offline too.
Keywords: public issues, mainstream media, social media, social movement
Procedia PDF Downloads 282
2417 Local Directional Encoded Derivative Binary Pattern Based Coral Image Classification Using Weighted Distance Gray Wolf Optimization Algorithm
Authors: Annalakshmi G., Sakthivel Murugan S.
Abstract:
This paper presents a local directional encoded derivative binary pattern (LDEDBP) feature extraction method that can be applied to the classification of submarine coral reef images. The classification of coral reef images using texture features is difficult due to dissimilarities among class samples. In coral reef image classification, texture features are extracted using the proposed method, the local directional encoded derivative binary pattern (LDEDBP). The proposed approach extracts the complete structural arrangement of the local region using the local binary pattern (LBP) and also extracts edge information using the local directional pattern (LDP) from the edge response available in a particular region, thereby achieving extra discriminative feature value. Typically, the LDP extracts edge details in all eight directions. Integrating edge responses with the local binary pattern yields a more robust texture descriptor than the other descriptors used in texture feature extraction methods. Finally, the proposed technique is applied to an extreme learning machine (ELM) with a meta-heuristic algorithm known as the weighted distance grey wolf optimizer (GWO) to optimize the input weights and biases of single-hidden-layer feed-forward neural networks (SLFN). In the empirical results, ELM-WDGWO demonstrated better performance in terms of accuracy on all coral datasets, namely RSMAS, EILAT, EILAT2, and MLC, compared with other state-of-the-art algorithms. The proposed method achieves the highest overall classification accuracy of 94% compared to the other state-of-the-art methods.
Keywords: feature extraction, local directional pattern, ELM classifier, GWO optimization
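The LBP building block used in the proposed LDEDBP descriptor can be illustrated in a few lines. This sketches only the basic 3x3 LBP operator, not the full LDEDBP or the LDP edge-response stage:

```python
def lbp_code(patch):
    """Basic 3x3 local binary pattern: threshold the 8 neighbours at the
    centre pixel's value and read them as an 8-bit code, clockwise from
    the top-left neighbour."""
    c = patch[1][1]
    neighbours = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
                  patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    code = 0
    for bit, v in enumerate(neighbours):
        code |= (v >= c) << bit        # set the bit where neighbour >= centre
    return code
```

Because each bit records only whether a neighbour is at least as bright as the centre, the code is invariant to monotonic illumination changes, which is part of what makes LBP-family descriptors robust for underwater textures.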
Procedia PDF Downloads 163
2416 The Effect of Taekwondo on Plantar Pressure Distribution and Arch Index
Authors: Maryam Kakavand, Samira Entezari, Sara Khoshjamalfekri, Raghad Mimar
Abstract:
The objective of this study is 1) to compare elite female and beginner taekwondo players in terms of plantar pressure distribution, vertical ground reaction force, contact area, mean pressure, and right and left longitudinal arches, and 2) to compare the preferred and non-preferred limbs among elite players. To the best of the authors' knowledge, no information has yet been available about plantar pressure distribution and the arch index among taekwondo players. Material and Methods: An analytical-comparative research method was applied; seven elite athletes and eight novice athletes were selected. The emed-C50 platform was used to assess plantar pressure distribution, vertical ground reaction force, contact area, mean pressure of different regions, and the plantar longitudinal arch in a second-step protocol. An independent t-test and a dependent t-test were used at a level of 0.05 to compare the right and left feet of elites and beginners, and the preferred and non-preferred limbs among elite athletes, respectively. Results: In comparing the right and left limbs of the elite and beginner groups, the findings indicate a significant difference only in the mean pressure of the first metatarsal of the right foot. The findings also showed a significant difference in the contact area of the toes 3, 4, 5 regions between elites' preferred and non-preferred limbs. However, no significant difference was found between the two groups' right and left limbs, or between elites' preferred and non-preferred limbs, in terms of pressure distribution, vertical ground reaction force, and arch index. Conclusion: It seems that taekwondo exercises have affected pressure distribution patterns among advanced players, causing some differences in their plantar pressure distribution pattern compared to that of beginners. Therefore, taekwondo exercises may be a factor contributing to asymmetric performance of the preferred and non-preferred limbs.
Keywords: plantar pressure, arch index, taekwondo, elite
Procedia PDF Downloads 154
2415 A Machine Learning Approach for Detecting and Locating Hardware Trojans
Authors: Kaiwen Zheng, Wanting Zhou, Nan Tang, Lei Li, Yuanhang He
Abstract:
The integrated circuit industry has become a cornerstone of the information society, finding widespread application in areas such as industry, communication, medicine, and aerospace. However, with the increasing complexity of integrated circuits, Hardware Trojans (HTs) implanted by attackers have become a significant threat to their security. In this paper, we propose a hardware trojan detection method for large-scale circuits. As HTs introduce additional redundant circuits and thus changes in physical characteristics such as structure, area, and power consumption, we propose a machine-learning-based hardware trojan detection method built on the physical characteristics of gate-level netlists. This method transforms hardware trojan detection into a machine-learning binary classification problem over physical characteristics, greatly improving detection speed. To address the problem of imbalanced data, where the number of HT circuit samples is far smaller than that of normal circuit samples, we used the SMOTETomek algorithm to expand the dataset and further improve the performance of the classifier. We used three machine learning algorithms, K-Nearest Neighbors, Random Forest, and Support Vector Machine, to train and validate on benchmark circuits from Trust-Hub, and all achieved good results. In our case studies based on AES encryption circuits provided by Trust-Hub, the test results showed the effectiveness of the proposed method. To further validate the method's effectiveness for detecting variant HTs, we designed variant HTs using open-source HTs. The proposed method guarantees robust detection accuracy with millisecond-level detection times for IC and FPGA design flows and has good detection performance for library variant HTs.
Keywords: hardware trojans, physical properties, machine learning, hardware security
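The class-rebalancing step can be pictured with a toy NumPy version of SMOTE, the oversampling half of the SMOTETomek algorithm the abstract names. This is a simplified sketch of our own (SMOTETomek additionally removes Tomek-link pairs after oversampling); the function name and parameters are not from the paper.

```python
import numpy as np

def smote_oversample(X_min, n_new, k=3, rng=None):
    """SMOTE-style oversampling: synthesize new minority-class samples by
    interpolating between a minority sample and one of its k nearest
    minority-class neighbors."""
    rng = np.random.default_rng(rng)
    X_min = np.asarray(X_min, dtype=float)
    n = len(X_min)
    # Pairwise distances within the minority class (brute force for clarity).
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    neighbors = np.argsort(d, axis=1)[:, :k]   # k nearest minority neighbors
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(n)                    # pick a minority sample
        j = neighbors[i, rng.integers(k)]      # and one of its neighbors
        gap = rng.random()                     # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(synthetic)
```

Each synthetic feature vector lies on the segment between two real minority samples, which is what lets the classifier see a denser, more balanced minority region.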
Procedia PDF Downloads 147
2414 Exploring Instructional Designs on the Socio-Scientific Issues-Based Learning Method in Respect to STEM Education for Measuring Reasonable Ethics on Electromagnetic Wave through Science Attitudes toward Physics
Authors: Adisorn Banhan, Toansakul Santiboon, Prasong Saihong
Abstract:
The Socio-Scientific Issues-Based Learning (SSIBL) method was compared with blended STEM-education instruction using a sample of 84 students in two classes at the 11th grade level of Sarakham Pittayakhom School. The two instructional models were delivered through five instructional lesson plans in the context of the electromagnetic wave topic. The research procedures assigned each instructional method to one of two groups: a 40-student experimental group taught with the STEM education (STEMe) design and a 40-student control group taught with the SSIBL method. Associations between students' learning achievements under each instructional method and their science attitudes toward physics were compared for the STEMe and SSIBL methods. The Measuring Reasonable Ethics Test (MRET) assessed students' reasonable ethics under the STEMe and SSIBL instructional designs in each group. A pretest-posttest technique was used to monitor and evaluate students' performance in reasonable ethics on the electromagnetic wave topic in the STEMe and SSIBL classes. Students were observed and gained experience with the phenomena being studied through the SSIBL model. This supports the view that STEM is not just teaching about science, technology, engineering, and mathematics; it is a culture that needs to be cultivated to help create a problem-solving, creative, critical-thinking workforce for tomorrow in physics. Students' attitudes were assessed with the Test Of Physics-Related Attitude (TOPRA), modified from the original Test Of Science-Related Attitude (TOSRA). Students' learning achievements under the two instructional methods, STEMe and SSIBL, were compared.
Associations between students' performances under the STEMe and SSIBL instructional designs, their reasonable ethics, and their science attitudes toward physics were examined. The findings show that the efficiency of the SSIBL and STEMe innovations met the criteria, with IOC values above the 80/80 standard level. Students' learning achievements in the control and experimental groups under SSIBL and STEMe differed statistically significantly at the .05 level. Comparing students' reasonable ethics under SSIBL and STEMe, students' responses to the instructional activities were higher in STEMe than in SSIBL. For the associations between students' later learning achievements and SSIBL and STEMe, the predictive efficiency values of R² indicate that 67% and 75% of the variance for SSIBL, and 74% and 81% for STEMe, were attributable to students' developing reasonable ethics and science attitudes toward physics, respectively.
Keywords: socio-scientific issues-based learning method, STEM education, science attitudes, measurement, reasonable ethics, physics classes
Procedia PDF Downloads 292
2413 Structural Protein-Protein Interactions Network of Breast Cancer Lung and Brain Metastasis Corroborates Conformational Changes of Proteins Lead to Different Signaling
Authors: Farideh Halakou, Emel Sen, Attila Gursoy, Ozlem Keskin
Abstract:
Protein–Protein Interactions (PPIs) mediate major biological processes in living cells. Studying PPIs as networks and analyzing the network properties contributes to the identification of genes and proteins associated with diseases. In this study, we created sub-networks of brain and lung metastasis from the primary tumor in breast cancer. To do so, we used seed genes known to cause metastasis and produced their interactions through a network-topology-based prioritization method named GUILDify. To provide experimental support for the sub-networks, we further curated them using the STRING database. We proceeded by modeling structures for the interactions lacking complex forms in the Protein Data Bank (PDB). The functional enrichment analysis shows that KEGG pathways associated with the immune system and infectious diseases, particularly the chemokine signaling pathway, are important for lung metastasis. On the other hand, pathways related to genetic information processing are more involved in brain metastasis. The structural analyses of the sub-networks vividly demonstrated their difference in terms of the specific interfaces used in lung and brain metastasis. Furthermore, the topological analysis identified genes such as RPL5, MMP2, CCR5 and DPP4, which are already known to be associated with lung or brain metastasis. Additionally, we found 6 and 9 putative genes that are specific to lung and brain metastasis, respectively. Our analysis suggests that variations in the genes and pathways contributing to these different breast metastasis types may arise from changes in the tissue microenvironment. To show the benefits of using structural PPI networks instead of the traditional node-and-edge representation, we inspect two case studies showing the mutual exclusiveness of interactions and the effects of mutations on protein conformation that lead to different signaling.
Keywords: breast cancer, metastasis, PPI networks, protein conformational changes
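The topological analysis mentioned above can be illustrated with the simplest network score, normalized degree centrality, computed over a PPI edge list. This is a toy sketch of our own, not the GUILDify prioritization used in the study; the gene names in the example are just symbols from the abstract.

```python
from collections import defaultdict

def degree_centrality(edges):
    """Normalized degree centrality over an undirected edge list:
    degree of each node divided by (n - 1), a simple score for
    prioritizing hub genes in a PPI network."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    n = len(adj)
    return {node: len(neigh) / (n - 1) for node, neigh in adj.items()}
```

Hub nodes (here, a gene interacting with every other node scores 1.0) are the first candidates that topology-based prioritization methods surface.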
Procedia PDF Downloads 244
2412 Landcover Mapping Using Lidar Data and Aerial Image and Soil Fertility Degradation Assessment for Rice Production Area in Quezon, Nueva Ecija, Philippines
Authors: Eliza. E. Camaso, Guiller. B. Damian, Miguelito. F. Isip, Ronaldo T. Alberto
Abstract:
Land-cover maps are important for many scientific, ecological and land management purposes, and during the last decades a rapid decrease in soil fertility has been observed due to land use practices such as rice cultivation. High-precision land-cover maps, which are important for economic management, are not yet available in the area. To assure accurate mapping of land cover, remote sensing is a very suitable tool for this task and for automatic land use and cover detection. The study not only provides high-precision land-cover maps but also provides estimates of the rice production area that has undergone chemical degradation due to fertility decline. Land cover was delineated and classified into pre-defined classes to achieve proper detection of features. After generation of the land-cover map of the high-intensity rice cultivation area, a soil fertility degradation assessment of the rice production area was created to assess the impact on soils used in agricultural production. Using simple spatial analysis functions and ArcGIS, the land-cover map of the Municipality of Quezon in Nueva Ecija, Philippines was overlaid on the fertility decline maps from the Land Degradation Assessment Philippines-Bureau of Soils and Water Management (LADA-Philippines-BSWM) to determine the area of rice crops where nitrogen, phosphorus, zinc and sulfur deficiencies were most likely induced by high dosages of urea and imbalanced N:P fertilization. The results found that 80.00% of fallow land and 99.81% of the rice production area show high soil fertility decline.
Keywords: aerial image, landcover, LiDAR, soil fertility degradation
Procedia PDF Downloads 252
2411 Experimental and Numerical Investigation of Fracture Behavior of Foamed Concrete Based on Three-Point Bending Test of Beams with Initial Notch
Authors: M. Kozłowski, M. Kadela
Abstract:
Foamed concrete is known for its low self-weight and excellent thermal and acoustic properties. For many years, it has been used worldwide as insulation for foundations and roof tiles, as backfill to retaining walls, as sound insulation, etc. In recent years, however, it has also become a promising material for structural purposes, e.g. for the stabilization of weak soils. Due to the favorable properties of foamed concrete, many studies have analyzed its strength and its mechanical, thermal and acoustic properties. However, these studies do not cover the investigation of fracture energy, which is the core factor governing damage and fracture mechanisms; only a limited number of publications can be found in the literature. This paper presents the results of an experimental investigation and a numerical campaign on foamed concrete based on three-point bending tests of beams with an initial notch. The first part of the paper presents the results of a series of static loading tests performed to investigate the fracture properties of foamed concrete of varying density. Beam specimens with dimensions of 100×100×840 mm with a central notch were tested in three-point bending. Subsequently, the remaining halves of the specimens, with dimensions of 100×100×420 mm, were tested again as un-notched beams in the same set-up with a reduced distance between supports. The tests were performed in a hydraulic displacement-controlled testing machine with a load capacity of 5 kN. Apart from measuring the load and mid-span displacement, the crack mouth opening displacement (CMOD) was monitored. Based on the load-displacement curves of the notched beams, the values of fracture energy and tensile stress at failure were calculated. The flexural tensile strength was obtained on un-notched beams with dimensions of 100×100×420 mm. Moreover, cube specimens of 150×150×150 mm were tested in compression to determine the compressive strength.
The second part of the paper deals with the numerical investigation of the fracture behavior of the beams with an initial notch presented in the first part. The Extended Finite Element Method (XFEM) was used to simulate and analyze the damage and fracture process. The influence of meshing and of the variation of mechanical properties on the results was investigated. The numerical models correctly simulate the behavior of the beams observed during three-point bending. The numerical results show that XFEM can be used to simulate different fracture toughnesses and fracture types of foamed concrete. Using XFEM and computer simulation technology allows for reliable approximation of the load-bearing capacity and damage mechanisms of beams made of foamed concrete, which provides a foundation for realistic structural applications.
Keywords: foamed concrete, fracture energy, three-point bending, XFEM
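The fracture-energy calculation from the notched-beam tests can be sketched with the standard work-of-fracture recipe: integrate the load-displacement curve and divide by the ligament area above the notch. This is a minimal sketch; the notch depth in the example is an assumption (the abstract does not state it), and self-weight corrections used in RILEM-style evaluations are omitted.

```python
import numpy as np

def fracture_energy(load_N, disp_m, width_m, depth_m, notch_m):
    """Work-of-fracture estimate of fracture energy G_F: area under the
    load-displacement curve of the notched beam divided by the ligament
    area above the notch. Self-weight terms are omitted in this sketch."""
    load = np.asarray(load_N, dtype=float)
    disp = np.asarray(disp_m, dtype=float)
    # Trapezoidal integration of the measured curve (work in joules).
    work = float(np.sum(0.5 * (load[1:] + load[:-1]) * np.diff(disp)))
    ligament = width_m * (depth_m - notch_m)   # m^2
    return work / ligament                     # J/m^2 = N/m
```

For the 100×100 mm cross-sections in the paper, `width_m=0.1` and `depth_m=0.1`; the notch depth would be taken from the actual specimen geometry.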
Procedia PDF Downloads 300
2410 A Comparative Analysis: Cultural Reflections of Mexicans in the United States and Turks in Germany
Authors: Gülşen Kocaevli
Abstract:
This paper aims to conduct a comparative analysis of the reflections of cultural elements such as language, festivals, and food in the cases of Turkish immigrants in Germany and Mexican immigrants in the United States within a historical perspective. These reflections will be studied by first giving background information on the migratory history of the two nations: Mexican immigration to the US and Turkish immigration to Germany, respectively. These two cases were chosen as the analytical subjects of this paper because both nations first migrated to the respective country to constitute a labor force, since there was a great need for labor due to several factors such as the loss of manpower after wars or revolutions. At the end of this comparative study, it is expected to be found that there are certain parallels between these two immigrant societies in the way they reflect their cultures in the receiving country, since both nations have a conventionalist nature that makes them more inclined to protect their cultures and less inclined to integrate into the society in which they live. Even though such integration might be realized in certain fields like economic status and exogamy, it does not cover all segments, nor is there any desire on the part of the receiving government to integrate the immigrants; rather, they make policies to assimilate them. This research paper uses a qualitative method fundamentally based on interpretative data drawn from several sociological and ethnographic studies conducted in the field. The primary and secondary sources of this paper cover academic books, journal articles, particularly those reporting interviews with immigrants, and certain governmental documents as well as published statistics regarding the subject of analysis.
By the use of the aforementioned methodology and resources, the conventionalist nature of the two immigrant nations is presented as the unifying factor in the way that Mexicans in the US and Turks in Germany reflect and protect their cultures in the form of language, festivals, and food.
Keywords: assimilation, culture, German-Turks, immigration, Mexican Americans
Procedia PDF Downloads 170
2409 Inflation and Unemployment Rates as Indicators of the Transition European Union Countries Monetary Policy Orientation
Authors: Elza Jurun, Damir Piplica, Tea Poklepović
Abstract:
Numerous studies carried out in developed western democratic countries have shown that the ideological framework of the governing party has a significant influence on monetary policy. An executive authority consisting of a left-wing party gives a higher weight to unemployment suppression, and the central bank implements a more expansionary monetary policy. On the other hand, a right-wing governing party considers monetary stability to be more important than unemployment suppression, and in such a political framework the main macroeconomic objective becomes the reduction of the inflation rate. The political framework conditions in the transition countries that are new European Union (EU) members are still highly specific in relation to the other EU member countries. The focus of this paper is the question of whether the same monetary policy principles that apply in developed western democratic EU member countries are valid in these transition countries as well. The database consists of the inflation rate and unemployment rate for 11 transition EU member countries covering the period from 2001 to 2012. The essential information for each of these 11 countries, and for each year of the observed period, is the right- or left-wing political orientation of the ruling party. In this paper, we use t-statistics to test our hypothesis that there are differences in inflation and unemployment between right and left political orientations of the governing party. To explore the influence of different countries, across years and political orientations, descriptive statistics are used. Inflation and unemployment should be strongly negatively correlated through time, which is tested using the Pearson correlation coefficient.
Depending on whether the governing authority consists of left- or right-wing politically oriented parties, the monetary authorities will adjust their policy, setting a higher priority on lower inflation or on unemployment reduction.
Keywords: inflation rate, monetary policy orientation, transition EU countries, unemployment rate
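The two statistics the abstract relies on, a t-test for the left/right difference in means and a Pearson correlation between inflation and unemployment, can be sketched in a few lines of NumPy. This is an illustrative sketch (Welch's unequal-variance form of the t-statistic is used here; the paper does not specify which variant it applies).

```python
import numpy as np

def welch_t(a, b):
    """Welch's t-statistic for the difference in means between two samples,
    e.g. inflation rates under left- vs right-wing governments."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return (a.mean() - b.mean()) / se

def pearson_r(x, y):
    """Pearson correlation coefficient, here between the inflation and
    unemployment series of a country over time."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(np.sum(xc * yc) / np.sqrt(np.sum(xc**2) * np.sum(yc**2)))
```

The t-statistic would then be compared against the Student-t critical value for the chosen significance level, and a strongly negative `pearson_r` would support the expected inflation-unemployment trade-off.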
Procedia PDF Downloads 440
2408 Feature Evaluation Based on Random Subspace and Multiple-K Ensemble
Authors: Jaehong Yu, Seoung Bum Kim
Abstract:
Clustering analysis can facilitate the extraction of intrinsic patterns in a dataset and reveal its natural groupings without requiring class information. For effective clustering analysis in high-dimensional datasets, unsupervised dimensionality reduction is an important task. Unsupervised dimensionality reduction can generally be achieved by feature extraction or feature selection. In many situations, feature selection methods are more appropriate than feature extraction methods because of their clear interpretation with respect to the original features. Unsupervised feature selection can be categorized into feature subset selection and feature ranking methods; we focus on unsupervised feature ranking methods, which evaluate features based on importance scores. Recently, several unsupervised feature ranking methods have been developed based on ensemble approaches to achieve higher accuracy and stability. However, most ensemble-based feature ranking methods require the true number of clusters. Furthermore, these algorithms evaluate feature importance depending on the ensemble clustering solution, and they produce undesirable evaluation results if the clustering solutions are inaccurate. To address these limitations, we propose an ensemble-based feature ranking method with random subspace and multiple-k ensemble (FRRM). The proposed FRRM algorithm evaluates the importance of each feature with the random subspace ensemble, and all evaluation results are combined into ensemble importance scores. Moreover, through the use of the multiple-k ensemble idea, FRRM does not require the true number of clusters to be determined in advance. Experiments on various benchmark datasets were conducted to examine the properties of the proposed FRRM algorithm and to compare its performance with that of existing feature ranking methods.
The experimental results demonstrated that the proposed FRRM outperformed the competitors.
Keywords: clustering analysis, multiple-k ensemble, random subspace-based feature evaluation, unsupervised feature ranking
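The random-subspace, multiple-k loop can be sketched as follows. Note this is only a sketch in the spirit of FRRM: the ensemble structure (random feature subsets, random k, averaged scores) follows the abstract, but the per-feature importance score used below (between-cluster variance of the feature) is our own simplification, not the score defined in the paper.

```python
import numpy as np

def kmeans(X, k, iters=20, rng=None):
    """Minimal Lloyd's k-means; returns cluster labels."""
    rng = np.random.default_rng(rng)
    X = np.asarray(X, dtype=float)
    centers = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def frrm_like_ranking(X, n_rounds=30, ks=(2, 3, 4), subspace=0.5, rng=0):
    """FRRM-style ensemble ranking sketch: repeatedly cluster on a random
    feature subset with a randomly drawn k, credit each participating
    feature with the between-cluster variance of its values, and average."""
    rng = np.random.default_rng(rng)
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    scores, counts = np.zeros(p), np.zeros(p)
    for _ in range(n_rounds):
        feats = rng.choice(p, max(1, int(subspace * p)), replace=False)
        labels = kmeans(X[:, feats], int(rng.choice(ks)), rng=rng)
        for f in feats:
            col = X[:, f]
            within = sum(col[labels == c].var() * (labels == c).mean()
                         for c in np.unique(labels))
            scores[f] += col.var() - within   # between-cluster spread
            counts[f] += 1
    return scores / np.maximum(counts, 1)     # average importance per feature
```

Because every round draws both a fresh subspace and a fresh k, no single choice of the number of clusters dominates the final ranking, which is the point of the multiple-k idea.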
Procedia PDF Downloads 339
2407 Congruency of English Teachers’ Assessments Vis-à-Vis 21st Century Skills Assessment Standards
Authors: Mary Jane Suarez
Abstract:
A massive educational overhaul has taken place at the onset of the 21st century, addressing the mismatches between employability skills and the scholastic skills taught in schools. For a community to thrive in an ever-developing economy, the teaching of the skills necessary for job competencies should be realized by every educational institution. However, in harnessing 21st-century skills among learners, teachers, who often lack familiarity and thorough insight into the emerging 21st-century skills, are constrained both by the need to comprehend the characteristics of 21st-century skills learning and by the requirement to implement the tenets of 21st-century skills teaching. In the endeavor to espouse 21st-century skills learning and teaching, a United States-based national coalition called the Partnership for 21st Century Skills (P21) has identified the four most important skills in 21st-century learning: critical thinking, communication, collaboration, and creativity and innovation, with an established framework for 21st-century skills standards. Assessment of skills is the lifeblood of every teaching and learning encounter. It is correspondingly crucial to look at the 21st-century standards and the assessment guides recognized by P21 to ensure that learners are 21st-century ready. This mixed-method study sought to discover and describe the classroom assessments used by English teachers in a public secondary school in the Philippines with course offerings in science, technology, engineering, and mathematics (STEM). The research evaluated the assessment tools implemented by English teachers and how these tools were congruent with the 21st-century assessment standards of P21. A convergent parallel design was used to analyze assessment tools and practices in four phases.
In the data-gathering phase, survey questionnaires, document reviews, interviews, and classroom observations were used to gather quantitative and qualitative data simultaneously and to assess how assessment tools and practices were consistent with the P21 framework with the four Cs as its foci. In the analysis phase, the data were treated using mean, frequency, and percentage. In the merging and interpretation phases, a side-by-side comparison was used to identify convergent and divergent aspects of the results. In conclusion, the results revealed assessment tools and practices that were inconsistently used, if used at all, by teachers. Findings showed inconsistencies in implementing authentic assessments, a scarcity of rubrics for critically assessing 21st-century skills in both language and literature subjects, incongruencies in using portfolio and self-reflective assessments, an exclusion of intercultural aspects in assessing the four Cs, and a lack of integration of collaboration into formative and summative assessments. As a recommendation, a harmonized assessment scheme for P21 skills was fashioned for teachers to plan, implement, and monitor classroom assessments of 21st-century skills, ensuring the alignment of such assessments with P21 standards and furthering the institution's thrust to effectively integrate 21st-century skills assessment standards into its curricula.
Keywords: 21st-century skills, 21st-century skills assessments, assessment standards, congruency, four Cs
Procedia PDF Downloads 193
2406 Predictive Semi-Empirical NOx Model for Diesel Engine
Authors: Saurabh Sharma, Yong Sun, Bruce Vernham
Abstract:
Accurate prediction of NOx emissions is a continuous challenge in the field of diesel engine-out emission modeling. Performing experiments for every condition and scenario costs a significant amount of money and man-hours; therefore, a model-based development strategy has been implemented to address that issue. NOx formation is highly dependent on the burned-gas temperature and the O2 concentration inside the cylinder. Current empirical models are developed by calibrating parameters representing the engine operating conditions against the measured NOx. This limits the prediction of purely empirical models to the region where they have been calibrated. An alternative solution is presented in this paper, which focuses on the utilization of in-cylinder combustion parameters to form a predictive semi-empirical NOx model. The outcome of this work is a fast and predictive NOx model built from physical parameters and empirical correlations. The model is developed based on steady-state data collected over the entire operating region of the engine and on a predictive combustion model developed in Gamma Technologies (GT)-Power using the Direct Injected (DI)-Pulse combustion object. In this approach, the temperature in both the burned and unburned zones is considered during the combustion period, i.e. from Intake Valve Closing (IVC) to Exhaust Valve Opening (EVO). The oxygen concentration consumed in the burned zone and the trapped fuel mass are also considered in the reported model. Several statistical methods are used to construct the model, including individual machine learning methods and ensemble machine learning methods. A detailed validation of the model on multiple diesel engines is reported in this work. A substantial number of cases are tested for different engine configurations over a large span of speed and load points.
Different sweeps of operating conditions such as Exhaust Gas Recirculation (EGR), injection timing, and Variable Valve Timing (VVT) are also considered for the validation. The model shows very good predictability and robustness at both sea level and altitude under different ambient conditions. The various advantages, such as high accuracy and robustness at different operating conditions, low computational time, and the lower number of data points required for calibration, establish a platform where the model-based approach can be used for the engine calibration and development process. Moreover, this work is aimed at establishing a framework for future model development for various other targets such as soot, Combustion Noise Level (CNL), NO2/NOx ratio, etc.
Keywords: diesel engine, machine learning, NOₓ emission, semi-empirical
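The shape of such a semi-empirical correlation can be sketched with an ordinary-least-squares fit. The regressor set below is our assumption, motivated by Zeldovich-type thermal-NO kinetics (exponential in burned-gas temperature, power-law in O2); it is not the paper's actual model, which combines several physical parameters and machine learning methods.

```python
import numpy as np

def fit_semi_empirical_nox(T_burn_K, o2_frac, nox_meas):
    """Fit  ln(NOx) = a + b / T_burn + c * ln(O2)  by ordinary least
    squares, a minimal semi-empirical correlation linking NOx to the
    burned-gas temperature and in-cylinder O2 concentration."""
    T = np.asarray(T_burn_K, dtype=float)
    o2 = np.asarray(o2_frac, dtype=float)
    A = np.column_stack([np.ones_like(T), 1.0 / T, np.log(o2)])
    coef, *_ = np.linalg.lstsq(A, np.log(np.asarray(nox_meas, float)),
                               rcond=None)
    return coef  # (a, b, c)

def predict_nox(coef, T_burn_K, o2_frac):
    """Evaluate the fitted correlation at new operating points."""
    a, b, c = coef
    T = np.asarray(T_burn_K, dtype=float)
    return np.exp(a + b / T + c * np.log(np.asarray(o2_frac, float)))
```

Because the regressors are physical quantities from the combustion model rather than raw operating-point settings, a fit of this form can extrapolate somewhat beyond the calibrated region, which is the central advantage the abstract claims for the semi-empirical approach.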
Procedia PDF Downloads 114
2405 Reconstruction of Wujiaochang Plaza: A Potential Avenue Towards Sustainability
Authors: Caiwei Chen, Jianhao Li, Jiasong Zhu
Abstract:
The reform and opening-up stimulated an economic and technological take-off in China while resulting in massive urbanization and motorization. The Wujiaochang area was designated a secondary business district in Shanghai to meet growing demand, with the reconstruction of Wujiaochang Plaza in 2005 being a milestone of this intended urban renewal. Wujiaochang is now an economically dynamic area that, in transportation terms, provides much larger traffic and transit capacity. However, this rebuilding has completely changed the face of the district. It is, therefore, appropriate to evaluate its impact on neighborhoods and communities while assessing the overall sustainability of such an operation. In this study, via an online questionnaire survey among local residents and daily visitors, we assess perceptions of the reconstruction of Wujiaochang Plaza and its estimated impact. We then compare these results with the answers of 62 local residents to a questionnaire collected on paper. The analysis of our data, along with observation and other forms of information such as map analysis and online applications (Dianping), demonstrates major improvement in economic sustainability but also significant losses in environmental sustainability, especially in terms of active transportation. From the social viewpoint, local residents' opinions tend to be rather positive, especially regarding traffic safety and access to consumption, despite the lack of connectivity and the radical changes induced by Wujiaochang's massive transformations. In general, our investigation exposes the overall positive outcomes of the Wujiaochang Plaza reconstruction but also unveils major drawbacks, especially in terms of soft mobility and traffic fluidity.
We believe that our approach could be of great help for future major urban interventions, as such approaches to municipal regeneration are widely implemented in Chinese cities and yet still need to be thoroughly assessed in terms of sustainability.
Keywords: China's reform and opening-up, economical revitalization, neighborhood identity, sustainability assessment, urban renewal
Procedia PDF Downloads 237
2404 The Role of Pharmacist in The Community: A Study of Methanol Toxicity Disaster in Tripoli Libya During March 2013
Authors: Abdurrauf M. Gusbi, Mahmud H. Arhima, Abdurrahim A. Elouzi, Ebtisam A. Benomran, Salsabeela Elmezwghi, Aram Elhatan, Nafesa Elgusbi
Abstract:
Mass poisonings with methanol are rare but occur regularly in both developed and developing countries. As a result of the tragedy that happened in the city of Tripoli, Libya in March 2013, a number of patients were admitted to Tripoli Medical Center and Tripoli Central Hospital suffering from poisoning following the ingestion of methanol by mistake. Our aim was to collect as much information as possible about those cases from the archiving departments of the two hospitals, including the number of cases admitted, recovered patients, and deceased victims. This retrospective study was planned to find out the reasons that allowed those patients to drink methanol in our Muslim community, and also the role of the pharmacist in preventing such a disaster, which claimed the lives of many people. During this tragedy, 291 hospitalized patients, aged between 16 and 32 years, were admitted to both hospitals; 189 died in total (121 at Tripoli Medical Center and 68 at Tripoli Central Hospital). Demographic data also show that most of them were male (97%, with 3% female); about 4% of the patients were foreigners and 96% were Libyans. There were many obstacles and poor facilities at the time of patient admission, as recognized in many cases, including the lack of first-line treatment. The mortality was high due to the lack of an antidote and of available dialysis machines at these two main hospitals in Tripoli. In addition, according to a survey of the medical staff and a random sample of medical students, about 28% had no idea about the first-aid procedure for methanol poisoning cases, due to the absence of continuing education for medical staff through the establishment of training courses on first aid, rapid diagnosis of poisoning, and following written procedures for dealing with such cases.
Keywords: ethanol, fomepizole, methanol, poisoning
Procedia PDF Downloads 364
2403 Characterization of Chest Pain in Patients Consulting to the Emergency Department of a Health Institution High Level of Complexity during 2014-2015, Medellin, Colombia
Authors: Jorge Iván Bañol-Betancur, Lina María Martínez-Sánchez, María de los Ángeles Rodríguez-Gázquez, Estefanía Bahamonde-Olaya, Ana María Gutiérrez-Tamayo, Laura Isabel Jaramillo-Jaramillo, Camilo Ruiz-Mejía, Natalia Morales-Quintero
Abstract:
Acute chest pain is a distressing sensation between the diaphragm and the base of the neck, and it represents a diagnostic challenge for any physician in the emergency department. Objective: To establish the main clinical and epidemiological characteristics of patients who presented with chest pain to the emergency department of a private clinic in the city of Medellin during 2014-2015. Methods: Cross-sectional retrospective observational study. The population and sample were patients who consulted for chest pain in the emergency department and met the eligibility criteria. The information was analyzed in the SPSS program v.21; qualitative variables were described through relative frequencies, and quantitative variables through means and standard deviations or medians, according to their distribution in the study population. Results: A total of 231 patients were evaluated; the mean age was 49.5 ± 19.9 years, and 56.7% were female. The most frequent pathological antecedents were hypertension (35.5%), diabetes (10.8%), dyslipidemia (10.4%), and coronary disease (5.2%). Regarding pain features, in 40.3% of the patients the pain began abruptly, in 38.2% it had a precordial location, in 20% of cases physical activity acted as a trigger, and in 60.6% it was oppressive. Costochondritis was the most common cause of chest pain among patients with an established etiologic diagnosis, representing 18.2% of cases. Conclusions: Although the clinical features of the pain reported coincide with the clinical presentation of an acute coronary syndrome, the most common cause of chest pain in the study population was costochondritis, indicating that it should be considered a differential diagnosis in the approach to patients with acute chest pain. Keywords: acute coronary syndrome, chest pain, epidemiology, osteochondritis
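The descriptive analysis described above (relative frequencies for qualitative variables; mean and standard deviation for quantitative ones) can be reproduced outside SPSS. A minimal sketch in Python, using synthetic records only — the values below are illustrative and are not the study's data:

```python
import statistics

# Illustrative patient records -- not the study data
ages = [62, 45, 38, 70, 51, 29, 55]
sex = ["F", "F", "M", "F", "M", "M", "F"]

mean_age = statistics.mean(ages)
sd_age = statistics.stdev(ages)            # sample standard deviation
pct_female = 100 * sex.count("F") / len(sex)  # relative frequency

print(f"age: {mean_age:.1f} ± {sd_age:.1f} years; female: {pct_female:.1f}%")
```

For skewed quantitative variables, `statistics.median(ages)` would be reported instead, matching the abstract's choice of medians "according to their distribution in the study population".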
Procedia PDF Downloads 343
2402 The Invaluable Contributions of Radiography and Radiotherapy in Modern Medicine
Authors: Sahar Heidary
Abstract:
Radiography and radiotherapy have emerged as crucial pillars of modern medical practice, revolutionizing diagnostics and treatment for a myriad of health conditions. This abstract highlights the pivotal role of radiography and radiotherapy for healthcare and society. Radiography, a non-invasive imaging technique, has significantly advanced medical diagnostics by enabling the visualization of internal structures and abnormalities within the human body. With the advent of digital radiography, clinicians can obtain high-resolution images promptly, leading to faster diagnoses and informed treatment decisions. Radiography is central to detecting fractures, tumors, infections, and various other conditions, allowing for timely interventions and improved patient outcomes. Moreover, its widespread accessibility and cost-effectiveness make it an indispensable tool in healthcare settings worldwide. Radiotherapy, in turn, a branch of medical science that utilizes high-energy radiation, has become an integral component of cancer treatment and management. By precisely targeting and damaging cancerous cells, radiotherapy offers a potent strategy to control tumor growth and, in many cases, leads to cancer eradication. Additionally, radiotherapy is often used in combination with surgery and chemotherapy, providing a multifaceted approach to combat cancer comprehensively. The continuous advancements in radiotherapy techniques, such as intensity-modulated radiotherapy and stereotactic radiosurgery, have further improved treatment precision while minimizing damage to surrounding healthy tissues. Furthermore, radiography and radiotherapy have demonstrated their worth beyond oncology. Radiography is instrumental in guiding various medical procedures, including catheter placement, joint injections, and dental evaluations, reducing complications and enhancing procedural accuracy.
Radiotherapy, for its part, finds applications in non-cancerous conditions such as benign tumors, vascular malformations, and certain neurological disorders, offering therapeutic options for patients who may not benefit from traditional surgical interventions. In conclusion, radiography and radiotherapy stand as indispensable tools in modern medicine, driving transformative improvements in patient care and treatment outcomes. Their ability to diagnose, treat, and manage a wide array of medical conditions underscores their value in medical practice. As technology continues to advance, radiography and radiotherapy will undoubtedly play an ever more significant role in shaping the future of healthcare, ultimately saving lives and enhancing the quality of life for countless individuals worldwide. Keywords: radiology, radiotherapy, medical imaging, cancer treatment
Procedia PDF Downloads 69
2401 Impact of Marine Hydrodynamics and Coastal Morphology on Changes in Mangrove Forests (Case Study: West of Strait of Hormuz, Iran)
Authors: Fatemeh Parhizkar, Mojtaba Yamani, Abdolla Behboodi, Masoomeh Hashemi
Abstract:
The mangrove forests are natural and valuable gifts that exist in some parts of the world, including Iran. Given the threats faced by these forests and their declining area all over the world, including in Iran, it is essential to manage and monitor them. The current study aimed to investigate changes in the mangrove forests, and the relationship between these changes and marine hydrodynamics and coastal morphology, in the area between Qeshm Island and the west coast of Hormozgan province (i.e., the coastline between the Mehran River and Bandar-e Pol port) over a 49-year period. After preprocessing and classifying satellite images using the SVM, MLC, and ANN classifiers and evaluating the accuracy of the resulting maps, the SVM approach, which had the highest accuracy (a Kappa coefficient of 0.97 and an overall accuracy of 98%), was selected for preparing the classification maps of all images. The results indicate that from 1972 to 1987 the area of these forests followed a declining trend, and in subsequent years their expansion began. These forests include the mangrove forests of the Khurkhuran wetland, the Muriz Deraz Estuary, the Haft Baram Estuary, the mangrove forest south of Laft Port, and the mangrove forests between the Tabl Pier, Maleki Village, and Gevarzin Village. The marine hydrodynamic and geomorphological characteristics of the region, such as the extent of the intertidal zone, sediment data, the freshwater inlet of the Mehran River, wave stability and calmness, and topography and slope, as well as mangrove conservation projects, make further expansion of the mangrove forests in this area possible. By providing significant and up-to-date information on the development and decline of mangrove forests in different parts of the coast, this study can significantly contribute to measures for the conservation and restoration of mangrove forests. Keywords: mangrove forests, marine hydrodynamics, coastal morphology, west of strait of Hormuz, Iran
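The accuracy figures cited for the classifiers (Kappa coefficient, overall accuracy) are standard metrics derived from a confusion matrix of classified pixels against reference samples. A minimal sketch of that computation, using a hypothetical 3-class confusion matrix (mangrove, water, bare land); the matrix values are illustrative, not the study's data:

```python
import numpy as np

def accuracy_metrics(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix
    (rows = reference classes, columns = predicted classes)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    observed = np.trace(cm) / n  # overall accuracy: correct / total
    # Chance agreement: product of marginal totals, summed over classes
    expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

# Hypothetical 3-class confusion matrix (mangrove, water, bare land)
cm = [[95, 3, 2],
      [2, 90, 8],
      [1, 4, 95]]
oa, kappa = accuracy_metrics(cm)
print(f"overall accuracy = {oa:.2%}, kappa = {kappa:.3f}")
```

Kappa discounts agreement expected by chance, which is why a map can show 98% overall accuracy but a slightly lower Kappa, as in the study's SVM result.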
Procedia PDF Downloads 95