Search results for: criteria of similarity
2991 Graph-Based Semantical Extractive Text Analysis
Authors: Mina Samizadeh
Abstract:
In the past few decades, there has been an explosion in the amount of available data produced from various sources on different topics. The availability of this enormous amount of data makes it necessary to adopt effective computational tools to explore it, which has led to intense and growing interest in the research community in developing computational methods for processing text data. One line of study focuses on condensing the text so that a higher level of understanding can be reached in a shorter time. The two important tasks here are keyword extraction and text summarization. In keyword extraction, we are interested in finding the key words of a text, which familiarizes us with its general topic. In text summarization, we are interested in producing a short text that includes the important information in the document. The TextRank algorithm, an unsupervised learning method that extends PageRank (the algorithm underlying the Google search engine for ranking pages), has shown its efficacy in large-scale text mining, especially for text summarization and keyword extraction. This algorithm can automatically extract the important parts of a text (keywords or sentences) and return them as the result. However, it neglects the semantic similarity between the different parts. In this work, we improve the results of the TextRank algorithm by incorporating the semantic similarity between parts of the text. Aside from keyword extraction and text summarization, we develop a topic clustering algorithm based on our framework, which can be used on its own or as part of generating the summary to overcome coverage problems. Keywords: keyword extraction, n-gram extraction, text summarization, topic clustering, semantic analysis
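The abstract above describes weighting TextRank's graph with semantic similarity between text units. As a rough illustration of that idea (not the authors' implementation), the sketch below builds a sentence graph whose edge weights are cosine similarities between embedding vectors and ranks sentences with PageRank; the `embed` function is a placeholder assumption for any sentence encoder.

```python
# Minimal sketch of similarity-weighted TextRank for extractive summarization.
# `embed(sentence)` is a placeholder for any sentence-embedding function.
import numpy as np
import networkx as nx

def summarize(sentences, embed, top_k=3):
    vectors = [embed(s) for s in sentences]
    graph = nx.Graph()
    graph.add_nodes_from(range(len(sentences)))
    for i in range(len(sentences)):
        for j in range(i + 1, len(sentences)):
            # cosine similarity as the semantic edge weight
            sim = float(np.dot(vectors[i], vectors[j]) /
                        (np.linalg.norm(vectors[i]) * np.linalg.norm(vectors[j]) + 1e-12))
            if sim > 0:
                graph.add_edge(i, j, weight=sim)
    scores = nx.pagerank(graph, weight="weight")  # TextRank = PageRank on the similarity graph
    ranked = sorted(scores, key=scores.get, reverse=True)[:top_k]
    return [sentences[i] for i in sorted(ranked)]  # keep original sentence order
```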
Procedia PDF Downloads 70
2990 Fuzzy Vehicle Routing Problem for Extreme Environment
Authors: G. Sirbiladze, B. Ghvaberidze, B. Matsaberidze
Abstract:
A fuzzy vehicle routing problem is considered in a possibilistic environment. A new criterion, maximization of the expected reliability of movement on closed routes, is constructed. The objective of the research is to implement a two-stage scheme for the solution of this problem. In the first stage, a sample of so-called “promising” routes is selected based on a preference algorithm. In the second stage, a new bi-criteria problem is solved for the selected promising routes: minimization of the total traveled distance and maximization of route reliability. The problem is stated as a fuzzy partitioning problem. Two possible solutions of this scheme are considered. Keywords: vehicle routing problem, fuzzy partitioning problem, multiple-criteria optimization, possibility theory
Procedia PDF Downloads 547
2989 Easymodel: Web-based Bioinformatics Software for Protein Modeling Based on Modeller
Authors: Alireza Dantism
Abstract:
Presently, describing the function of a protein sequence is one of the most common problems in biology. Usually, this problem can be facilitated by studying the three-dimensional structure of the protein. In the absence of an experimental structure, comparative modeling often provides a useful three-dimensional model of the protein that depends on at least one known protein structure. Comparative modeling predicts the three-dimensional structure of a given protein sequence (target) mainly based on its alignment with one or more proteins of known structure (templates). Comparative modeling consists of five main steps: (1) identification of similarity between the target sequence and at least one known template structure; (2) alignment of the target sequence and template(s); (3) building a model based on the alignment with the selected template(s); (4) prediction of model errors; and (5) optimization of the built model. There are many computer programs and web servers that automate the comparative modeling process. One of the most important advantages of these servers is that they make comparative modeling available to both experts and non-experts, who can easily do their own modeling without programming knowledge; some experts, however, prefer to do their modeling manually through programming because this lets them maximize the accuracy of the models. In this study, a web-based tool has been designed to predict the tertiary structure of proteins using the PHP and Python programming languages. This tool is called EasyModel. According to the user's inputs, EasyModel receives the unknown sequence of interest (the target), a protein sequence file for a template that shares a percentage of similarity with the target sequence, and related settings; it then predicts the tertiary structure of the unknown sequence and presents the results in the form of graphs and constructed protein files. Keywords: structural bioinformatics, protein tertiary structure prediction, modeling, comparative modeling, modeller
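For readers unfamiliar with the comparative-modeling step that tools like EasyModel automate, a minimal MODELLER script following the library's standard automodel example looks roughly like the sketch below; the alignment file and the template/target codes are placeholders, and EasyModel's own PHP/Python wrapper is not shown.

```python
# Minimal comparative-modeling sketch using MODELLER's Python API (standard automodel usage).
# 'target-template.ali', 'templateA', and 'target' are placeholder names.
from modeller import *
from modeller.automodel import *

env = environ()
env.io.atom_files_directory = ['.']           # where template PDB files live

a = automodel(env,
              alnfile='target-template.ali',  # target-template alignment (PIR format)
              knowns='templateA',             # code of the template structure
              sequence='target',              # code of the target sequence
              assess_methods=(assess.DOPE,))  # score models to help predict errors
a.starting_model = 1
a.ending_model = 5                            # build five candidate models
a.make()
```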
Procedia PDF Downloads 97
2988 The Concentration Analysis of CO2 Using ALOHA Code for Kuosheng Nuclear Power Plant
Authors: W. S. Hsu, Y. Chiang, H. C. Chen, J. R. Wang, S. W. Chen, J. H. Yang, C. Shih
Abstract:
Not only radioactive materials but also ordinary chemical materials stored in a power plant can pose a risk to nearby residents. In this research, the ALOHA code was used to perform the concentration analysis under CO2 storage burst or leakage conditions for the Kuosheng nuclear power plant (NPP). The Final Safety Analysis Report (FSAR) and related data were used in this study. Additionally, the analysis results of the ALOHA code were compared with the R.G. 1.78 failure criteria in order to confirm control room habitability. The comparison shows that the ALOHA result for the burst case was 0.923 g/m³, which was below the criterion, whereas the result for the leakage case was 11.3 g/m³. Keywords: BWR, ALOHA, habitability, Kuosheng
Procedia PDF Downloads 359
2987 Neural Graph Matching for Modification Similarity Applied to Electronic Document Comparison
Authors: Po-Fang Hsu, Chiching Wei
Abstract:
In this paper, we present a novel neural graph matching approach applied to document comparison. Document comparison is a common task in the legal and financial industries. In some cases, the most important differences may be the addition or omission of words, sentences, clauses, or paragraphs. However, it is a challenging task without a record or trace of the whole editing process. Under many temporal uncertainties, we explore the potential of our approach to approximate an accurate comparison and determine which element blocks have an editing relation with others. First, we apply a document layout analysis that combines traditional and modern techniques to segment layouts appropriately into blocks of various types. We then transform this issue into a problem of layout graph matching with textual awareness. Graph matching is a long-studied problem with a broad range of applications. However, unlike previous works focusing on visual images or structural layout, we also bring textual features into our model to adapt it to this domain. Specifically, based on the electronic document, we introduce an encoder to decode the visual presentation from the PDF. Additionally, because modifications can cause inconsistencies in document layout analysis between modified documents, and blocks can be merged and split, Sinkhorn divergence is adopted in our neural graph approach, which tries to overcome both of these issues with many-to-many block matching. We demonstrate this on two categories of layouts, legal agreements and scientific articles, collected from our real-case datasets. Keywords: document comparison, graph matching, graph neural network, modification similarity, multi-modal
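The Sinkhorn step mentioned above can be pictured as iterative row and column normalization of a block-similarity matrix into a soft, many-to-many assignment. A generic log-space sketch, independent of the authors' network and with illustrative hyperparameters, is:

```python
# Generic entropic Sinkhorn normalization of a block-similarity matrix into a soft assignment.
# `scores` is any (n_blocks_doc1 x n_blocks_doc2) similarity matrix; epsilon and n_iters are illustrative.
import torch

def sinkhorn(scores, epsilon=0.1, n_iters=50):
    log_k = scores / epsilon                      # kernel in log space
    log_u = torch.zeros(scores.shape[0])
    log_v = torch.zeros(scores.shape[1])
    for _ in range(n_iters):
        # alternate row and column scaling against uniform marginals (log-sum-exp for stability)
        log_u = -torch.logsumexp(log_k + log_v.unsqueeze(0), dim=1)
        log_v = -torch.logsumexp(log_k + log_u.unsqueeze(1), dim=0)
    # approximately doubly-normalized soft matching plan
    return torch.exp(log_k + log_u.unsqueeze(1) + log_v.unsqueeze(0))

plan = sinkhorn(torch.randn(5, 7))                # each column sums to ~1, rows are balanced
```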
Procedia PDF Downloads 179
2986 Projectification: Using Project Management Methodology to Manage the Academic Program Review
Authors: Adam Marks, Munir Majdalawieh, Maytha Al Ali
Abstract:
While the research literature is rich in criteria that could be included in academic program review processes, there is rarely any mention of how this significant and complex process should be managed. This paper proposes using project management methodology in alignment with the program review criteria of Dickeson's Prioritizing Academic Programs model. Project management and academic program review share two distinct characteristics: one is their life cycle, and the second is the core knowledge areas they use. This aligned and structured approach offers academic administrators a step-by-step guide that can help them manage this process and effectively assess academic programs. Keywords: project management, academic program, program review, education, higher education institution, strategic management
Procedia PDF Downloads 367
2985 Site Suitability of Offshore Wind Energy: A Combination of Geographic Referenced Information and Analytic Hierarchy Process
Authors: Ayat-Allah Bouramdane
Abstract:
Power generation from offshore wind energy does not emit carbon dioxide or other air pollutants and therefore plays a role in reducing greenhouse gas emissions from the energy sector. In addition, these systems are considered more efficient than onshore wind farms, as they generate electricity from the wind blowing across the sea, thanks to higher wind speeds and greater consistency in direction due to the absence of the physical interference that land or human-made objects can present. This means offshore installations require fewer turbines to produce the same amount of energy as onshore wind farms. However, offshore wind farms require more complex supporting infrastructure and, as a result, are more expensive to construct. In addition, higher wind speeds, rough seas, and accessibility issues make offshore wind farms more challenging to maintain. This study uses a combination of Geographic Referenced Information (GRI) and the Analytic Hierarchy Process (AHP) to identify the most suitable sites for offshore wind farm development in Morocco, with a particular focus on the city of Dakhla. A range of environmental, socio-economic, and technical criteria are taken into account to solve this complex Multi-Criteria Decision-Making (MCDM) problem. Based on experts' knowledge, a pairwise comparison matrix is built at each level of the hierarchy, and fourteen sub-criteria belonging to the main criteria have been weighted to generate the site suitability of offshore wind plants and obtain in-depth knowledge of unsuitable areas and of areas with low, moderate, high, and very high suitability. We find that wind speed is the most decisive criterion in offshore wind farm development, followed by bathymetry, while proximity to facilities, sediment thickness, and the remaining parameters show much lower weightings, rendering technical parameters the most decisive in offshore wind farm development projects. We also discuss the potential of other marine renewable energy sources in Morocco, such as wave and tidal energy. The proposed approach and analysis can help decision-makers and can be applied to other countries in order to support the site selection process of offshore wind farms. Keywords: analytic hierarchy process, dakhla, geographic referenced information, morocco, multi-criteria decision-making, offshore wind, site suitability
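The AHP weighting described above reduces to taking the principal eigenvector of each pairwise comparison matrix and checking Saaty's consistency ratio. A generic sketch follows; the 3x3 example matrix is illustrative only, not the study's expert judgments.

```python
# Generic AHP weight derivation: principal eigenvector of a pairwise comparison matrix
# plus Saaty's consistency ratio. The example matrix is illustrative only.
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(pairwise):
    pairwise = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)                   # principal eigenvalue
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()
    n = pairwise.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
    cr = ci / RI[n] if RI.get(n, 0) > 0 else 0.0  # consistency ratio (should be < 0.10)
    return weights, cr

# e.g., criterion A judged 3x as important as B and 5x as important as C
matrix = [[1,   3,   5],
          [1/3, 1,   2],
          [1/5, 1/2, 1]]
w, cr = ahp_weights(matrix)
```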
Procedia PDF Downloads 157
2984 A Mathematical Study of Magnetic Field, Heat Transfer and Brownian Motion of Nanofluid over a Nonlinear Stretching Sheet
Authors: Madhu Aneja, Sapna Sharma
Abstract:
The thermal conductivity of ordinary heat transfer fluids is not adequate to meet today's cooling rate requirements. Nanoparticles have been shown to increase the thermal conductivity and convective heat transfer of base fluids. One of the possible mechanisms for the anomalous increase in the thermal conductivity of nanofluids is the Brownian motion of the nanoparticles in the base fluid. In this paper, the natural convection of an incompressible nanofluid over a nonlinear stretching sheet in the presence of a magnetic field is studied. The flow and heat transfer induced by stretching sheets are important in the study of extrusion processes and are a subject of considerable interest in the contemporary literature. Appropriate similarity variables are used to transform the governing nonlinear partial differential equations into a system of nonlinear ordinary (similarity) differential equations. For computational purposes, the finite element method is used. The effective thermal conductivity and viscosity of the nanofluid are calculated by the Koo-Kleinstreuer-Li (KKL) correlation, in which the effect of Brownian motion on thermal conductivity is considered. The effects of the important parameters, i.e., the nonlinearity parameter, volume fraction, Hartmann number, and heat source parameter, on the velocity and temperature are studied. Skin friction and heat transfer coefficients are also calculated for the parameters concerned. Keywords: Brownian motion, convection, finite element method, magnetic field, nanofluid, stretching sheet
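The similarity-transformation workflow described above (PDEs reduced to a coupled ODE boundary value problem and then solved numerically) can be illustrated on the classical Blasius boundary-layer equation. This stand-in is far simpler than the paper's coupled magneto-nanofluid system, but the numerical pattern is the same.

```python
# Illustration of solving a similarity-transformed boundary-layer ODE numerically.
# The classical Blasius equation f''' + 0.5*f*f'' = 0 is used as a stand-in;
# the paper's coupled magneto-nanofluid system is more involved.
import numpy as np
from scipy.integrate import solve_bvp

def rhs(eta, y):
    f, fp, fpp = y
    return np.vstack([fp, fpp, -0.5 * f * fpp])

def bc(y0, yinf):
    # boundary conditions: f(0) = 0, f'(0) = 0, f'(inf) -> 1
    return np.array([y0[0], y0[1], yinf[1] - 1.0])

eta = np.linspace(0.0, 10.0, 200)      # truncated "infinity"
y_init = np.zeros((3, eta.size))
y_init[1] = eta / eta[-1]              # crude initial guess for f'
sol = solve_bvp(rhs, bc, eta, y_init)
print("f''(0) ≈", sol.sol(0.0)[2])     # classical Blasius value ≈ 0.332
```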
Procedia PDF Downloads 218
2983 Viscoelastic Modeling of Hot Mix Asphalt (HMA) under Repeated Loading by Using Finite Element Method
Authors: S. A. Tabatabaei, S. Aarabi
Abstract:
Predicting the response and performance of hot mix asphalt (HMA) is a challenging task because of the sensitivity of HMA to complex loading and environmental conditions. The behavior of HMA is a function of the loading temperature and also shows time- and rate-dependent behavior that directly affects the design criteria of the mixture; the velocity of the passing load determines the loading time and rate. Viscoelasticity describes the reaction of HMA under loading and environmental conditions such as temperature and moisture. This behavior has a direct effect on design criteria such as tensile strain and vertical deflection. In this paper, a computational framework for viscoelasticity and its implementation in a three-dimensional HMA model are introduced for use in the finite element method. The model was subjected to various repeated loading conditions at constant temperature. The viscoelastic response of HMA is investigated under loading conditions corresponding to vehicle speed, its sensitivity to the speed range is examined, and the results are compared with HMA assumed to behave elastically, as in conventional design methods. The results show the importance of the loading time pulse, the unloading time, and the various speeds for the design criteria. The importance of the fading memory of the material in storing strain and stress due to repeated loading is also shown. The model was simulated with the ABAQUS finite element package. Keywords: viscoelasticity, finite element method, repeated loading, HMA
Procedia PDF Downloads 398
2982 Dams Operation Management Criteria during Floods: Case Study of Dez Dam in Southwest Iran
Authors: Ali Heidari
Abstract:
This paper presents principles for improving flood mitigation operation in multipurpose dams and maximizing reservoir performance during flood occurrence, with a focus on the real-time operation of gated spillways. The criteria of operation include the safety of the dam during flood management, minimizing the downstream flood risk by decreasing the flood hazard, and fulfilling water supply and the other purposes of dam operation over mid- and long-term horizons. The parameters deemed important include flood inflow, outlet capacity restrictions, downstream flood inundation damages, the economic revenue of dam operation, and environmental and sedimentation restrictions. A simulation model was used to determine the real-time release of the Dez dam, located on the Dez River in southwest Iran, considering the gate regulation curves for the gated spillway. The results of the simulation model show that there is room to improve the current procedures used in the real-time operation of the dam, particularly by using gate regulation curves and early flood forecasting system results. The Dez dam operation data show that in one of the best flood control records, 17% of the total active volume and flood control pool of the reservoir was not used to decrease the downstream flood hazard, despite the availability of a flood forecasting system. Keywords: dam operation, flood control criteria, Dez dam, Iran
Procedia PDF Downloads 225
2981 Identification of a Lead Compound for Selective Inhibition of Nav1.7 to Treat Chronic Pain
Authors: Sharat Chandra, Zilong Wang, Ru-Rong Ji, Andrey Bortsov
Abstract:
Therapeutic approaches to chronic pain (CP) have limited efficacy. As a result, doctors prescribe opioids for chronic pain, leading to an epidemic of opioid overuse, abuse, and addiction. Therefore, the development of effective and safe CP drugs remains an unmet medical need. Voltage-gated sodium (Nav) channels are molecular targets for cardiovascular and neurological disorders. Selective Nav channel inhibitors are hard to design because there are nine closely related isoforms (Nav1.1-1.9) that share protein sequence segments. We target Nav1.7, which is found in the peripheral nervous system and is engaged in the perception of pain. The objective of this project was to screen a 1.5 million compound library to identify inhibitors of Nav1.7 with an analgesic effect. In this study, we designed a protocol for the identification of isoform-selective inhibitors of Nav1.7 by utilizing prior information on isoform-selective antagonists. First, a similarity search was performed; then the identified hits were docked into a binding site on the fourth voltage-sensor domain (VSD4) of Nav1.7. We used the FTrees tool for similarity searching and library generation; the generated library was docked in the VSD4 domain binding site using FlexX, and compounds were shortlisted using the FlexX score and SeeSAR HYDE scoring. Finally, the top 25 compounds were tested with molecular dynamics simulation (MDS). We reduced our list to 9 compounds based on the MDS root mean square deviation plots and obtained them from a vendor for in vitro and in vivo validation. Whole-cell patch-clamp recordings in HEK-293 cells and dorsal root ganglion neurons were conducted, using patch pipettes to record transient Na⁺ currents. One of the compounds reduced the peak sodium currents in a Nav1.7-HEK-293 stable cell line in a dose-dependent manner, with an IC50 value of 0.74 µM. In summary, our computer-aided analgesic discovery approach allowed us to develop a pre-clinical analgesic candidate with a significant reduction in time and cost. Keywords: chronic pain, voltage-gated sodium channel, isoform-selective antagonist, similarity search, virtual screening, analgesics development
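The dose-dependent block reported above (IC50 of 0.74 µM) is typically obtained by fitting normalized peak currents to a Hill (four-parameter logistic) equation. A generic fit of that kind is sketched below; the concentrations and responses are made-up example data, not the study's recordings.

```python
# Generic Hill-equation fit for an IC50 from normalized peak-current data.
# Concentrations and responses below are illustrative, not the study's measurements.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, ic50, hill_slope):
    # fraction of current remaining, bounded between 0 and 1
    return 1.0 / (1.0 + (conc / ic50) ** hill_slope)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])      # µM (example)
resp = np.array([0.98, 0.95, 0.85, 0.65, 0.42, 0.20, 0.08])  # I / I_max (example)

popt, pcov = curve_fit(hill, conc, resp, p0=[1.0, 1.0])
ic50, slope = popt
print(f"IC50 ≈ {ic50:.2f} µM, Hill slope ≈ {slope:.2f}")
```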
Procedia PDF Downloads 123
2980 Syntactic Analyzer for Tamil Language
Authors: Franklin Thambi Jose.S
Abstract:
Computational linguistics is a branch of linguistics that deals with language at the computational level; it can also be described as a branch of language studies that applies computer techniques to the field of linguistics. Within computational linguistics, natural language processing plays an important role; it came into existence because of the development of information technology. In computational syntax, the syntactic analyser breaks a sentence into phrases and clauses and annotates the sentence with syntactic information. Tamil is one of the major Dravidian languages and has a very long written history of more than 2000 years. It is mainly spoken in Tamil Nadu (in India), Sri Lanka, Malaysia, and Singapore, and it is an official language in these territories. In Malaysia, Tamil-speaking people are considered an ethnic group. For this research, Tamil sentences are classified into four types, namely: (1) main sentences, (2) interrogative sentences, (3) equational sentences, and (4) elliptical sentences. In computational syntax, the first step is to provide the required information regarding the head and its constituents for each sentence. This information is incorporated into the system using programming languages. The system can then analyse a given sentence with the criteria or mechanisms given to it. Providing the necessary criteria and mechanisms for the computer to identify the basic sentence types using a syntactic parser for the Tamil language is the major objective of this paper. Keywords: tamil, syntax, criteria, sentences, parser
Procedia PDF Downloads 517
2979 The Effect of Foundation on the Earth Fill Dam Settlement
Authors: Masoud Ghaemi, Mohammadjafar Hedayati, Faezeh Yousefzadeh, Hoseinali Heydarzadeh
Abstract:
Careful monitoring of earth dams to measure deformation caused by settlement and movement has always been a concern for engineers in the field. To measure the settlement and deformation of earth dams, the precision combined inclinometer and settlement instrument, commonly referred to as the IS instrument, is usually used. In some dams, because the alluvium is thick and its removal is not possible (technically, economically, and in terms of performance), the end of the IS instrument cannot be placed in the rock foundation. Inevitably, the pipes have to be installed in the weak and deformable alluvial foundation, which leads to errors in the calculation of the actual (absolute) settlement in different parts of the dam body. The purpose of this paper is to present new and refined criteria for predicting settlement and deformation in earth dams. The study is based on conditions at three dams with highly deformable alluvial foundations (Agh Chai, Narmashir, and Gilan-e Gharb) in order to provide settlement criteria that account for the alluvial foundation. To achieve this goal, the settlement of the dams was simulated using the finite difference method with the FLAC3D software, and the modeling results were then compared with the IS instrument readings. Finally, to calibrate the model and validate the results, regression analysis techniques were used and the modeling parameters were scrutinized against real conditions; then, using MATLAB and its Curve Fitting Toolbox, new settlement criteria based on the elasticity modulus, cohesion, friction angle, and density of the earth dam and the alluvial foundation were obtained. The results of these studies show that, by using the new criteria, the amount of settlement and deformation for dams with alluvial foundations can be corrected after the instrument readings, and the error in reading the IS instrument can be greatly reduced. Keywords: earth-fill dam, foundation, settlement, finite difference, MATLAB, curve fitting
Procedia PDF Downloads 195
2978 A Robust Spatial Feature Extraction Method for Facial Expression Recognition
Authors: H. G. C. P. Dinesh, G. Tharshini, M. P. B. Ekanayake, G. M. R. I. Godaliyadda
Abstract:
This paper presents a new spatial feature extraction method based on principal component analysis (PCA) and Fisher discriminant analysis (FDA) for facial expression recognition. It not only extracts reliable features for classification but also reduces the feature space dimensions of the pattern samples. In this method, each grayscale image is first considered in its entirety as the measurement matrix. Then, the principal components (PCs) of the row vectors of this matrix and the variance of these row vectors along the PCs are estimated. This ensures the preservation of the spatial information of the facial image. Afterwards, by incorporating the spectral information of the eigen-filters derived from the PCs, a feature vector is constructed for a given image. Finally, FDA is used to define a set of basis vectors in a reduced-dimension subspace such that optimal clustering is achieved. FDA defines an inter-class scatter matrix and an intra-class scatter matrix to enhance the compactness of each cluster while maximizing the distance between cluster marginal points. To match the test image with the training set, a cosine-similarity-based Bayesian classification is used. The proposed method was tested on the Cohn-Kanade database and the JAFFE database. It was observed that the proposed method, which incorporates spatial information to construct an optimal feature space, outperforms the standard PCA- and FDA-based methods. Keywords: facial expression recognition, principal component analysis (PCA), fisher discriminant analysis (FDA), eigen-filter, cosine similarity, bayesian classifier, f-measure
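As a simplified, generic counterpart to the pipeline described above (it omits the eigen-filter construction and the Bayesian formulation), a PCA plus Fisher discriminant projection with a cosine nearest-class-centroid decision could be sketched as follows; array names and the number of components are assumptions.

```python
# Simplified PCA + Fisher discriminant (LDA) pipeline with a cosine-similarity decision.
# X: flattened grayscale face images (n_samples x n_pixels), y: expression labels (numpy arrays).
# This omits the paper's eigen-filter construction and Bayesian step.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def fit(X, y, n_pca=100):
    pca = PCA(n_components=n_pca).fit(X)
    lda = LinearDiscriminantAnalysis().fit(pca.transform(X), y)
    Z = lda.transform(pca.transform(X))
    means = {c: Z[y == c].mean(axis=0) for c in np.unique(y)}  # class centroids
    return pca, lda, means

def predict(x, pca, lda, means):
    z = lda.transform(pca.transform(x.reshape(1, -1)))[0]
    cos = {c: z @ m / (np.linalg.norm(z) * np.linalg.norm(m) + 1e-12)
           for c, m in means.items()}
    return max(cos, key=cos.get)   # most similar class centroid wins
```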
Procedia PDF Downloads 425
2977 Polyhydroxybutyrate Production in Bacteria Isolated from Estuaries along the Eastern Coast of India
Authors: Shubhashree Mahalik, Dhanesh Kumar, Jatin Kumar Pradhan
Abstract:
Odisha is one of the coastal states situated in the eastern part of India, with a 480 km long coastline. Coastal Odisha is referred to as the "Gift of Six Rivers". Balasore, a major coastal district of Odisha, is bounded by the Bay of Bengal in the east and has a 26 km long seashore lined with several estuaries rich in biodiversity. Several studies have been carried out on the macro flora and fauna of this area, but very little documented information is available regarding its microbial biodiversity. In the present study, an attempt has been made to isolate and identify bacteria found along the estuaries of Balasore. Many marine microorganisms are sources of natural products, which makes them potential industrial organisms, so the ability of the isolated bacteria to secrete one such industrially significant product, polyhydroxybutyrate (PHB), has been elucidated. Several rounds of sampling, pure culturing, and morphological, biochemical, and phylogenetic screening led to the identification of two PHB-producing strains. Isolate 5 was identified as a Brevibacillus sp. with maximum similarity to Brevibacillus parabrevis (KX83268) and was named Brevibacillus sp. KEI-5. Isolate 8 was identified as a Lysinibacillus sp. with closest similarity to Lysinibacillus boronitolerans (KP314269) and was named Lysinibacillus sp. KEI-8. The media, temperature, carbon, nitrogen, and salinity requirements were optimized for both isolates. Submerged fermentation of both isolates in Terrific Broth medium supplemented with the optimized carbon and nitrogen sources at 37°C led to significant accumulation of PHB, as detected by a colorimetric method. Keywords: Bacillus, estuary, marine, Odisha, polyhydroxy butyrate
Procedia PDF Downloads 349
2976 An Integrated Intuitionistic Fuzzy Elimination Et Choix Traduisant La REalite (IFELECTRE) Model
Authors: Babak Daneshvar Rouyendegh
Abstract:
The aim of this study is to develop and describe a new methodology for Multi-Criteria Decision-Making (MCDM) problems using the Intuitionistic Fuzzy Elimination Et Choix Traduisant La REalite (IFELECTRE) model. The proposed model enables Decision-Makers (DMs) to carry out assessments using Intuitionistic Fuzzy Numbers (IFN). A numerical example is provided to demonstrate and clarify the proposed analysis procedure, and an empirical experiment is conducted to validate its effectiveness. Keywords: Decision-Makers (DMs), Multi-Criteria Decision-Making (MCDM), Intuitionistic Fuzzy Elimination Et Choix Traduisant La REalite (IFELECTRE), Intuitionistic Fuzzy Numbers (IFN)
Procedia PDF Downloads 678
2975 Isolation and Identification Fibrinolytic Protease Endophytic Fungi from Hibiscus Leaves in Shah Alam
Authors: Mohd Sidek Ahmad, Zainon Mohd Noor, Zaidah Zainal Ariffin
Abstract:
Fibrin degradation is an important part of the prevention or treatment of intravascular thrombosis and cardiovascular diseases. Plasmin-like fibrinolytic enzymes have given new hope to patients with cardiovascular diseases, since treating fibrin-aggregation-related diseases with traditional plasminogen activators has many side effects. Research has covered a wide range of sources for the production of fibrinolytic proteases, including bacteria, fungi, insects, and fermented foods, but few studies have looked into endophytic fungi as a potential source. Sixteen (16) endophytic fungi were isolated from Hibiscus sp. leaves from six different locations in Shah Alam, Selangor. Only two endophytic fungi, FH3 and S13, showed positive fibrinolytic protease activity. FH3 produced a 5.78 cm clear zone and S13 a 4.48 cm clear zone on skim milk agar after 4 days of incubation at 27°C. Fibrinolytic activity was observed as clear zones of 3.87 cm and 1.82 cm diameter on fibrin plates for FH3 and S13, respectively. 18S rRNA sequencing was carried out for the identification of the isolated fungi with positive fibrinolytic protease activity. S13 had the highest similarity (100%) to Penicillium citrinum strain TG2, and FH3 had the highest similarity (99%) to Fusarium sp. FW2PhC1, Fusarium sp. 13002, Fusarium sp. 08006, Fusarium equiseti strain Salicorn 8, and fungal sp. FCASAn-2. Variation of the media composition showed the effect of carbon and nitrogen on protease concentration: a 50% decrease in media composition caused a drastic decrease in the protease of FH3 from 1.081 to 0.056 and of S13 from 2.946 to 0.198. Keywords: isolation, identification, fibrinolytic protease, endophytic fungi, Hibiscus leaves
Procedia PDF Downloads 433
2974 Effect of Forging Pressure on Mechanical Properties and Microstructure of Similar and Dissimilar Friction Welded Joints (Aluminium, Copper, Steel)
Authors: Sagar Pandit
Abstract:
The present work focuses on the effect of various process parameters on the mechanical properties and microstructure of joints produced by continuous drive friction welding and linear friction welding. An attempt is made to investigate the feasibility of obtaining acceptable weld joints between similar as well as dissimilar components, and the microstructural changes of the good weld joints have been assessed using optical microscopy and scanning electron microscopy. The impact of forging pressure on the microstructure of the weld joint has been studied, and the variation in joint strength with varying forge pressure is analyzed. Weld joints were obtained for two pairs of dissimilar materials and one pair of similar materials, listed respectively as Al AA5083 and Cu C101 (dissimilar), 3000-series aluminium alloy and mild steel (dissimilar), and a high-nitrogen austenitic stainless steel pair (similar). Intermetallic phase formation was observed at the weld interface of the Al-Cu joint, which consequently harmed the properties of the joint (lower tensile strength). It was also concluded that an increase in forging pressure led to either an increase or a decrease in the tensile strength of the joint, depending on the similarity or dissimilarity of the components. The hardness was likewise observed to reach maximum as well as minimum values at the weld joint depending on the similarity or dissimilarity of the workpieces. It was also suggested that a higher forging pressure is needed to obtain complete joining in the formation of the weld joint. Keywords: forging pressure, friction welding, mechanical properties, microstructure
Procedia PDF Downloads 118
2973 The Application of Modern Technologies in Urban Development
Authors: Solotan A. Tolulope
Abstract:
Due to the lack of application of laws, implementers' limited acquaintance with the principles of urban planning, or the absence of laws and a governmental role, cities and their urban growth have developed beyond the fundamental designs and plans. This has led to a lack of foundations and criteria for achieving a life that provides for the need for sufficient housing in urban planning. In this study, we attempted to use cutting-edge innovations and technology to manage and resolve issues while collaborating with planning cadres that have the potential to significantly and favorably impact urban development. This helps to enhance the function of management and the effectiveness of urban planning and management. To fulfill the needs of the community and the neighborhoods of these cities, modern approaches and technologies are used that address the criteria of sustainability and development. To put the notion of urban sustainability and development into action, the topic has been researched using global experiences. Keywords: application, modern, technologies, urban, development
Procedia PDF Downloads 110
2972 The Sustainable Design Approaches of Vernacular Architecture in Anatolia
Authors: Mine Tanaç Zeren
Abstract:
The traditional architectural style, or vernacular architecture, can be considered modern and permanent in terms of reflecting the community's lifestyle, a reasonable interpretation of material and structure, and the integrity of the relationship between the building and its environment. When vernacular architecture is examined, it is seen that sustainable building design approaches were achieved from the very beginning by adapting to climate conditions. The aim of the sustainable design approach is to adapt to the characteristics of the topography of the land and to the climatic conditions while minimizing energy use through the building materials and structural elements. The traditional Turkish house, as one of the representatives of traditional and vernacular architecture in Anatolia, has a sustainable building design approach as well, which can be read from the space organization, the section, the volume, and the building components and details. The only effective factor that human beings cannot change, and to which they have to adapt their constructions and settlements, is climate. The vernacular settlements of Anatolia, the "traditional Turkish houses," are generally formed as concentric settlements in desert conditions and climates, or as separate formations arranged according to the wind and the sun in moist areas, and thereby fulfill the sustainable building design criteria. This paper aims to put forward the sustainable building design approaches of vernacular architecture in Anatolia. There are four main climatic conditions in Anatolia, depending on the regional differentiation. Taking these different climatic and topographic conditions into account, it can be seen that vernacular housing features take shape and differentiate from each other according to the changing conditions. What differentiates them is the space organization, the design of the shelter of the building, and the materials and structural systems used. In this paper, the sustainable building design approaches of Anatolian vernacular architecture are examined through these four different vernacular settlements, located in the Aegean, Marmara, Black Sea, and Eastern regions. These differentiated features, and how they differentiate in order to maintain the sustainability criteria, are the main discussion of the paper. The methodology of this paper briefly defines these differentiations and the sustainable design criteria. The sustainable design approaches and the differentiated items are read through the design criteria of the shelter of the building and the material selection criteria according to climatic conditions. The methods of preventing energy loss are also examined. At the end of this research, it is seen that the houses located in different parts of Anatolia, in order to adapt to the environment and maintain sustainability under their climate and topographic conditions, differ from each other in terms of space organization, structural system, material use, and the design of the shelter of the building. Keywords: sustainability of vernacular architecture, sustainable design criteria of traditional Turkish houses, Turkish houses, vernacular architecture
Procedia PDF Downloads 98
2971 Low Cost Webcam Camera and GNSS Integration for Updating Home Data Using AI Principles
Authors: Mohkammad Nur Cahyadi, Hepi Hapsari Handayani, Agus Budi Raharjo, Ronny Mardianto, Daud Wahyu Imani, Arizal Bawazir, Luki Adi Triawan
Abstract:
PDAM (the local water company) determines customer charges by considering the customer's building or house. The determination of charges significantly affects PDAM income and customer costs because PDAM applies a subsidy policy for customers classified as small households. Periodic updates are needed so that pricing stays in line with the target. A thorough customer survey in Surabaya is needed to update customer building data. However, the surveys carried out so far have relied on deploying officers to survey each PDAM customer one by one. Surveys with this method require a lot of effort and cost. For this reason, this research offers a technology called mobile mapping, a mapping method that is more efficient in terms of time and cost. The use of this tool is also quite simple: the device is installed on a car so that it can record the surrounding buildings while the car is moving. Mobile mapping technology generally uses lidar sensors combined with GNSS, but this technology is expensive. To overcome this problem, this research develops a low-cost mobile mapping technology using webcam camera sensors added to GNSS and IMU sensors. The camera used has a 3 MP sensor with 720p resolution and a 78° diagonal field of view. The principle of this invention is to integrate four webcam camera sensors with GNSS and an IMU to acquire photo data tagged with location data (latitude, longitude) and orientation data (roll, pitch, yaw). The device is also equipped with a tripod and a vacuum mount to attach it to the car's roof so it does not fall off while driving. The output data from this technology are analyzed with artificial intelligence to reduce similar data (cosine similarity) and then classify building types. Data reduction is used to eliminate similar data and retain the images that display the complete house so that they can be processed for the later classification of buildings. The AI method used is transfer learning utilizing the pre-trained VGG-16 model. From the analysis of the similarity data, it was found that the data reduction reached 50%. Georeferencing is then done using the Google Maps API to obtain address information according to the coordinates in the data. After that, a geographic join is performed to link the survey data with the customer data already held by PDAM Surya Sembada Surabaya. Keywords: mobile mapping, GNSS, IMU, similarity, classification
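The cosine-similarity data reduction described above can be pictured as comparing VGG-16 feature vectors of consecutive frames and dropping near-duplicates. A rough sketch follows; the 0.9 threshold and the 224x224 preprocessing are illustrative assumptions, not the study's exact settings.

```python
# Rough sketch of near-duplicate image reduction using VGG-16 features and cosine similarity.
# The 0.9 threshold and 224x224 preprocessing are illustrative assumptions.
import numpy as np
from PIL import Image
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input

model = VGG16(weights="imagenet", include_top=False, pooling="avg")  # 512-d feature vector

def feature(path):
    img = Image.open(path).convert("RGB").resize((224, 224))
    x = preprocess_input(np.asarray(img, dtype=np.float32)[np.newaxis])
    return model.predict(x, verbose=0)[0]

def reduce_similar(paths, threshold=0.9):
    kept, kept_feats = [], []
    for p in paths:
        f = feature(p)
        sims = [f @ g / (np.linalg.norm(f) * np.linalg.norm(g)) for g in kept_feats]
        if not sims or max(sims) < threshold:   # keep only sufficiently distinct frames
            kept.append(p)
            kept_feats.append(f)
    return kept
```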
Procedia PDF Downloads 84
2970 The Eye Tracking Technique and the Study of Some Abstract Mathematical Concepts at the University
Authors: Tamara Díaz-Chang, Elizabeth-H Arredondo
Abstract:
This article presents the results of mixed-methods research in which the ocular movements of students are examined while they solve questionnaires related to some abstract mathematical concepts. The objective of this research is to determine possible correlations between the parameters of ocular activity and the level of difficulty of the tasks. The difficulty level categories were established based on two types of criteria: a subjective one, through an evaluation carried out by the subjects, and a behavioral one, related to obtaining the correct solution. Correlations of these criteria with the ocular activity parameters, which were considered indicators of mental effort, were identified. The analysis of the data obtained allowed us to observe discrepancies between the categorization of difficulty levels based on the subjective and the behavioral criteria. There was a negative correlation of the eye movement parameters with the students' opinions on the level of difficulty of the questions, while a strong, significant positive correlation was noted between most of the parameters of ocular activity and the level of difficulty determined by the percentage of correct answers. The results suggest that eye movement parameters can be taken as indicators of the difficulty level of tasks related to the study of some abstract mathematical concepts at the university. Keywords: abstract mathematical concepts, cognitive neuroscience, eye-tracking, university education
Procedia PDF Downloads 120
2969 CT Medical Images Denoising Based on New Wavelet Thresholding Compared with Curvelet and Contourlet
Authors: Amir Moslemi, Amir movafeghi, Shahab Moradi
Abstract:
One of the most important challenging factors in medical images is noise. Image denoising refers to the improvement of a digital medical image that has been corrupted by additive white Gaussian noise (AWGN). A digital medical image or video can be affected by different types of noise: impulse noise, Poisson noise, and AWGN. Computed tomography (CT) images suffer from low quality due to noise. The quality of CT images depends directly on the absorbed dose to patients (ADP), in such a way that an increase in absorbed radiation enhances the CT image quality. Accordingly, noise reduction techniques that enhance image quality without exposing patients to excess radiation are one of the challenging problems in CT image processing. In this work, noise reduction in CT images was performed using two different directional two-dimensional (2D) transforms, i.e., the curvelet and the contourlet, and the discrete wavelet transform (DWT) thresholding methods BayesShrink and AdaptShrink, which were compared with each other. We also propose a new threshold in the wavelet domain for not only noise reduction but also edge retention; consequently, the proposed method retains the modified coefficients significantly, which results in good visual quality. Data evaluations were accomplished using two criteria, namely the peak signal-to-noise ratio (PSNR) and structural similarity (SSIM). Keywords: computed tomography (CT), noise reduction, curve-let, contour-let, peak signal-to-noise ratio (PSNR), structure similarity (Ssim), absorbed dose to patient (ADP)
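The BayesShrink baseline mentioned above applies a subband-adaptive soft threshold of sigma_n^2 / sigma_x, with the noise level estimated from the finest diagonal subband. A compact sketch with PyWavelets is given below; the wavelet and decomposition level are illustrative, and the paper's proposed threshold differs from this baseline.

```python
# Compact BayesShrink-style wavelet denoising sketch (subband-adaptive soft thresholding).
# The wavelet and level are illustrative; the paper's proposed threshold is different.
import numpy as np
import pywt

def bayes_shrink(img, wavelet="db8", level=3):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # robust noise estimate from the finest diagonal subband
    sigma_n = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    out = [coeffs[0]]
    for detail in coeffs[1:]:
        new_detail = []
        for band in detail:
            sigma_y2 = np.mean(band ** 2)
            sigma_x = np.sqrt(max(sigma_y2 - sigma_n ** 2, 1e-12))
            thr = sigma_n ** 2 / sigma_x              # BayesShrink threshold
            new_detail.append(pywt.threshold(band, thr, mode="soft"))
        out.append(tuple(new_detail))
    return pywt.waverec2(out, wavelet)
```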
Procedia PDF Downloads 440
2968 Can Exams Be Shortened? Using a New Empirical Approach to Test in Finance Courses
Authors: Eric S. Lee, Connie Bygrave, Jordan Mahar, Naina Garg, Suzanne Cottreau
Abstract:
Marking exams is universally detested by lecturers. Final exams in many higher education courses often last 3.0 hrs. Do exams really need to be so long? Can we justifiably reduce the number of questions on them? Surprisingly few have researched these questions, arguably because of the complexity and difficulty of using traditional methods. To answer these questions empirically, we used a new approach based on three key elements: Use of an unusual variation of a true experimental design, equivalence hypothesis testing, and an expanded set of six psychometric criteria to be met by any shortened exam if it is to replace a current 3.0-hr exam (reliability, validity, justifiability, number of exam questions, correspondence, and equivalence). We compared student performance on each official 3.0-hr exam with that on five shortened exams having proportionately fewer questions (2.5, 2.0, 1.5, 1.0, and 0.5 hours) in a series of four experiments conducted in two classes in each of two finance courses (224 students in total). We found strong evidence that, in these courses, shortening of final exams to 2.0 hrs was warranted on all six psychometric criteria. Shortening these exams by one hour should result in a substantial one-third reduction in lecturer time and effort spent marking, lower student stress, and more time for students to prepare for other exams. Our approach provides a relatively simple, easy-to-use methodology that lecturers can use to examine the effect of shortening their own exams.Keywords: exam length, psychometric criteria, synthetic experimental designs, test length
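The equivalence-testing element mentioned above amounts to two one-sided t-tests (TOST) against a pre-set equivalence margin. A generic sketch follows; the margin and the pooled-variance choice are illustrative, not the authors' exact specification.

```python
# Generic two one-sided t-tests (TOST) for equivalence of two independent means.
# The equivalence margin and pooled-variance assumption are illustrative choices.
import numpy as np
from scipy import stats

def tost_equivalence(x, y, margin):
    """Declare equivalence if the difference in means lies within +/- margin."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = len(x), len(y)
    diff = x.mean() - y.mean()
    sp2 = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    se = np.sqrt(sp2 * (1 / nx + 1 / ny))
    df = nx + ny - 2
    # test 1: H0 diff <= -margin vs H1 diff > -margin
    p_lower = 1 - stats.t.cdf((diff + margin) / se, df)
    # test 2: H0 diff >= +margin vs H1 diff < +margin
    p_upper = stats.t.cdf((diff - margin) / se, df)
    return diff, max(p_lower, p_upper)   # equivalence if this p-value < alpha

# e.g., scores on the 3.0-hr exam vs a shortened exam, margin of 5 points (hypothetical)
diff, p = tost_equivalence([72, 65, 80, 77, 68], [70, 66, 79, 75, 71], margin=5.0)
```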
Procedia PDF Downloads 272
2967 Integrating of Multi-Criteria Decision Making and Spatial Data Warehouse in Geographic Information System
Authors: Zohra Mekranfar, Ahmed Saidi, Abdellah Mebrek
Abstract:
This work aims to develop multi-criteria decision-making (MCDM) and spatial data warehouse (SDW) methods that are integrated into a GIS according to a 'GIS dominant' approach, with the GIS operating tools used to operate the SDW. MCDM methods can provide many solutions to a set of problems with various and multiple criteria. When the problem is complex and integrates the spatial dimension, it makes sense to combine the MCDM process with other approaches such as data mining and ascending analyses. We present in this paper an experiment showing a geo-decisional methodology for SDW construction. On-line analytical processing (OLAP) technology, which combines basic multidimensional analysis with the concepts of data mining, provides powerful tools to highlight inductions and information not obvious with traditional tools. However, these OLAP tools become more complex in the presence of the spatial dimension. The integration of OLAP with a GIS is the future geographic and spatial information solution. GIS offers advanced functions for the acquisition, storage, analysis, and display of geographic information. However, its effectiveness for complex spatial analysis is questionable due to its determinism and decisional rigor. A prerequisite for the implementation of any analysis or exploration of spatial data is the construction and structuring of a spatial data warehouse (SDW). This SDW must be easily usable by the GIS and by the tools offered by an OLAP system. Keywords: data warehouse, GIS, MCDM, SOLAP
Procedia PDF Downloads 177
2966 Effect of Low Level Laser Therapy versus Polarized Light Therapy on Oral Mucositis in Cancer Patients Receiving Chemotherapy
Authors: Andrew Anis Fakhrey Mosaad
Abstract:
The goal of this study is to compare the efficacy of polarized light therapy and low-level laser therapy in treating oral mucositis brought on by chemotherapy in cancer patients. The evaluation procedures were measurement of the WHO oral mucositis scale and the common toxicity criteria scale. Methods: Forty cancer patients (men and women) aged 30 to 55 years who had oral mucositis, ulceration, and discomfort after receiving chemotherapy were separated into two groups. Twenty patients in group A received low-level laser therapy (LLLT) along with their regular oral mucositis medication, while twenty patients in group B received Bioptron light therapy (BLT) along with their regular oral mucositis medication. Both treatments were applied for 10 minutes each day for 30 days. Conclusion and results: This study showed that both BLT and LLLT greatly improved oral mucositis in cancer patients following chemotherapy, as seen by the sharp falls in both the WHO oral mucositis scale (OMS) and the common toxicity criteria scale (CTCS); however, low-level laser therapy (LLLT) was superior to Bioptron light therapy (BLT) in terms of benefits. Keywords: Bioptron light therapy, low level laser therapy, oral mucositis, WHO oral mucositis scale, common toxicity criteria scale
Procedia PDF Downloads 246
2965 Locating the Best Place for Earthquake Refugee Camps by OpenSource Software: A Case Study for Tehran, Iran
Authors: Reyhaneh Saeedi
Abstract:
Iran is one of the regions most prone to earthquakes, and it suffers a large number of fatalities and financial losses every year. Every year around the world, a large number of people lose their homes and lives due to natural disasters such as earthquakes. It is necessary to specify suitable places for settling homeless people before an earthquake occurs; this is one of the most important factors in crisis planning and management. Some natural disasters can be modeled and visualized with a geospatial information system (GIS). By using GIS, it is possible to manage spatial data and reach several goals by making use of the analyses available in it. GIS has a decisive role in disaster management because it can determine the best places for temporary resettlement after such a disaster. In this research, the QuantumGIS (QGIS) software is used; it is open-source, so its code is easy to access, and it is also free. In this system, the AHP method is used as the decision model, and the best places for temporary resettlement are located based on the criteria of the related organizations, with their weights and buffers. In this research, buffer layers are created for the criteria and converted to raster layers. The raster layers are then multiplied by the desired weights, and the results are added together. Eventually, suitable places for resettling victims according to the desired criteria are shown in different colors, with their suitability ratings, on the QuantumGIS platform. Keywords: disaster management, temporary resettlement, earthquake, QuantumGIS
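The weighted-overlay step described above (multiply each criterion raster by its weight, then sum) is, in essence, the computation sketched below; the array names, grid size, and weights are placeholders for the rasterized criterion layers produced in QGIS.

```python
# Generic weighted-overlay sketch: combine rasterized criterion layers with their weights.
# `layers` stands in for the buffered/rasterized criteria exported from QGIS; values are placeholders.
import numpy as np

def weighted_overlay(layers, weights):
    """layers: list of 2-D arrays scored on a common suitability scale (e.g., 0-1);
    weights: criterion weights (e.g., from AHP) that are normalized to sum to 1.
    Returns the composite suitability raster."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    stack = np.stack(layers, axis=0)
    return np.tensordot(weights, stack, axes=1)   # weighted sum over the criteria axis

# illustrative 3-criteria example on a tiny 100x100 grid
rng = np.random.default_rng(0)
layers = [rng.random((100, 100)) for _ in range(3)]
suitability = weighted_overlay(layers, weights=[0.5, 0.3, 0.2])
```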
Procedia PDF Downloads 396
2964 Research of Strong-Column-Weak-Beam Criteria of Reinforced Concrete Frames Subjected to Biaxial Seismic Excitation
Authors: Chong Zhang, Mu-Xuan Tao
Abstract:
In several earthquakes, numerous reinforced concrete (RC) frames subjected to seismic excitation demonstrated a collapse pattern characterized by column hinges, though designed according to the Strong-Column-Weak-Beam (S-C-W-B) criteria. The effect of biaxial seismic excitation on the disparity between design and actual performance is carefully investigated in this article. First, a modified load contour method is proposed to derive a closed-form equation of biaxial bending moment strength, which is verified by numerical and experimental tests. Afterwards, a group of time history analyses of a simple frame modeled by fiber beam-column elements subjected to biaxial seismic excitation are conducted to verify that the current S-C-W-B criteria are not adequate to prevent the occurrence of column hinges. A biaxial over-strength factor is developed based on the proposed equation, and the reinforcement of columns is appropriately amplified with this factor to prevent the occurrence of column hinges under biaxial excitation, which is proved to be effective by another group of time history analyses.Keywords: biaxial bending moment capacity, biaxial seismic excitation, fiber beam model, load contour method, strong-column-weak-beam
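For context, the classical (Bresler) load contour form that such closed-form biaxial strength equations build on can be written as follows; the exponent and the uniaxial capacities are the quantities a modified load contour method recalibrates, and the paper's specific closed-form expression is not reproduced here.

```latex
% Classical Bresler load contour for biaxial bending at a given axial load P:
%   M_x, M_y        = acting moments about the two principal axes
%   M_{ux}, M_{uy}  = uniaxial moment capacities at the same axial load
%   \alpha          = contour exponent (calibrated; the modified method adjusts this form)
\left(\frac{M_x}{M_{ux}}\right)^{\alpha} + \left(\frac{M_y}{M_{uy}}\right)^{\alpha} = 1
```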
Procedia PDF Downloads 99
2963 Public Bus Transport Passenger Safety Evaluations in Ghana: A Phenomenological Constructivist Exploration
Authors: Enoch F. Sam, Kris Brijs, Stijn Daniels, Tom Brijs, Geert Wets
Abstract:
Notwithstanding the growing body of literature that recognises the importance of personal safety to public transport (PT) users, it remains unclear what PT users consider regarding their safety. In this study, we explore the criteria PT users in Ghana use to assess bus safety. This knowledge will afford a better understanding of PT users’ risk perceptions and assessments which may contribute to theoretical models of PT risk perceptions. We utilised phenomenological research methodology, with data drawn from 61 purposively sampled participants. Data collection (through focus group discussions and in-depth interviews) and analyses were done concurrently to the point of saturation. Our inductive data coding and analyses through the constant comparison and content analytic techniques resulted in 4 code categories (conceptual dimensions), 27 codes (safety items/criteria), and 100 quotations (data segments). Of the number of safety criteria participants use to assess bus safety, vehicle condition, driver’s marital status, and transport operator’s safety records were the most considered. With each criterion, participants rightly demonstrated its respective relevance to bus safety. These findings imply that investment in and maintenance of safer vehicles, and responsible and safety-conscious drivers, and prioritization of passengers’ safety are key-targets for public bus/minibus operators in Ghana.Keywords: safety evaluations, public bus/minibus, passengers, phenomenology, Ghana
Procedia PDF Downloads 337
2962 Organotin (IV) Based Complexes as Promiscuous Antibacterials: Synthesis in vitro, in Silico Pharmacokinetic, and Docking Studies
Authors: Wajid Rehman, Sirajul Haq, Bakhtiar Muhammad, Syed Fahad Hassan, Amin Badshah, Muhammad Waseem, Fazal Rahim, Obaid-Ur-Rahman Abid, Farzana Latif Ansari, Umer Rashid
Abstract:
Five novel triorganotin(IV) compounds have been synthesized and characterized. The tin atom is penta-coordinated and assumes a trigonal-bipyramidal geometry. Using in silico derived parameters, the objective of our study is to design and synthesize promiscuous antibacterials potent enough to combat resistance. Among the various synthesized organotin(IV) complexes, compound 5 was found to be a potent antibacterial agent against various bacterial strains. Further lead optimization of drug-like properties was evaluated through in silico predictions. Data mining and computational analysis were utilized to examine the compound promiscuity phenomenon in order to avoid drug attrition in designing antibacterials. Xanthine oxidase and human glucose-6-phosphatase were found to be the only true positive off-target hits by the ChEMBL database and other tools utilizing the similarity ensemble approach. Propensity towards the a-3 receptor, human macrophage migration factor, and thiazolidinedione were found to be false positive off-targets, with E-values > 10^-4 for compounds 1, 3, and 4. Furthermore, the positive drug-drug interaction of compound 1 as a uricosuric was validated by all databases and by docked protein targets with sequence similarity and compositional matrix alignment via the BLAST software. The promiscuity of compound 5 was further confirmed by in silico binding to different antibacterial targets. Keywords: antibacterial activity, drug promiscuity, ADMET prediction, metallo-pharmaceutical, antimicrobial resistance
Procedia PDF Downloads 503