Search results for: web accessible interface
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2031


1551 Investigation of Interlayer Shear Effects in Asphalt Overlay on Existing Rigid Airfield Pavement Using Digital Image Correlation

Authors: Yuechao Lei, Lei Zhang

Abstract:

The interface shear between an asphalt overlay and the existing rigid airport pavement occurs due to differences in the mechanical properties of the materials subjected to aircraft loading. Interlayer contact directly influences the mechanical characteristics of the asphalt overlay. However, accurately obtaining the effective interlayer relative displacement with the existing displacement sensors of the loading apparatus remains challenging. This study aims to use digital image correlation technology to improve the accuracy of interfacial contact parameters by obtaining effective interlayer relative displacements. Composite structure specimens were prepared, and fixtures for interlayer shear tests were designed and fabricated. A digital image recognition scheme for the required markers was then designed and optimized. Effective interlayer relative displacement values were obtained through image recognition and calculation of surface markers on the specimens. Finite element simulations validated the mechanical response of the composite specimens under interlayer shearing. Results indicated that an optimized marking approach, using wall mending agent for surface application together with color coding, enhanced the image recognition quality of the marking points on the specimen surface. Further image extraction provided effective interlayer relative displacement values during interlayer shear, thereby improving the accuracy of the interface contact parameters. For composite structure specimens using Styrene-Butadiene-Styrene (SBS) modified asphalt as the tack coat, the maximum interlayer shear stress strength was 0.6 MPa and the fracture energy was 2917 J/m². This research provides valuable insights for investigating the impact of interlayer contact in composite pavement structures on the mechanical characteristics of asphalt overlays.

Keywords: interlayer contact, effective relative displacement, digital image correlation technology, composite pavement structure, asphalt overlay

Procedia PDF Downloads 27
1550 Development of an Automatic Control System for ex vivo Heart Perfusion

Authors: Pengzhou Lu, Liming Xin, Payam Tavakoli, Zhonghua Lin, Roberto V. P. Ribeiro, Mitesh V. Badiwala

Abstract:

Ex vivo Heart Perfusion (EVHP) has been developed as an alternative strategy to expand cardiac donation by enabling resuscitation and functional assessment of hearts donated from marginal donors, which were previously not accepted. EVHP parameters such as perfusion flow (PF) and perfusion pressure (PP) are crucial for optimal organ preservation. However, with the heart's constant physiological changes during EVHP, such as variations in coronary vascular resistance, manual control of these parameters is imprecise and cumbersome for the operator. Additionally, low control precision and long adjustment times may lead to irreversible damage to the myocardial tissue. To solve this problem, an automatic heart perfusion system was developed by applying a Human-Machine Interface (HMI) and a Programmable Logic Controller (PLC)-based circuit to control PF and PP. The PLC-based control system collects PF and PP data through flow probes and pressure transducers. It has two control modes: the RPM-flow mode and the pressure mode. The RPM-flow control mode is an open-loop system: it governs PF by providing and maintaining the pump speed entered through the HMI at the centrifugal pump, with a maximum error of 20 rpm. The pressure control mode is a closed-loop system in which the operator selects a target Mean Arterial Pressure (MAP) to control PP. Its inputs are the target MAP, received through the HMI, and the real MAP, received from the pressure transducer. A PID algorithm is applied to maintain the real MAP at the target value with a maximum error of 1 mmHg. The precision and control speed of the RPM-flow control mode were examined by comparing the PLC-based system to an experienced operator (EO) across seven RPM adjustment ranges (500, 1000, 2000 and random RPM changes; 8 trials per range) tested in a random order. The PID algorithm's performance in pressure control was assessed during 10 EVHP experiments using porcine hearts.
Precision was examined by monitoring the steady-state pressure error throughout the perfusion period, and stabilizing speed was tested by performing two MAP adjustment changes (4 trials per change) of 15 and 20 mmHg. A total of 56 trials were performed to validate the RPM-flow control mode. Overall, the PLC-based system was significantly faster than the EO in all trials (PLC 1.21±0.03, EO 3.69±0.23 seconds; p < 0.001) and reached the desired RPM with greater precision (PLC 10±0.7, EO 33±2.7 mean RPM error; p < 0.001). Regarding pressure control, the PLC-based system achieved a median precision of ±1 mmHg error, and its median stabilizing times for MAP changes of 15 and 20 mmHg were 15 and 19.5 seconds, respectively. The novel PLC-based control system was 3 times faster, with 60% less error, than the EO for RPM-flow control. In pressure control mode, it demonstrated high precision and a fast stabilizing speed. In summary, this novel system successfully controlled perfusion flow and pressure with high precision, stability and a fast response time through a user-friendly interface. This design may provide a viable technique for the future development of novel heart preservation and assessment strategies during EVHP.
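The closed-loop pressure mode described above is a standard PID loop. The sketch below is a minimal illustration of that idea only: the plant model (MAP responding to pump speed as a first-order system) and every gain value are made-up assumptions, not parameters from the study.

```python
def pid_step(state, target, measured, kp, ki, kd, dt):
    """One discrete PID update; state = (error integral, previous error)."""
    integral, prev_err = state
    err = target - measured
    integral += err * dt
    deriv = (err - prev_err) / dt
    output = kp * err + ki * integral + kd * deriv
    return output, (integral, err)

def simulate(target_map=60.0, steps=400, dt=0.1):
    """Toy loop: MAP follows pump speed as a first-order system
    (gain 0.05 mmHg per rpm, time constant 2 s) - illustrative numbers."""
    map_now, state = 40.0, (0.0, 0.0)
    for _ in range(steps):
        rpm_cmd, state = pid_step(state, target_map, map_now,
                                  kp=40.0, ki=10.0, kd=1.0, dt=dt)
        rpm = max(0.0, rpm_cmd)  # pump speed cannot be negative
        map_now += dt * (0.05 * rpm - map_now) / 2.0
    return map_now
```

With these toy values the loop settles the simulated MAP well within 1 mmHg of the 60 mmHg target, mirroring the error bound reported in the abstract.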

Keywords: automatic control system, biomedical engineering, ex-vivo heart perfusion, human-machine interface, programmable logic controller

Procedia PDF Downloads 146
1549 The Boundary Element Method in Excel for Teaching Vector Calculus and Simulation

Authors: Stephen Kirkup

Abstract:

This paper discusses the implementation of the boundary element method (BEM) on an Excel spreadsheet and how it can be used in teaching vector calculus and simulation. There are two separate spreadsheets, within which Laplace's equation is solved by the BEM in two dimensions (LIBEM2) and in axisymmetric three dimensions (LBEMA). The main algorithms are implemented in the programming language associated with Excel, Visual Basic for Applications (VBA). The BEM only requires a boundary mesh and hence is a relatively accessible method. The BEM in the open spreadsheet environment is demonstrated to be useful as an aid to teaching and learning. The application of the BEM implemented on a spreadsheet for educational purposes in introductory vector calculus and simulation is explored. The development of assignment work is discussed, and sample results from student work are given. The spreadsheets were found to be useful tools in developing the students’ understanding of vector calculus and in simulating heat conduction.

Keywords: boundary element method, Laplace’s equation, vector calculus, simulation, education

Procedia PDF Downloads 138
1548 Natural Language Processing: The Future of Clinical Record Management

Authors: Khaled M. Alhawiti

Abstract:

This paper investigates the future of medicine and the use of Natural Language Processing (NLP). The importance of having correct clinical information available online is remarkable; improving patient care at affordable cost could be achieved by using automated applications that exploit this online clinical information. The major challenge in retrieving such vital information is having it appropriately coded. The majority of online patient reports are not coded and are not accessible, as they are recorded as natural language text. NLP provides a feasible solution by retrieving and organizing clinical information available as text and transforming it into clinical data that is ready for use. NLP systems are rather complex to construct, as they require considerable knowledge; however, significant progress has been made. Newly developed NLP systems have been tested and have demonstrated promising performance, and they are considered practical clinical applications.

Keywords: clinical information, information retrieval, natural language processing, automated applications

Procedia PDF Downloads 381
1547 Heterogeneous Dimensional Super Resolution of 3D CT Scans Using Transformers

Authors: Helen Zhang

Abstract:

Accurate segmentation of the airways from CT scans is crucial for the early diagnosis of lung cancer. However, existing airway segmentation algorithms often rely on thin-slice CT scans, which can be inconvenient and costly. This paper presents a set of machine learning-based 3D super-resolution algorithms along heterogeneous dimensions that improve the resolution of thicker CT scans, reducing the reliance on thin-slice scans. To evaluate the efficacy of the super-resolution algorithms, quantitative assessments using PSNR (Peak Signal-to-Noise Ratio) and SSIM (Structural SIMilarity index) were performed. The impact of super-resolution on airway segmentation accuracy is also studied. The proposed approach has the potential to make airway segmentation more accessible and affordable, thereby facilitating early diagnosis and treatment of lung cancer.
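The evaluation metrics named above are standard image-quality measures. As a hedged illustration (not the authors' code), PSNR and a simplified single-window SSIM can be computed from flattened image intensities as follows; real SSIM slides a local window over the image, which is omitted here for brevity.

```python
import math

def psnr(ref, test, max_val=255.0):
    """Peak Signal-to-Noise Ratio between two equal-size images (flat lists)."""
    mse = sum((a - b) ** 2 for a, b in zip(ref, test)) / len(ref)
    return float("inf") if mse == 0 else 10.0 * math.log10(max_val ** 2 / mse)

def ssim_global(x, y, max_val=255.0):
    """Simplified SSIM using global statistics (no sliding window)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) * (a - mx) for a in x) / n
    vy = sum((b - my) * (b - my) for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    c1, c2 = (0.01 * max_val) ** 2, (0.03 * max_val) ** 2
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))
```

For identical images SSIM is 1 and PSNR is infinite; a uniform one-gray-level error (MSE = 1) gives PSNR of about 48.13 dB at 8-bit depth.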

Keywords: 3D super-resolution, airway segmentation, thin-slice CT scans, machine learning

Procedia PDF Downloads 82
1546 An Analytical Study of the Quality of Educational Administration and Management at Secondary School Level in Punjab, Pakistan

Authors: Shamim Akhtar

Abstract:

The purpose of the present research was to analyse the performance of district administrators and school head teachers at the secondary school level. The sample of the study consisted of head teachers and teachers of secondary schools. Three scales were used in the survey: a five-point scale for head teachers to analyse the working efficiency of educational administrators, a seven-point scale for head teachers to analyse their own performance, and a similar seven-point rating scale for teachers to analyse the working performance of their head teachers. The head teachers’ responses revealed that the performance of their district educational administrators was average. For the performance efficiency of the head teachers, the researcher constructed the rating scales on seven parameters of management, namely academic management, personnel management, financial management, infrastructure management, linkage and interface, student services, and managerial excellence. Percentages, means, and graphical presentation on the different parameters of management showed an obvious difference between head teachers’ and teachers’ responses: head teachers were probably overestimating their efficiency, while teachers evaluated them as performing averagely on the majority of statements. Results of t-tests showed no significant difference between the responses of rural and urban teachers, but a significant difference between male and female teachers’ responses indicated that female head teachers were performing their responsibilities better than male head teachers in public sector schools.
When the efficiency of the head teachers on the different parameters of management was analysed, it was concluded that their efficiency in academic and personnel management was average, and in financial management and managerial excellence it was well above average, while on the other parameters, such as infrastructure management, linkage and interface, and student services, it was above average on most statements and well above average on some. Hence, there is a need to improve working efficiency in academic management and personnel management.

Keywords: educational administration, educational management, parameters of management, education

Procedia PDF Downloads 312
1545 The Pore-Scale Darcy–Brinkman–Stokes Model for the Description of Advection–Diffusion–Precipitation Using the Level Set Method

Authors: Jiahui You, Kyung Jae Lee

Abstract:

Hydraulic fracturing fluid (HFF) is widely used in shale reservoir production. HFF contains diverse chemical additives, which result in the dissolution and precipitation of minerals through multiple chemical reactions. In this study, a new pore-scale Darcy–Brinkman–Stokes (DBS) model coupled with the Level Set Method (LSM) is developed to address the microscopic phenomena occurring during the iron–HFF interaction, by numerically describing mass transport, chemical reactions, and pore structure evolution. The new model is developed on OpenFOAM, an open-source platform for computational fluid dynamics. Here, the DBS momentum equation is used to solve for velocity while accounting for fluid-solid mass transfer, and an advection-diffusion equation is used to compute the distribution of the injected HFF and iron. The reaction-induced pore evolution is captured by applying the LSM, where the solid-liquid interface is updated by solving the level set distance function and reinitializing it to a signed distance function. A smoothed Heaviside function then gives a smooth solid-liquid interface over a narrow band of fixed thickness. The stated equations are discretized by the finite volume method, while the re-initialization equation is discretized by the central difference method. A Gauss linear upwind scheme is used to solve the level set distance function, and the Pressure-Implicit with Splitting of Operators (PISO) method is used to solve the momentum equation. The numerical result is compared with the 1-D analytical solution of the fluid-solid interface for reaction-diffusion problems. Sensitivity analysis is conducted with various Damkohler numbers (DaII) and Peclet numbers (Pe). We categorize the Fe(III) precipitation into three patterns as a function of DaII and Pe: symmetrical smoothed growth, unsymmetrical growth, and dendritic growth.
Pe and DaII significantly affect the location of precipitation, which is critical in determining the injection parameters of hydraulic fracturing. When DaII < 1, the precipitation occurs uniformly on the solid surface in both the upstream and downstream directions. When DaII > 1, the precipitation occurs mainly on the solid surface in the upstream direction. When Pe > 1, Fe(II) is transported deep into the pores and precipitates inside them. When Pe < 1, the precipitation of Fe(III) occurs mainly on the solid surface in the upstream direction, and it easily precipitates inside small pore structures. The porosity-permeability relationship is subsequently presented. This pore-scale model allows high confidence in the description of Fe(II) dissolution, transport, and Fe(III) precipitation. The model shows fast convergence and requires a low computational load. The results can provide reliable guidance for injecting HFF in shale reservoirs so as to avoid clogging and wellbore pollution. Understanding Fe(III) precipitation and Fe(II) release and transport behaviors gives rise to a highly efficient hydraulic fracturing project.
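The abstract does not give the exact form of its smoothed Heaviside function; a common choice in level-set implementations, sketched here purely for illustration, is the sine-smoothed form over a band of half-width ε around the interface (φ is the signed distance, negative in the solid by convention):

```python
import math

def smoothed_heaviside(phi, eps):
    """Sine-smoothed Heaviside H_eps(phi): 0 in the solid, 1 in the fluid,
    with a smooth transition over the band |phi| <= eps."""
    if phi < -eps:
        return 0.0
    if phi > eps:
        return 1.0
    return 0.5 * (1.0 + phi / eps + math.sin(math.pi * phi / eps) / math.pi)
```

The smooth ramp is what lets the solid-liquid interface occupy a narrow band of fixed thickness on the grid, rather than a sharp jump that the finite volume discretization could not resolve.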

Keywords: reactive transport, shale, kerogen, precipitation

Procedia PDF Downloads 146
1544 Comparative Study of Induction Motor Performance between SMC and NLC Control Modes

Authors: A. Oukaci, R. Toufouti, D. Dib, l. Atarsia

Abstract:

This article presents alternative techniques to vector control, namely nonlinear control and sliding mode control, and the implementation of their control laws applied to a high-performance induction motor, with the objective of improving tracking control, ensuring stability and robustness to parameter variations, and rejecting disturbances. Tests are performed through numerical simulations in the Matlab/Simulink environment; the results demonstrate the efficiency and dynamic performance of the proposed strategy.

Keywords: induction motor (IM), nonlinear control (NLC), sliding mode control (SMC), nonlinear sliding surface

Procedia PDF Downloads 548
1543 Spatial Organization of Cells over the Process of Pellicle Formation by Pseudomonas alkylphenolica KL28

Authors: Kyoung Lee

Abstract:

Numerous aerobic bacteria can form multicellular communities at the air-liquid (A-L) interface, as a biofilm called a pellicle. Pellicles occupying the A-L interface benefit from the utilization of oxygen from the air and nutrients from the liquid. Buoyancy of the cells is provided by the high surface tension at the A-L interface. Formation of pellicles is thus an adaptive advantage for the utilization of excess nutrients in standing culture, where oxygen depletion is easily established due to rapid cell growth. In natural environments, pellicles are commonly observed on the surface of lakes or ponds contaminated with pollutants. Previously, we have shown that, when cultured in standing LB media, the alkylphenol-degrading bacterium Pseudomonas alkylphenolica KL28 forms pellicles 0.3-0.5 mm in diameter with a thickness of ca. 40 µm. The pellicles have unique features, possessing flatness and unusual rigidity. In this study, the biogenesis of the circular pellicles was investigated by observing the organization of cells at the early stages of pellicle formation and the cell arrangements within the pellicle, providing a clue to the highly organized cellular arrangement adapted to the air-liquid niche. We first monitored the developmental pattern of the pellicle from a monolayer to a multicellular organization. Pellicles were shaped by the controlled growth of the constituent cells, which accumulate extracellular polymeric substance. The initial two-dimensional growth transitioned to multilayers under the constraint force of the accumulated self-produced extracellular polymeric substance. Experiments showed that pellicles are formed by clonal growth, even with knock-outs of the genes for flagella and pilus formation. In contrast, mutants in the epm gene cluster for alginate-like polymer biosynthesis were incompetent in the cell alignment required for the initial two-dimensional growth of pellicles.
Electron microscopy and confocal laser scanning microscopy showed that the fully matured structures are highly packed with matrix-encased cells in special arrangements: the cells on the surface of the pellicle lie relatively flat, while those inside are longitudinally cross-packed. HPLC analysis of the exopolysaccharide (EPS) hydrolysate from colonies grown on LB agar showed a composition of L-fucose, L-rhamnose, D-galactosamine, D-glucosamine, D-galactose, D-glucose, and D-mannose. The hydrolysate from pellicles showed a similar neutral and amino sugar profile, but lacked galactose. Furthermore, uronic acid analysis of the EPS hydrolysates by HPLC showed that mannuronic acid was detected in pellicles but not in colonies, indicating that the epm-derived polymer is critical for pellicle formation, as proved with the epm mutants. This study verified that for the circular pellicle architecture, P. alkylphenolica KL28 cells utilize EPS building blocks different from those used for colony construction. These results indicate that P. alkylphenolica KL28 is a clever architect that dictates unique cell arrangements with selected EPS matrix material to construct sophisticated buildings: circular biofilm pellicles.

Keywords: biofilm, matrix, pellicle, pseudomonas

Procedia PDF Downloads 136
1542 Science and Monitoring Underpinning River Restoration: A Case Study

Authors: Geoffrey Gilfillan, Peter Barham, Lisa Smallwood, David Harper

Abstract:

The ‘Welland for People and Wildlife’ project aimed to improve the River Welland’s ecology and water quality, and to make it more accessible to the community of Market Harborough. A joint monitoring project by the Welland Rivers Trust & University of Leicester was incorporated into the design. The techniques that have been used to measure its success are hydrological, geomorphological, and water quality monitoring, species and habitat surveys, and community engagement. Early results show improvements to flow and habitat diversity, water quality and biodiversity of the river environment. Barrier removal has increased stickleback mating activity, and decreased parasitically infected fish in sample catches. The habitats provided by the berms now boast over 25 native plant species, and the river is clearer, cleaner and with better-oxygenated water.

Keywords: community engagement, ecological monitoring, river restoration, water quality

Procedia PDF Downloads 207
1541 Statistical Regression and Open Data Approach for Identifying Economic Indicators That Influence e-Commerce

Authors: Apollinaire Barme, Simon Tamayo, Arthur Gaudron

Abstract:

This paper presents a statistical approach to identify explanatory variables linearly related to e-commerce sales. The proposed methodology allows specifying a regression model that quantifies the relevance between openly available data (economic and demographic) and national e-commerce sales. It consists of collecting data, preselecting input variables, performing regressions to choose variables and models, and testing and validating. The usefulness of the proposed approach is twofold: on the one hand, it identifies the variables that influence e-commerce sales with an accessible approach; on the other, it can be used to model future sales from the input variables. Results show that e-commerce is linearly dependent on 11 economic and demographic indicators.
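As a hedged sketch of the preselection step (the paper's actual indicators, data, and selection criterion are not listed here, so the series names, values, and the correlation cutoff below are all made up), candidate open-data series can be ranked by their Pearson correlation with the sales series before any regression is fitted:

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def preselect(indicators, sales, threshold=0.8):
    """Keep indicator names whose |r| with sales exceeds the threshold."""
    return [name for name, series in indicators.items()
            if abs(pearson(series, sales)) > threshold]

# Illustrative data: one indicator tracks sales, one is unrelated noise.
sales = [10, 12, 15, 21, 26]
indicators = {
    "internet_users": [50, 55, 61, 70, 78],
    "rainfall_mm": [800, 640, 910, 700, 820],
}
```

Only the strongly correlated candidates survive this filter and are passed on to the regression stage for final variable and model selection.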

Keywords: e-commerce, statistical modeling, regression, empirical research

Procedia PDF Downloads 198
1540 Efficient Pre-Processing of Single-Cell Assay for Transposase Accessible Chromatin with High-Throughput Sequencing Data

Authors: Fan Gao, Lior Pachter

Abstract:

The primary tool currently used to pre-process 10X Chromium single-cell ATAC-seq data is Cell Ranger, which can take a very long time to run on standard datasets. To facilitate rapid pre-processing that enables reproducible workflows, we present a suite of tools called scATAK for pre-processing single-cell ATAC-seq data that is 15 to 18 times faster than Cell Ranger on mouse and human samples. Our tool can also calculate chromatin interaction potential matrices and generate open chromatin signal and interaction traces for cell groups. We used scATAK to explore the chromatin regulatory landscape of a healthy adult human brain and unveil cell-type-specific features, and we show that it provides a convenient and computationally efficient approach for pre-processing single-cell ATAC-seq data.

Keywords: single-cell, ATAC-seq, bioinformatics, open chromatin landscape, chromatin interactome

Procedia PDF Downloads 132
1539 Climate Impact-Minimizing Road Infrastructure Layout for Growing Cities

Authors: Stanislovas Buteliauskas, Aušrius Juozapavičius

Abstract:

City road transport contributes significantly to climate change, and the ongoing urbanization of the world is only increasing the problem. The paper describes a city planning concept that minimizes the number of vehicles on the roads while increasing overall mobility. This becomes possible by utilizing a recently invented two-level road junction with the unique property of serving both as an intersection with uninterrupted traffic and as an easily accessible transport hub capable of accumulating private vehicles, therefore becoming an especially effective park-and-ride solution and a logistics or business center. Optimized layouts of city road infrastructure, living and work areas, and major roads are presented. The layouts are suitable both for the development of new cities and for the expansion of existing ones. The costs of the infrastructure and its positive impact on climate are evaluated in comparison to current city growth patterns.

Keywords: congestion, city infrastructure, park-and-ride, road junctions

Procedia PDF Downloads 284
1538 The Power of Words: A Corpus Analysis of Campaign Speeches of President Donald J. Trump

Authors: Aiza Dalman

Abstract:

Words are powerful when they are used wisely and strategically. In this study, twelve (12) campaign speeches of President Donald J. Trump were analyzed as to frequently used words and the ethos, pathos and logos employed. The speeches were read thoroughly, analyzed and interpreted. With the use of the Word Counter Tool and Text Analyzer software accessible online, it was found that the word ‘will’ has the highest frequency of 121, followed by Hillary (58), American (38), going (35), plan and Clinton (32), illegal (30), government (28), corruption (26) and criminal (24). When the speeches were analyzed as to ethos, pathos and logos, on the other hand, it was revealed that all three were employed, with the statements under these categories either directed against Hillary or made in his favor. The unique strategy of President Donald J. Trump as to frequently used words and ethos, pathos and logos in persuading people perhaps led the way to his victory.
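Frequency counts of the kind reported above (e.g. 'will' at 121) can be reproduced on any corpus with a few lines of code; the snippet below is an illustration on made-up sentences, not the study's data or the online tools it used.

```python
import re
from collections import Counter

def top_words(speeches, stopwords=frozenset()):
    """Case-insensitive word frequencies pooled across all speeches."""
    counts = Counter()
    for text in speeches:
        counts.update(w for w in re.findall(r"[a-z']+", text.lower())
                      if w not in stopwords)
    return counts

speeches = ["We will build, and we will win.",
            "America will win again."]
freq = top_words(speeches, stopwords={"we", "and"})
```

`freq.most_common()` then yields the ranked list that a word-counter tool reports, with function words optionally filtered out through the stopword set.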

Keywords: campaign speeches, corpus analysis, ethos, logos and pathos, power of words

Procedia PDF Downloads 253
1537 Multiphase Flow Regime Detection Algorithm for Gas-Liquid Interface Using Ultrasonic Pulse-Echo Technique

Authors: Serkan Solmaz, Jean-Baptiste Gouriet, Nicolas Van de Wyer, Christophe Schram

Abstract:

The efficiency of the cooling process for cryogenic propellants boiling in engine cooling channels in space applications is strongly affected by the phase change that occurs during boiling. The effectiveness of the cooling process pertains strongly to the type of boiling regime, such as nucleate or film boiling. Geometric constraints, such as a non-transparent cooling channel, preclude the use of visualization methods. The ultrasonic (US) technique, as a non-destructive testing (NDT) method, has therefore been applied in almost every engineering field for different purposes. Basically, discontinuities emerge at the boundaries between mediums, such as different phases. The sound wave emitted by the US transducer is both transmitted and reflected at a gas-liquid interface, which makes it possible to detect the different phases. Due to thermal and structural concerns, it is impractical to sustain direct contact between the US transducer and the working fluid. Hence, the transducer must be located outside the cooling channel, which introduces additional interfaces and creates ambiguities regarding the applicability of the present method. In this work, exploratory research was undertaken to determine the detection ability and applicability of the US technique for a cryogenic boiling process in a cooling cycle where the US transducer is placed outside the channel. Boiling of cryogenics is a complex phenomenon whose thermal properties pose several hindrances for an experimental protocol. Thus, substitute materials were purposefully selected based on these parameters to simplify the experiments. Aside from that, the nucleate and film boiling regimes emerging during the boiling process were simply simulated using non-deformable stainless steel balls, air-bubble injection apparatuses and air clearances, instead of conducting a real-time boiling process. A versatile detection algorithm was subsequently developed from these exploratory studies.
With the algorithm developed, the phases can be distinguished with 99% accuracy as no-phase, air-bubble, and air-film presences. The results show the detection ability and applicability of the US technique for exploratory purposes.
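The abstract does not spell out the decision rules of the detection algorithm, so the following is only a minimal sketch of the underlying idea, with illustrative (made-up) thresholds: a gas film reflects almost all of the incident wave at the wall-fluid interface, bubbles reflect part of it, and pure liquid reflects little, so the normalized echo amplitude can be binned into the three reported classes.

```python
def classify_regime(echo_amplitude, bubble_threshold=0.3, film_threshold=0.7):
    """Map a normalized interface-echo amplitude (0..1) to a regime label."""
    if echo_amplitude >= film_threshold:
        return "air-film"      # near-total reflection: gas layer at the wall
    if echo_amplitude >= bubble_threshold:
        return "air-bubble"    # partial reflection: dispersed gas phase
    return "no-phase"          # weak reflection: liquid only
```

A practical implementation would calibrate the thresholds against reference echoes from the substitute materials (steel balls, injected bubbles, air clearances) described above.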

Keywords: ultrasound, ultrasonic, multiphase flow, boiling, cryogenics, detection algorithm

Procedia PDF Downloads 146
1536 Design and Simulation of an Interface Circuit for Piezoresistive Accelerometers with Offset Cancellation Ability

Authors: Mohsen Bagheri, Ahmad Afifi

Abstract:

This paper presents a new method for the readout of piezoresistive accelerometer sensors. The circuit is based on an instrumentation amplifier and is useful for reducing offset in the Wheatstone bridge. The obtained gain is 645 with 1 μV/°C equivalent drift and 1.58 mW power consumption. A Schmitt trigger and a multiplexer circuit control the output node. A high-speed counter is also designed in this work. The proposed circuit is designed and simulated in 0.18 μm CMOS technology with a 1.8 V power supply.
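As a worked illustration of the signal chain (the gain of 645 is taken from the abstract; the supply voltage of 1.8 V and the fractional resistance change are assumptions, and a full bridge with four active arms is assumed for simplicity), the bridge arithmetic is:

```python
def full_bridge_output(v_supply, x):
    """Differential output of a full Wheatstone bridge, x = dR/R.
    With four active arms, V_out = V_supply * x exactly."""
    v_plus = v_supply * (1 + x) / 2.0   # divider of the (R+dR, R-dR) arm
    v_minus = v_supply * (1 - x) / 2.0  # divider of the (R-dR, R+dR) arm
    return v_plus - v_minus

def amplified(v_supply, x, gain=645.0):
    """Bridge signal after the instrumentation amplifier."""
    return gain * full_bridge_output(v_supply, x)
```

For example, a 0.1% resistance change on a 1.8 V supply yields a 1.8 mV bridge signal, which the gain of 645 brings to about 1.16 V, a convenient level for the Schmitt trigger stage.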

Keywords: piezoresistive accelerometer, zero offset, Schmitt trigger, bidirectional reversible counter

Procedia PDF Downloads 274
1535 Adsorption and Desorption Behavior of Ionic and Nonionic Surfactants on Polymer Surfaces

Authors: Giulia Magi Meconi, Nicholas Ballard, José M. Asua, Ronen Zangi

Abstract:

Experimental and computational studies are combined to elucidate the adsorption properties of ionic and nonionic surfactants on hydrophobic polymer surfaces such as poly(styrene). To represent these two types of surfactants, sodium dodecyl sulfate and poly(ethylene glycol)-block-poly(ethylene), commonly utilized in emulsion polymerization, were chosen. By applying quartz crystal microbalance with dissipation monitoring, it was found that, at low surfactant concentrations, it is easier to desorb (as measured by rate) ionic surfactants than nonionic surfactants. From molecular dynamics simulations, the effective attractive force of the nonionic surfactants to the surface increases as their concentration decreases, whereas the ionic surfactant mildly exhibits the opposite trend. The contrasting behavior of ionic and nonionic surfactants critically relies on two observations obtained from the simulations. The first is that there is a large degree of interweaving between head and tail groups in the adsorbed layer formed by the nonionic surfactant (PEO/PE systems). The second is that water molecules penetrate this layer. In the disordered layer these nonionic surfactants generate at the surface, only the oxygens of the head groups present at the interface with the water phase, or the oxygens next to the penetrating waters, can form hydrogen bonds. Oxygens inside this layer lose this favorable energy, by a magnitude that increases with the surfactant density at the interface. This reduced stability of the surfactants diminishes their driving force for adsorption. All of this is shown to be in accordance with experimental results on the dynamics of surfactant desorption. The ionic surfactants assemble into an ordered structure, and their attraction to the surface was even slightly augmented at higher surfactant concentration, in agreement with the experimentally determined adsorption isotherm.
The reason these two types of surfactants behave differently is that the ionic surfactant has a small head group that is strongly hydrophilic, whereas the head groups of the nonionic surfactants are large and only weakly attracted to water.

Keywords: emulsion polymerization process, molecular dynamics simulations, polymer surface, surfactants adsorption

Procedia PDF Downloads 318
1534 Idea, Creativity, Design, and Ultimately, Playing with Mathematics

Authors: Yasaman Azarmjoo

Abstract:

Since ancient times, it has been said that mathematics is the mother of all sciences and the foundation of basic concepts in every field and profession. It would be great if, after learning this subject, we could enable students to create games and activities based on the same mathematical concepts. This article explores the design of various mathematical activities in the form of games, utilizing different mathematical topics such as algebra, equations, binary systems, and one-to-one correspondence. The theoretical significance of this article lies in uncovering alternative approaches to teaching and learning mathematics. By employing creative and interactive methods such as game design, it challenges the traditional perception of mathematics as a difficult and laborious subject and demonstrates that mathematics can be made more accessible and enjoyable, which can result in heightened interest and engagement in the subject. In general, this article reveals another aspect of mathematics.
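One concrete game of the kind described above, built on the binary system, is the classic "magic cards" trick: each card lists the numbers whose binary expansion has a 1 in that card's bit position, and the secret number is recovered by summing the powers of two of the cards that contain it. This is our illustration of the idea, not necessarily one of the article's own activities.

```python
def make_cards(bits=6):
    """Card i lists every number in [1, 2**bits) whose bit i is set."""
    return [[n for n in range(1, 2 ** bits) if n >> i & 1]
            for i in range(bits)]

def guess(chosen_cards):
    """Recover the secret number from the indices of cards that contain it."""
    return sum(2 ** i for i in chosen_cards)
```

For instance, 42 = 101010 in binary, so it appears on cards 1, 3 and 5, and adding 2 + 8 + 32 recovers it; students playing the trick are doing binary decomposition without being told so.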

Keywords: playing with mathematics, algebra and equations, binary systems, one-to-one correspondence

Procedia PDF Downloads 54
1533 Plasma-Induced Modification of Biomolecules: A Tool for Analysis of Protein Structures

Authors: Yuting Wu, Faraz Choudhury, Daniel Benjamin, James Whalin, Joshua Blatz, Leon Shohet, Michael Sussman, Mark Richards

Abstract:

Plasma-Induced Modification of Biomolecules (PLIMB) has been developed as a technology which, together with mass spectrometry, measures three-dimensional structural characteristics of proteins. The technique uses hydroxyl radicals generated by an atmospheric-pressure plasma discharge to react with the solvent-accessible side chains of a protein in aqueous solution. In this work, we investigate the three-dimensional structure of hemoglobin and myoglobin using PLIMB. Additional modifications to these proteins caused by PLIMB, such as oxidation, fragmentation, and conformational changes, are also explored. The results show that PLIMB, coupled with mass spectrometry, is an effective way to determine solvent access to hemoproteins. Furthermore, we show that many factors, including pH and the electrical parameters used to generate the plasma, have a significant influence on solvent accessibility.

Keywords: plasma, hemoglobin, myoglobin, solvent access

Procedia PDF Downloads 162
1532 Context Specific E-Transformation Decision-Making Framework

Authors: A. Hol

Abstract:

Nowadays, within quickly changing business environments, companies are often faced with specific problems where the knowledge required to make timely decisions is available but not always readily accessible to the decision makers in the required form. To identify whether innovative system development could assist companies in making quicker, industry-specific decisions in a given time and space, the researchers conducted an in-depth case study investigation, during which they studied a company's e-transformation recommendations, its current issues and problems, and the nature of its pressing decisions. The study utilizes Scenario Based Analysis with the aim of identifying the parameters crucial for the development of a system that could support decision making in a given time and space. Based on the findings, a Context Specific e-Transformation Decision-Making Framework is proposed.

Keywords: e-transformation, business context, decision making, e-T Guide, ICT

Procedia PDF Downloads 431
1531 Impact of Displacement Durations and Monetary Costs on the Labour Market within a City Consisting of Four Areas: A Theoretical Approach

Authors: Aboulkacem El Mehdi

Abstract:

We develop a theoretical model at the crossroads of labour and urban economics to explain the mechanism through which the duration of home-workplace trips and their monetary costs impact labour demand and supply in a spatially scattered labour market, and how both are affected by changes in passenger transport infrastructure and services. The spatial disconnection between home and job opportunities is referred to as the spatial mismatch hypothesis (SMH). Its harmful impact on employment has been the subject of numerous theoretical propositions. However, the theoretical models proposed so far are patterned around the American context, which is particular in that it is marked by racial discrimination against Blacks in the housing and labour markets. It is therefore only natural that most of these models are designed to reproduce a steady state characterized by agents carrying out their economic activities in a mono-centric city in which most unskilled jobs are created in the suburbs, far from the Blacks who dwell in the city centre, generating high unemployment rates for Blacks, while the White population resides in the suburbs and has a low unemployment rate. Our model does not rely on any racial discrimination and does not aim at reproducing a steady state in which these stylized facts are replicated; it takes the main principle of the SMH (the spatial disconnection between homes and workplaces) as a starting point. One of the innovative aspects of the model is that it deals with an SMH-related issue at an aggregate level: we link the parameters of the passenger transport system to employment in the whole area of a city. We consider a city that consists of four areas: two of them are residential areas with unemployed workers; the other two host firms looking for labour force.
The workers compare the indirect utility of working in each area with the utility of unemployment and choose between submitting an application for the job that generates the highest indirect utility or not submitting. This arbitration takes into account the monetary and time expenditures generated by trips between the residential areas and the working areas. Each of these expenditures is explicitly formulated so that its impact can be studied separately from that of the other. The first findings show that unemployed workers living in an area benefiting from good transport infrastructure and services have a better chance of preferring activity to unemployment and are more likely to supply a higher 'quantity' of labour than those who live in an area where transport infrastructure and services are poorer. We also show that firms located in the most accessible area receive many more applications and are more likely to hire the workers who provide the highest quantity of labour than firms located in the less accessible area. Currently, we are working on the matching process between firms and job seekers and on how equilibrium between labour demand and supply occurs.
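The worker's arbitration described above can be sketched with a hypothetical functional form (linear utility and a constant value of time; none of the parameters or numbers below are from the model itself):

```python
# Hedged sketch of the worker's arbitration: the indirect utility of working
# in each area nets out the monetary cost and the time cost of the commute;
# the worker applies only where this beats the utility of unemployment.
# All functional forms and figures are illustrative assumptions.
def indirect_utility(wage, trip_cost, trip_minutes, value_of_time_per_min):
    return wage - trip_cost - value_of_time_per_min * trip_minutes

def choose_area(offers, unemployment_utility, value_of_time_per_min=0.3):
    """offers: {area: (wage, trip_cost, trip_minutes)}.
    Returns the chosen working area, or None if unemployment is preferred."""
    utilities = {area: indirect_utility(w, c, t, value_of_time_per_min)
                 for area, (w, c, t) in offers.items()}
    best = max(utilities, key=utilities.get)
    return best if utilities[best] > unemployment_utility else None
```

Under this toy specification, improving transport to one working area (lower trip cost or duration) raises its indirect utility and so both the propensity to apply and the share of applications that area's firms receive, mirroring the findings above.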

Keywords: labour market, passenger transport infrastructure, spatial mismatch hypothesis, urban economics

Procedia PDF Downloads 262
1530 Calm, Confusing and Chaotic: Investigating Humanness through Sentiment Analysis of Abstract Artworks

Authors: Enya Autumn Trenholm-Jensen, Hjalte Hviid Mikkelsen

Abstract:

This study was conducted to nuance the discussion surrounding what it means to be human in a time of unparalleled technological development. Subjectivity was deemed an accessible facet of humanity to study, and art a fitting medium through which to probe subjectivity. Upon careful theoretical consideration, abstract art was found to fit the parameters of the study, with the added bonus of being, as yet, uninterpretable from an AI perspective. It was hypothesised that dissimilar appraisals of the art stimuli would be found through sentiment and terminology. Opinion data were collected through survey responses and analysed using Valence Aware Dictionary for sEntiment Reasoning (VADER) sentiment analysis. The results reflected the enigmatic nature of subjectivity through erratic ratings of the art stimuli. However, significant themes were found in the terminology used in the responses. The implications of the findings are discussed in relation to the uniqueness, or lack thereof, of human subjectivity, and directions for future research are provided.
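The VADER scoring applied to the survey responses can be illustrated with a deliberately simplified, self-contained sketch. The toy lexicon, negation damping and normalisation below only mimic the idea; the real VADER library ships a large validated lexicon plus further heuristics for punctuation, capitalisation and intensifiers:

```python
# Toy VADER-style lexicon scoring: each word carries a valence, a preceding
# negation flips and damps it, and the summed valence is normalised into
# [-1, 1] the way VADER computes its "compound" score. Lexicon is invented.
import math
import re

LEXICON = {"calm": 1.3, "chaotic": -1.6, "confusing": -1.2,
           "beautiful": 2.9, "ugly": -2.3}
NEGATIONS = {"not", "never", "no"}

def compound(text: str) -> float:
    words = re.findall(r"[a-z']+", text.lower())
    total = 0.0
    for i, w in enumerate(words):
        v = LEXICON.get(w, 0.0)
        if v and i > 0 and words[i - 1] in NEGATIONS:
            v = -0.74 * v  # VADER-style negation damping factor
        total += v
    # VADER's normalisation with alpha = 15 keeps the score in (-1, 1)
    return total / math.sqrt(total * total + 15.0)

responses = ["A calm, beautiful piece", "chaotic and confusing", "not ugly at all"]
scores = [compound(r) for r in responses]
```

Running the real analyser over free-text responses works the same way at the call site: one bounded compound score per response, which can then be compared across art stimuli.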

Keywords: abstract art, artificial intelligence, cognition, sentiment, subjectivity

Procedia PDF Downloads 100
1529 Worldwide GIS Based Earthquake Information System/Alarming System for Microzonation/Liquefaction and It’s Application for Infrastructure Development

Authors: Rajinder Kumar Gupta, Rajni Kant Agrawal, Jaganniwas

Abstract:

One of the most frightening phenomena of nature is the occurrence of earthquakes, which have terrible and disastrous effects. Many earthquakes occur every day worldwide, so there is a need for knowledge of trends in earthquake occurrence worldwide. The recording and interpretation of data obtained from the worldwide network of seismological stations has made this possible. From the analysis of recorded earthquake data, earthquake parameters and source parameters can be computed and earthquake catalogues can be prepared. These catalogues provide information on origin time, epicentre location (in terms of latitude and longitude), focal depth, magnitude and other related details of the recorded earthquakes, and they are used for seismic hazard estimation. Manual interpretation and analysis of these data are tedious and time-consuming. A geographical information system (GIS) is a computer-based system designed to store, analyze and display geographic information. The implementation of integrated GIS technology provides an approach that permits rapid evaluation of complex inventory databases under a variety of earthquake scenarios and allows the user to view results interactively, almost immediately. GIS technology provides a powerful tool for displaying outputs and permits users to see the graphical distribution of the impacts of different earthquake scenarios and assumptions. An endeavour has been made in the present study to compile earthquake data for the whole world in Visual Basic on the ArcGIS platform so that it can easily be used for further analysis by earthquake engineers. The basic data on time of occurrence, location and size of earthquakes has been compiled for querying based on various parameters. A preliminary analysis tool is also provided in the user interface to interpret earthquake recurrence in a region.
The user interface also includes the seismic hazard information already worked out under the GSHAP program; seismic hazard, in terms of probability of exceedance within definite return periods, is provided for the world. The seismic zones of the Indian region are included in the user interface from IS 1893-2002, the code on earthquake-resistant design of buildings. City-wise satellite images have been inserted into the map, and based on actual data the following information can be extracted in real time: analysis of soil parameters and their effects; microzonation information; seismic hazard and strong ground motion; soil liquefaction and its effect on the surrounding area; impacts of liquefaction on buildings and infrastructure; occurrence of future earthquakes and their effect on existing soil; and propagation of ground vibration due to earthquake occurrence. The GIS-based earthquake information system has been prepared for the whole world in Visual Basic on the ArcGIS platform and further extended to micro level based on actual soil parameters. Individual tools have been developed for liquefaction, earthquake frequency, etc. All this information can be used for infrastructure development, i.e., multi-storey structures, irrigation dams and their components, hydropower, etc., in real time, for the present and the future.

Keywords: GIS based earthquake information system, microzonation, analysis and real time information about liquefaction, infrastructure development

Procedia PDF Downloads 294
1528 Open Data for e-Governance: Case Study of Bangladesh

Authors: Sami Kabir, Sadek Hossain Khoka

Abstract:

Open Government Data (OGD) refers to all data produced by government that are accessible in a reusable way by ordinary people with Internet access, free of cost. In line with the “Digital Bangladesh” vision of the Bangladesh government, the concept of open data has been gaining momentum in the country. Opening all government data in digital and customizable format from a single platform can enhance e-governance and make government more transparent to the people. This paper presents a work-in-progress case study of an OGD portal by the Bangladesh Government intended to link decentralized data. The initiative aims to facilitate e-services for citizens through this one-stop web portal. The paper further discusses ways of collecting data in digital format from relevant agencies with a view to making them publicly available through this single point of access, and presents a possible layout of the web portal.

Keywords: e-governance, one-stop web portal, open government data, reusable data, web of data

Procedia PDF Downloads 327
1527 The Doctrine of Military Necessity under Customary International Law: A Breach of International Humanitarian Law

Authors: Uche A. Nnawulezi

Abstract:

This paper examines military necessity, an essential and complex standard of international humanitarian law. Military necessity is an unpredictable phenomenon, and its unpredictability originates in part from the fact that it is one of the most fundamental, yet most misjudged and distorted, standards of the international law of armed conflict. The doctrine has been censured as essentially wrong in light of its non-compliance with the principles of international humanitarian law in the recent past. The author notes in this study that military necessity runs counter to humanitarian exigencies. This has generated debate among researchers, leading them to propose that, for international law to carry more weight, it is indispensable that the procedures and substance of custom be illuminated and made accessible to all those who may utilize it or be influenced by it. A significant number of analysts have also attributed particular weaknesses to this doctrine. This study relied on both primary and secondary sources of data. The recommendations made in this paper, if fully adopted, would go a long way towards guaranteeing a better application of the principles of international humanitarian law.

Keywords: military necessity, international law, international humanitarian law, customary law

Procedia PDF Downloads 188
1526 Development of Typical Meteorological Year for Passive Cooling Applications Using World Weather Data

Authors: Nasser A. Al-Azri

Abstract:

The effectiveness of passive cooling techniques is assessed using bioclimatic charts, which require the typical meteorological year (TMY) of a specified location for their development. However, TMYs are not always available, mainly due to the scarcity of solar radiation records, an essential component in developing common TMYs intended for general use. Since solar radiation is not required in the development of the bioclimatic chart, this work suggests developing TMYs based solely on the relevant parameters. This approach improves the accuracy of the developed TMY, since only the relevant parameters are considered, and it also makes the development of the TMY more accessible, since solar radiation data are not used. The paper also discusses the development of the TMY from the raw data available in the NOAA-NCDC archive of world weather data and the construction of bioclimatic charts for some randomly selected locations around the world.
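A common way to assemble a TMY from raw archive data is to pick, for each calendar month, the candidate year whose daily values best match the long-term distribution via the Finkelstein-Schafer (FS) statistic; the paper's exact selection procedure is not detailed here, so the following is a minimal single-parameter sketch with illustrative data:

```python
# Minimal sketch of typical-month selection via the Finkelstein-Schafer (FS)
# statistic, on one parameter (daily dry-bulb temperature). The data layout
# and candidate years are illustrative assumptions, not the paper's data.
import bisect

def long_term_cdf(sorted_pool, x):
    """Empirical long-term CDF: fraction of pooled daily values <= x."""
    return bisect.bisect_right(sorted_pool, x) / len(sorted_pool)

def fs_statistic(year_values, pooled_values):
    """Mean absolute difference between the long-term CDF and the candidate
    year's within-month empirical CDF, evaluated at the year's daily values."""
    pool = sorted(pooled_values)
    yr = sorted(year_values)
    n = len(yr)
    return sum(abs(long_term_cdf(pool, v) - (i + 1) / n)
               for i, v in enumerate(yr)) / n

def pick_typical_year(month_by_year):
    """month_by_year: {year: [daily values for this calendar month]}.
    Returns the year whose month is closest to the long-term distribution."""
    pool = [v for vals in month_by_year.values() for v in vals]
    return min(month_by_year, key=lambda y: fs_statistic(month_by_year[y], pool))
```

In a full TMY, an FS statistic would be computed for each relevant parameter (dry-bulb temperature, humidity, wind, etc.) and combined with weights before the typical month is chosen; concatenating the twelve selected months yields the TMY.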

Keywords: bioclimatic charts, passive cooling, TMY, weather data

Procedia PDF Downloads 218
1525 National Strategy for Swedish Wildlife Management

Authors: Maria Hornell, Marcus Ohman

Abstract:

Nature, and the society it is a part of, is under constant change. The landscape, climate and game populations vary over time, as do society's priorities and the way it uses the land where wildlife may proliferate. Sweden currently has historically large wildlife populations, which are a resource for the benefit and joy of many people. Wildlife may also be seen as a problem, as it may cause damage in conflict with other human interests. The Swedish Environmental Protection Agency introduces a new long-term strategy for national wildlife management. The strategy envisions a wildlife management in balance. It focuses on wildlife values in a broad sense, including outdoor recreation and tourism as well as conservation of biodiversity. It is fundamental that these values be open and accessible to the majority of the population. For that to be possible, new ways to manage, mitigate and prevent the damage and other problems that wildlife causes need to be developed. The strategy describes a roadmap for the development and strengthening of Sweden's wildlife management until 2020, and it aims to be applicable to those authorities and stakeholders with an interest in wildlife management, as a guide for their own strategies, goals, and activities.

Keywords: wildlife management, strategy, Sweden, SEPA

Procedia PDF Downloads 192
1524 Environmental Impact of a New-Build Educational Building in England: Life-Cycle Assessment as a Method to Calculate Whole Life Carbon Emissions

Authors: Monkiz Khasreen

Abstract:

In the context of the global trend towards reducing the carbon footprint of new buildings, the design team is required to make early decisions that have a major influence on embodied and operational carbon. Sustainability strategies should be clear during the early stages of the building design process, as changes made later can be extremely costly. Life-Cycle Assessment (LCA) can be used as the vehicle to carry other tools and processes towards achieving the requested improvement. Although LCA is the gold standard for evaluating buildings from cradle to grave, the lack of detail available at concept design makes LCA very difficult, if not impossible, to use as an estimation tool at early stages. Issues related to the transparency and accessibility of information in the building industry also affect the credibility of LCA studies. A verified database derived from LCA case studies needs to be accessible to researchers, design professionals, and decision makers in order to offer guidance on specific areas of significant impact. Such a database could be built up from data from multiple sources within a pool of research held in this context. One of the most important factors affecting the reliability of such data is the temporal factor, as building materials, components, and systems are changing rapidly with the advancement of technology, making production more efficient and less environmentally harmful. Recent LCA studies on different building functions, types, and structures are therefore always needed to update databases derived from research and to form case bases for comparison studies. There is also a need to make these studies transparent and accessible to designers. The work in this paper sets out to address this need. The paper presents a life-cycle case study of a new-build educational building in England. The building utilised current construction methods and technologies and is rated BREEAM Excellent.
Carbon emissions of different life-cycle stages and of different building materials and components were modelled, and scenario and sensitivity analyses were used to estimate the future of new educational buildings in England. The study attempts to provide an indicator for the early design stages of similar buildings. Carbon dioxide emissions of this case study building, when normalised by floor area, lie towards the lower end of the range of worldwide data reported in the literature. Sensitivity analysis shows that life-cycle assessment results are highly sensitive to assumptions made at the design stage about the future, such as changes in the electricity generation mix over time, refurbishment processes and recycling. The analyses also show that large savings in carbon dioxide emissions can result from very small changes at the design stage.
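The floor-area normalisation and the sensitivity to the future electricity mix can be illustrated with a deliberately simple sketch (all figures and the linear grid-decarbonisation scenario are hypothetical illustrations, not the study's model or data):

```python
# Hedged whole-life carbon sketch: embodied carbon plus operational
# emissions accumulated over the service life, under a simple linear
# grid-decarbonisation scenario. All parameters are illustrative.
def whole_life_carbon(embodied_kgco2e, annual_kwh, grid_factor0, annual_decarb, years):
    """Total kgCO2e over the service life. grid_factor0 is the current grid
    intensity (kgCO2e/kWh); annual_decarb is its yearly fractional decline."""
    operational = sum(annual_kwh * max(grid_factor0 * (1 - annual_decarb) ** t, 0.0)
                      for t in range(years))
    return embodied_kgco2e + operational

def per_floor_area(total_kgco2e, gross_floor_area_m2):
    """Normalise whole-life carbon by gross floor area (kgCO2e/m2)."""
    return total_kgco2e / gross_floor_area_m2
```

Comparing a static grid against even a modest assumed decarbonisation rate shows how strongly the whole-life total, and hence the per-m2 indicator, depends on this design-stage assumption.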

Keywords: architecture, building, carbon dioxide, construction, educational buildings, England, environmental impact, life-cycle assessment

Procedia PDF Downloads 96
1523 Geographic Information System Based Multi-Criteria Subsea Pipeline Route Optimisation

Authors: James Brown, Stella Kortekaas, Ian Finnie, George Zhang, Christine Devine, Neil Healy

Abstract:

The use of GIS as an analysis tool for engineering decision making is now best practice in the offshore industry. GIS enables multidisciplinary data integration, analysis and visualisation which allows the presentation of large and intricate datasets in a simple map-interface accessible to all project stakeholders. Presenting integrated geoscience and geotechnical data in GIS enables decision makers to be well-informed. This paper is a successful case study of how GIS spatial analysis techniques were applied to help select the most favourable pipeline route. Routing a pipeline through any natural environment has numerous obstacles, whether they be topographical, geological, engineering or financial. Where the pipeline is subjected to external hydrostatic water pressure and is carrying pressurised hydrocarbons, the requirement to safely route the pipeline through hazardous terrain becomes absolutely paramount. This study illustrates how the application of modern, GIS-based pipeline routing techniques enabled the identification of a single most-favourable pipeline route crossing of a challenging seabed terrain. Conventional approaches to pipeline route determination focus on manual avoidance of primary constraints whilst endeavouring to minimise route length. Such an approach is qualitative, subjective and is liable to bias towards the discipline and expertise that is involved in the routing process. For very short routes traversing benign seabed topography in shallow water this approach may be sufficient, but for deepwater geohazardous sites, the need for an automated, multi-criteria, and quantitative approach is essential. This study combined multiple routing constraints using modern least-cost-routing algorithms deployed in GIS, hitherto unachievable with conventional approaches. The least-cost-routing procedure begins with the assignment of geocost across the study area. 
Geocost is defined as a numerical penalty score representing the hazard posed to the pipeline by each routing constraint (e.g. slope angle, rugosity, vulnerability to debris flows). All geocosted routing constraints are combined to generate a composite geocost map that is used to compute the least geocost route between two defined terminals. The analyses were applied to select the most favourable pipeline route for a potential gas development in deep water. The study area is geologically complex, with a series of incised, potentially active canyons carved into a steep escarpment and evidence of extensive debris flows. A similar debris flow in the future could cause significant damage to a poorly placed pipeline. Protruding inter-canyon spurs offer lower-gradient options for ascending the escarpment, but the vulnerability of these spurs to periodic failure is not well understood. Close collaboration between geoscientists, pipeline engineers, geotechnical engineers and, of course, the gas export pipeline operator guided the analyses and the assignment of geocosts. Shorter route length, less severe slope angles, and geohazard avoidance were the primary drivers in identifying the most favourable route.
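The least-cost-routing step described above can be sketched as Dijkstra's algorithm over a geocost raster. The grid, costs and 4-connected neighbourhood below are illustrative assumptions; production GIS tools additionally handle diagonal weighting, anisotropy and barriers:

```python
# Minimal sketch of least-geocost routing: Dijkstra's algorithm finds the
# route accumulating the least total geocost between two terminal cells of
# a composite geocost raster. Grid values are hypothetical penalty scores.
import heapq

def least_geocost_route(geocost, start, goal):
    """geocost: 2D list of per-cell penalty scores; start/goal: (row, col).
    Returns (total geocost, list of cells along the least-cost route)."""
    rows, cols = len(geocost), len(geocost[0])
    best = {start: geocost[start[0]][start[1]]}  # cost includes the start cell
    prev = {}
    pq = [(best[start], start)]
    while pq:
        cost, cell = heapq.heappop(pq)
        if cell == goal:
            path = [cell]
            while cell in prev:          # walk predecessors back to start
                cell = prev[cell]
                path.append(cell)
            return cost, path[::-1]
        if cost > best.get(cell, float("inf")):
            continue                     # stale queue entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                ncost = cost + geocost[nr][nc]
                if ncost < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = ncost
                    prev[(nr, nc)] = cell
                    heapq.heappush(pq, (ncost, (nr, nc)))
    return float("inf"), []
```

In this framing, each routing constraint contributes its own geocost layer, the layers are summed (possibly with weights) into the composite raster, and the algorithm above returns the single most-favourable crossing.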

Keywords: geocost, geohazard, pipeline route determination, pipeline route optimisation, spatial analysis

Procedia PDF Downloads 373
1522 Comparison of the Chest X-Ray and Computerized Tomography Scans Requested from the Emergency Department

Authors: Sahabettin Mete, Abdullah C. Hocagil, Hilal Hocagil, Volkan Ulker, Hasan C. Taskin

Abstract:

Objectives and Goals: An emergency department is a place where people come for a multitude of reasons, 24 hours a day, and it is easily accessible thanks to the self-sacrificing people who work there. But the workload and overcrowding of emergency departments are increasing day by day. Under these circumstances, it is important to choose quick, easily accessible and effective tests for diagnosis; laboratory and imaging tests account for more than 40% of all emergency department costs. Despite the technological advances in imaging methods and the availability of computerized tomography (CT), the chest X-ray, the older imaging method, has not lost its appeal and effectiveness for nearly all emergency physicians. Progress in imaging methods is very convenient, but physicians should consider radiation dose, cost, and effectiveness, and imaging methods should be carefully selected and used. The aim of the study was to investigate the effectiveness of the chest X-ray for immediate diagnosis against the advancing technology by comparing the chest X-ray and chest CT results of patients in the emergency department. Methods: Patients who presented to the emergency department of Bulent Ecevit University Faculty of Medicine between 1 September 2014 and 28 February 2015 were investigated retrospectively. Data were obtained via MIAMED (ClearCanvas Image Server v6.2, Toronto, Canada), the information management system in which patients' files are saved electronically in the clinic, and were retrospectively reviewed. The study included 199 patients who were 18 or older and had both chest X-ray and chest CT imaging. Chest X-ray images were evaluated by the senior emergency medicine resident in the emergency department, and the findings were recorded on the study form. CT findings were obtained from the data already reported by the radiology department. The chest X-ray was evaluated with seven questions in terms of technique and dose adequacy.
Patients' age, gender, presenting complaints, comorbid diseases, vital signs, physical examination findings, diagnoses, chest X-ray findings and chest CT findings were evaluated. Data were recorded and statistical analyses were performed using SPSS 19.0 for Windows, with p < 0.05 accepted as statistically significant. Results: 199 patients were included in the study. Pneumonia was the most common diagnosis, found in 38.2% (n=76) of all patients. The chest X-ray imaging technique was appropriate in 31% (n=62) of all patients. There was no statistically significant difference (p > 0.05) between the two imaging methods (chest X-ray and chest CT) in the rates of detection of displacement of the trachea, pneumothorax, parenchymal consolidation, increased cardiothoracic ratio, lymphadenopathy, diaphragmatic hernia, free air in the abdomen (in sections included in the image), pleural thickening, parenchymal cysts, parenchymal masses, parenchymal cavities, parenchymal atelectasis and bone fractures. Conclusions: When imaging findings requiring quick diagnosis were investigated, chest X-ray and chest CT findings matched at a high rate in patients imaged with an appropriate technique. However, chest X-rays evaluated in the emergency department were frequently taken with an inappropriate technique.
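A paired comparison of the kind reported, where the same finding is read on X-ray and CT for the same patients, is commonly assessed with McNemar's test; the study's exact test is not stated here, so the following is a hedged sketch of an exact version with hypothetical counts:

```python
# Hedged sketch: exact two-sided McNemar test for paired binary findings
# (e.g. "pneumothorax present" on chest X-ray vs chest CT per patient).
# Only the discordant pair counts matter; all counts here are hypothetical.
from math import comb

def mcnemar_exact_p(b, c):
    """b = finding positive on X-ray only, c = positive on CT only.
    Returns the two-sided exact (binomial) McNemar p-value."""
    n = b + c
    if n == 0:
        return 1.0  # no discordant pairs: methods agree perfectly
    k = min(b, c)
    # Under H0, each discordant pair is X-ray-only with probability 1/2
    p_one_sided = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * p_one_sided)
```

A p-value above 0.05 from such a test for each finding is consistent with the abstract's conclusion that the two modalities' detection rates did not differ significantly.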

Keywords: chest x-ray, chest computerized tomography, chest imaging, emergency department

Procedia PDF Downloads 160