Search results for: graphical user interface
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3472

442 Heterogeneous-Resolution and Multi-Source Terrain Builder for CesiumJS WebGL Virtual Globe

Authors: Umberto Di Staso, Marco Soave, Alessio Giori, Federico Prandi, Raffaele De Amicis

Abstract:

The increasing availability of information about earth surface elevation (Digital Elevation Models, DEM) generated from different sources (remote sensing, aerial images, LiDAR) poses the question of how to integrate this huge amount of data and make it available to the widest possible audience. In order to exploit the potential of 3D elevation representation, the quality of data management plays a fundamental role. Due to the high acquisition costs and the huge amount of generated data, high-resolution terrain surveys tend to be small or medium sized and available only for limited portions of the Earth. Hence the need to merge large-scale height maps, which are typically made available for free at worldwide level, with very specific high-resolution datasets. On the other hand, the third dimension increases the user experience and the data representation quality, unlocking new possibilities in data analysis for civil protection, real estate, urban planning, environment monitoring, etc. Open-source 3D virtual globes, which are trending topics in Geovisual Analytics, aim at improving the visualization of geographical data provided by standard web services or in proprietary formats. Typically, 3D virtual globes do not offer an open-source tool that allows the generation of a terrain elevation data structure starting from heterogeneous-resolution terrain datasets. This paper describes a technological solution aimed at setting up a so-called “Terrain Builder”. This tool is able to merge heterogeneous-resolution datasets and to provide a multi-resolution worldwide terrain service fully compatible with CesiumJS and therefore accessible via the web using a traditional browser without any additional plug-in.
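As a rough illustration of the tiling arithmetic such a Terrain Builder has to perform, the minimal Python sketch below converts a longitude/latitude pair into tile indices, assuming the global-geodetic TMS layout (two root tiles at zoom level 0) commonly used for CesiumJS height-map terrain; the tile size, file naming and resampling logic of the actual tool are not reproduced here.

```python
import math

def geodetic_tile_indices(lon_deg: float, lat_deg: float, zoom: int):
    """Tile (tx, ty) containing a point, for a global-geodetic TMS pyramid
    with 2 x 1 tiles at zoom 0 (the layout used for CesiumJS height-map terrain)."""
    n_x = 2 ** (zoom + 1)          # tiles along longitude
    n_y = 2 ** zoom                # tiles along latitude
    tx = int((lon_deg + 180.0) / 360.0 * n_x)
    ty = int((lat_deg + 90.0) / 180.0 * n_y)   # TMS counts rows from the south
    # clamp points sitting exactly on the antimeridian / poles
    tx = min(max(tx, 0), n_x - 1)
    ty = min(max(ty, 0), n_y - 1)
    return tx, ty

if __name__ == "__main__":
    # e.g. a point in the Alps at zoom level 10
    print(geodetic_tile_indices(11.35, 46.49, 10))
```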

Keywords: Terrain Builder, WebGL, Virtual Globe, CesiumJS, Tiled Map Service, TMS, Height-Map, Regular Grid, Geovisual Analytics, DTM

Procedia PDF Downloads 408
441 Exploring the Perspective of Service Quality in mHealth Services during the COVID-19 Pandemic

Authors: Wan-I Lee, Nelio Mendoza Figueredo

Abstract:

COVID-19 has had a significant effect on all sectors of society globally. Health information technology (HIT) has become an effective health strategy in this age of distancing. In this regard, mobile health (mHealth) plays a critical role in managing patient and provider workflows during the COVID-19 pandemic. Therefore, users' perception of mHealth service quality plays a significant role in shaping confidence and subsequent behaviors regarding their intention to use mHealth. This study's objective was to explore, by a qualitative method, how health practitioners and patients are satisfied or dissatisfied with using mHealth services, and to analyze users' intention in the context of Taiwan during the COVID-19 pandemic. This research explores the experienced usability of mHealth services during the COVID-19 pandemic. This study uses qualitative methods that include in-depth and semi-structured interviews that investigate participants' perceptions and experiences and the meanings they attribute to them. The five cases consisted of health practitioners', clinic staff's, and patients' experiences of using mHealth services. This study encourages participants to discuss issues related to the research question by asking open-ended questions, usually in one-to-one interviews. The findings show the positive and negative attributes of mHealth service quality. Hence, patients' and health practitioners' issues are of significant importance on several dimensions of perceived service quality: system quality, information quality, and interaction quality. A concept map of perceptions regarding emergency users' intention to use mHealth services is depicted. The findings revealed that users pay more attention to "medical care", "ease of use" and "utilitarian benefits" and attach less importance to "admissions and convenience" and "social influence". To improve mHealth services, mHealth providers and health practitioners should better manage users' experiences to enhance mHealth services. This research contributes to the understanding of service quality issues in mHealth services during the COVID-19 pandemic.

Keywords: COVID-19, mobile health, service quality, use intention

Procedia PDF Downloads 131
440 Utilizing Dowel-Laminated Mass Timber Components in Residential Multifamily Structures: A Case Study

Authors: Theodore Panton

Abstract:

As cities in the United States experience critical housing shortages, mass timber presents the opportunity to address this crisis in housing supply while taking advantage of the carbon-positive benefits of sustainably forested wood fiber. Mass timber, however, currently has a low level of adoption in residential multifamily structures due to the risk-averse nature of change within the construction financing and Architecture/Engineering/Contracting (AEC) communities, as well as various agency approval challenges. This study demonstrates how mass timber can be used within the cost and feasibility parameters of a typical multistory residential structure and ultimately address the need for dense urban housing. This study will utilize The Garden District, a mixed-use market-rate housing project in Woodinville, Washington, as a case study to illuminate the potential of mass timber in this application. The Garden District is currently in the final stages of permit approval and will commence construction in 2023. It will be the tallest dowel-laminated timber (DLT) residential structure in the United States when completed. This case study includes economic, technical, and design reference points to demonstrate the relevance of the use of this system and its ability to deliver “triple bottom line” results. In terms of results, the study establishes scalable and repeatable approaches to project design and delivery of mass timber in multifamily residential uses and includes economic data, technical solutions, and a summary of end-user advantages. This study discusses the third-party-tested systems for satisfying acoustical requirements within dwelling units, a key to enabling the use of mass timber in multistory residential buildings. Lastly, the study will also compare the mass timber solution with a comparable cold-formed steel (CFS) system with a similar program, which indicates a net carbon savings of over three million tons over the life cycle of the building.

Keywords: DLT, dowel-laminated timber, mass timber, market-rate multifamily

Procedia PDF Downloads 102
439 Hansen Solubility Parameter from Surface Measurements

Authors: Neveen AlQasas, Daniel Johnson

Abstract:

Membranes for water treatment are an established technology that attracts great attention due to its simplicity and cost effectiveness. However, membranes in operation suffer from the adverse effect of membrane fouling. Bio-fouling is a phenomenon that occurs at the water-membrane interface and is a dynamic process initiated by the adsorption of dissolved organic material, including biomacromolecules, on the membrane surface. After initiation, attachment of microorganisms occurs, followed by biofilm growth. The biofilm blocks the pores of the membrane and consequently reduces the water flux. Moreover, the presence of a fouling layer can have a substantial impact on the membrane separation properties. Understanding the mechanism of the initiation phase of biofouling is a key point in eliminating biofouling on membrane surfaces. The adhesion and attachment of different fouling materials are affected by the surface properties of the membrane materials. Therefore, the surface properties of different polymeric materials have been studied in terms of their surface energies and Hansen solubility parameters (HSP). The difference between the combined HSP parameters (the HSP distance) allows prediction of the affinity of two materials to each other. The possibility of measuring the HSP of different polymer films via surface measurements, such as contact angle, has been thoroughly investigated. Knowing the HSP of a membrane material and the HSP of a specific foulant facilitates the estimation of the HSP distance between the two, and therefore the strength of attachment to the surface. Contact angle measurements using fourteen different solvents on five different polymeric films were carried out using the sessile drop method. Solvents were ranked as good or bad using different ranking methods, and the ranking was used to calculate the HSP of each polymeric film. The results clearly indicate the absence of a direct relation between the contact angle values of each film and the HSP distance between each polymer film and the solvents used. Therefore, estimating HSP via contact angle alone is not sufficient. However, it was found that if the surface tensions and viscosities of the solvents used are taken into account in the analysis of the contact angle values, a prediction of the HSP from contact angle measurements is possible. This was carried out via training of a neural network model. The trained neural network model has three inputs: contact angle value, surface tension, and viscosity of the solvent used. The model is able to predict the HSP distance between the solvent used and the tested polymer (material). The HSP distance prediction is further used to estimate the total and individual HSP parameters of each tested material. The results showed an accuracy of about 90% for all five studied films.
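For reference, the HSP distance mentioned above is the standard Hansen measure Ra, defined by Ra² = 4(δD1 − δD2)² + (δP1 − δP2)² + (δH1 − δH2)². A minimal sketch of the three-input regression idea (contact angle, surface tension, viscosity → HSP distance) is given below; the network architecture and the numerical values are illustrative placeholders, not the data or model reported in the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def hsp_distance(hsp_a, hsp_b):
    """Standard Hansen distance Ra between two (dD, dP, dH) triplets, in MPa^0.5."""
    dD1, dP1, dH1 = hsp_a
    dD2, dP2, dH2 = hsp_b
    return np.sqrt(4.0 * (dD1 - dD2) ** 2 + (dP1 - dP2) ** 2 + (dH1 - dH2) ** 2)

# Placeholder training rows, one per (solvent, film) measurement.
# Columns: contact angle [deg], surface tension [mN/m], viscosity [mPa.s]
X = np.array([[35.0, 22.4, 0.55],
              [62.0, 47.0, 1.00],
              [78.0, 72.8, 0.89],
              [50.0, 28.0, 2.20]])
# Target: HSP distance between solvent and film (placeholder values)
y = np.array([4.1, 9.8, 15.2, 7.3])

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X, y)
print(model.predict([[55.0, 30.0, 1.2]]))   # predicted Ra for a new solvent
```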

Keywords: surface characterization, Hansen solubility parameter estimation, contact angle measurements, artificial neural network model, surface measurements

Procedia PDF Downloads 76
438 Teaching Ethnic Relations in Social Work Education: A Study of Teachers' Strategies and Experiences in Sweden

Authors: Helene Jacobson Pettersson, Linda Lill

Abstract:

Demographic changes and globalization in society provide new opportunities for social work and social work education in Sweden. There has been an ambition to include these aspects in Swedish social work education. However, the Swedish welfare state standard has continued to serve as an affectionate yet invisible starting point in discussions about people's way of life and social problems. The aim of this study is to explore the content given to ethnic relations in social work within social work education in Sweden. Our standpoint is that the subject can be understood at both individual and structural levels, changes over time, varies in different steering documents and differs between the perspectives of teachers and students. Our question is what content is given to ethnic relations in social work by the teachers in their strategies and teaching material. The study brings together research at the interface between education science, social work, and research on international migration and ethnic relations. The presented narratives are from long interviews with a total of 17 university teachers who teach in social work programs at four different universities in Sweden. The universities have, in different ways, curricula that involve the theme of ethnic relations in social work, and the interviewed teachers teach and grade social workers in specific courses related to ethnic relations at undergraduate and graduate levels. Overall, these 17 teachers assess a large number of students during a semester. The questions concerned how the teachers handle ethnic relations in social work education. The particular focus during the interviews has been the teachers' understanding of the documented learning objectives and the content of the literature, and how this has implications for their teaching. What emerges is the teachers' own stories about their educational work and how they relate to the content of teaching, as well as the teaching strategies they use to promote the theme of ethnic relations in social work education. The analysis of this kind of pedagogy is that the teaching ends up at an individual level, with a particular focus on the professional encounter with individuals. We can see the shortage of a critical analysis of the construction of social problems. The conclusion is that individual circumstance precedes theoretical perspective on social problems related to migration, transnational relations, globalization and social. This result has problematic implications from the perspective of sustainability in terms of ethnic diversity and integration in society. Thus, these aspects have most relevance for social workers' professional action in social support and empowerment-related activities, in supporting the social status, human rights and equality of immigrants.

Keywords: ethnic relations in Swedish social work education, teaching content, teaching strategies, educating for change, human rights and equality

Procedia PDF Downloads 237
437 The Sociology of the Facebook: An Exploratory Study

Authors: Liana Melissa E. de la Rosa, Jayson P. Ada

Abstract:

This exploratory study was conducted to determine the sociology of Facebook. Specifically, it aimed to know the socio-demographic profile of the respondents in terms of age, sex, year level and monthly allowance; find out the common uses of Facebook among the respondents; identify the features of Facebook that are commonly used by the respondents; understand the benefits and risks of using Facebook; determine how frequently the respondents use Facebook; and find out whether there is a significant relationship between the socio-demographic profile of the respondents and their Facebook usage. This study used an exploratory research design and a correlational design, employing a survey questionnaire as its main data-gathering instrument. Students of the University of Eastern Philippines were selected as the respondents of this study through quota sampling. Ten (10) students were randomly selected from each college of the university. Based on the findings of this study, the following conclusions were drawn. The majority of the respondents are aged between 18 and 21, female, third-year students, and have a monthly allowance of P2,000 and above. On the respondents' usage of Facebook, the majority use Facebook on a daily basis for one to two (1-2) hours every day, and most users access Facebook by renting a computer in an internet cafe. On the use of Facebook, most users have created their profiles mainly to connect with people and gain new friends. The most commonly used features of Facebook are: the photos application, like button, wall, notifications, friends, chat, networks, groups and "like" pages, status updates, messages and inbox, and events. The Facebook features that are seldom used by the respondents are games, news feed, user name, video sharing and notes, while the least used features are questions, the poke feature, credits and the marketplace. The respondents stated that the major benefit Facebook has given its users is the ability to keep in touch with family members or friends, while the main risk identified is that users can become addicted to the Internet. On the tests of relationships between the respondents' use of Facebook and the four (4) socio-demographic profile variables, age, year level, and monthly allowance were found to be not significantly related to the respondents' use of Facebook, while the variable found to be significantly related was gender.

Keywords: Facebook, sociology, social networking, exploratory study

Procedia PDF Downloads 268
436 Salmon Diseases Connectivity between Fish Farm Management Areas in Chile

Authors: Pablo Reche

Abstract:

Since the 1980s, aquaculture has become the biggest economic activity in southern Chile, with Salmo salar and Oncorhynchus mykiss being the main finfish species. High fish density makes both species prone to contract diseases, which drives the industry to large losses and greatly affects the local economy. The three most concerning infective agents are the infectious salmon anemia virus (ISAv), the bacterium Piscirickettsia salmonis and the copepod Caligus rogercresseyi. To regulate the industry, the government arranged the salmon farms within management areas named barrios, which coordinate the fallowing periods and antibiotic treatments of their salmon farms. In turn, barrios are gathered into larger management areas, named macrozonas, whose purpose is to minimize the risk of disease transmission between them and to enclose outbreaks within their boundaries. However, disease outbreaks still happen, and transmission to neighboring sites enlarges the initial event. Salmon disease agents are mostly transported passively by local currents. Thus, to understand how transmission occurs, the physical environment must first be studied. In Chile, salmon farming takes place in the inner seas of the southernmost regions of western Patagonia, between 41.5ºS and 55ºS. This coastal marine system is characterised by westerly winds, latitudinally modulated by the position of the South-East Pacific high-pressure centre, high precipitation rates and freshwater inflows from the numerous glaciers (including the largest ice cap outside Antarctica and Greenland). All of these forcings meet in a complex bathymetry and coastline system (deep fjords, shallow sills, narrow straits, channels, archipelagos, inlets and isolated inner seas), driving an estuarine circulation (fast outflows westwards at the surface and slow deeper inflows eastwards). Such a complex system is modelled with the numerical model MIKE3, on whose 3D current fields decoupled particle-track-biological models (one for each infective agent) are run. Each agent's biology is parameterized by functions for maturation and mortality (reproduction not included). These parameterizations depend upon environmental factors, such as temperature and salinity, so the agents' lifespan will depend upon the environmental conditions those virtual agents encounter on their way while passively transported. CLIC (Connectivity-Lagrangian-IFOP-Chile) is a service platform that supports the graphical visualization of the connectivity matrices calculated from the particle trajectory files resulting from the particle-track-biological models. On CLIC, users can select, from a high-resolution grid (~1 km), the areas between which connectivity will be calculated. These areas can be barrios and macrozonas. Users can also select which nodes of these areas are allowed to release and scatter particles, the depth and frequency of the initial particle release, the climatic scenario (winter/summer) and the type of particle (ISAv, Piscirickettsia salmonis, Caligus rogercresseyi, plus an option for lifeless particles). Results include downstream probabilities (where the particles go) and upstream probabilities (where the particles come from), particle age and vertical distribution, all aiming to understand how connectivity currently works in order to eventually propose a minimum-risk zonation for aquaculture purposes. Preliminary results in the Chiloé inner sea show that the risk depends not only upon dynamic conditions but also upon the location of barrios with respect to their neighbors.
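A minimal sketch of how a downstream connectivity matrix can be assembled from particle-tracking output is given below; it assumes each trajectory has been reduced to a (release area, final area) pair and only illustrates the bookkeeping, not the CLIC implementation.

```python
import numpy as np

def connectivity_matrix(release_area, arrival_area, n_areas):
    """Downstream connectivity: C[i, j] = probability that a particle released
    in area i ends up in area j (rows normalised by particles released)."""
    counts = np.zeros((n_areas, n_areas))
    for src, dst in zip(release_area, arrival_area):
        if dst >= 0:                       # -1 could mark particles lost offshore
            counts[src, dst] += 1
    released = np.bincount(release_area, minlength=n_areas).astype(float)
    released[released == 0] = 1.0          # avoid division by zero
    return counts / released[:, None]

# toy example: 3 barrios, 6 particle trajectories (source area, final area)
src = np.array([0, 0, 1, 1, 2, 2])
dst = np.array([0, 1, 1, 2, 2, -1])
print(connectivity_matrix(src, dst, 3))
```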

Keywords: aquaculture zonation, Caligus rogercresseyi, Chilean Patagonia, coastal oceanography, connectivity, infectious salmon anemia virus, Piscirickettsia salmonis

Procedia PDF Downloads 143
435 Health Economics in the Cost-Benefit Analysis of Transport Schemes

Authors: Henry Kelly, Helena Shaw

Abstract:

This paper will examine how innovative methods from health economics and, to a lesser extent, wellbeing analysis can be applied in the cost-benefit analysis (CBA) of transport infrastructure and policy interventions. The context for this will focus on the framework articulated by the UK Treasury (finance department) and the English Department for Transport. Both have well-established methods for undertaking CBA, but there is increased policy interest, particularly at a regional level, in exploring broader strategic goals beyond those traditionally associated with transport user benefits, productivity gains, and labour market access. Links to different CBA approaches internationally, such as those of New Zealand, France, and Wales, will be referenced. Exploring a complementary method of assessing the impacts of policies through the quantification of health impacts is a fruitful line to pursue. In a previous piece of work, 14 impact pathways were identified, mapping the relationship between transport and health. These are wide-ranging, from improved employment prospects, the stress of unreliable journey times, and air quality to isolation and loneliness. Importantly, we will consider these different measures of health from an intersectional point of view to ensure that the bias that remains in the health industry does not get translated across to this work. The objective is to explore how a CBA based on these pathways, through quantifying forecast impacts in terms of Quality-Adjusted Life Years (QALYs), may produce different findings than a standard approach. Of particular interest is how a health-based approach may have different distributional impacts on socio-economic groups and may favour distinct types of interventions. Consideration will be given to the degree to which this approach may double-count impacts, or whether it is possible to identify benefits additional to the established CBA approach. The investigation will explore a range of schemes, from a high-speed rail link, highway improvements, rural mobility hubs, and coach services to cycle lanes. The conclusions should aid the progression of methods concerning the assessment of publicly funded infrastructure projects.
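As a purely illustrative sketch of the appraisal arithmetic being discussed, the snippet below monetises a forecast QALY gain and folds it into a benefit-cost ratio; the monetary value per QALY and all scheme figures are placeholders, not official Treasury or Department for Transport values.

```python
def monetised_qaly_benefit(qaly_gain_per_person, people_affected,
                           value_per_qaly=70_000.0):
    """Very simplified: forecast QALY gain x population x monetary value per QALY.
    value_per_qaly is a placeholder, not an official appraisal figure."""
    return qaly_gain_per_person * people_affected * value_per_qaly

# e.g. a cycle lane forecast to give 0.02 QALYs per regular user, 5,000 users
health_benefit = monetised_qaly_benefit(0.02, 5_000)
transport_user_benefit = 4_200_000.0   # conventional CBA benefits (placeholder)
scheme_cost = 9_000_000.0              # placeholder
bcr = (health_benefit + transport_user_benefit) / scheme_cost
print(f"health benefit = {health_benefit:,.0f}, BCR = {bcr:.2f}")
```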

Keywords: cost-benefit analysis, health, QALYs, transport

Procedia PDF Downloads 64
434 Characterization of Aerosol Droplet in Absorption Columns to Avoid Amine Emissions

Authors: Hammad Majeed, Hanna Knuutila, Magne Hilestad, Hallvard Svendsen

Abstract:

Formation of aerosols can cause serious complications in industrial exhaust gas CO2 capture processes. SO3 present in the flue gas can cause aerosol formation in an absorption-based capture process. Small mist droplets and fog formed can normally not be removed in conventional demisting equipment, because their submicron size allows the particles or droplets to follow the gas flow. As a consequence, aerosol-based emissions on the order of grams per Nm3 have been identified from PCCC plants. In absorption processes, aerosols are generated by spontaneous condensation or desublimation processes in supersaturated gas phases. Undesired aerosol development may lead to amine emissions many times larger than would be encountered in a mist-free gas phase in PCCC development. It is thus of crucial importance to understand the formation and build-up of these aerosols in order to mitigate the problem. Rigorous modelling of aerosol dynamics leads to a system of partial differential equations. In order to understand the mechanics of a particle entering an absorber, an implementation of the model was created in MATLAB. The model predicts the droplet size, the droplet internal variable profiles and the mass transfer fluxes as a function of position in the absorber. The MATLAB model is based on a subclass of the method of weighted residuals for boundary value problems, the orthogonal collocation method. The model comprises a set of mass transfer equations for the transferring components and the essential diffusion-reaction equations to describe the droplet internal profiles for all relevant constituents. Also included is heat transfer across the interface and inside the droplet. This paper presents results describing the basic simulation tool for the characterization of aerosols formed in CO2 absorption columns and gives examples of how various entering droplets grow or shrink through an absorber and how their composition changes with respect to time. Below, some preliminary simulation results for an aerosol droplet's composition and temperature profiles are given. Results: as an example, a droplet of initial size 3 microns, initially containing a 5M MEA solution, is exposed to an atmosphere free of MEA. The composition of the gas phase and the temperature change with respect to time throughout the absorber.
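The snippet below is not the authors' collocation model; it is only a single-equation illustration, under strongly simplifying assumptions (constant net surface flux and constant liquid density), of how a droplet radius grows or shrinks when integrated over its residence time.

```python
from scipy.integrate import solve_ivp

RHO_L = 1000.0   # droplet liquid density [kg/m3] (water-like placeholder)

def radius_rate(t, r, flux):
    """dr/dt for a spherical droplet gaining mass at a net surface flux N
    [kg/m2/s]: d(4/3 pi r^3 rho)/dt = 4 pi r^2 N  =>  dr/dt = N / rho."""
    return [flux / RHO_L]

r0 = 1.5e-6        # initial radius: 3 micron diameter
flux = 1.0e-4      # net condensation flux [kg/m2/s], placeholder value
sol = solve_ivp(radius_rate, (0.0, 5.0), [r0], args=(flux,), max_step=0.1)
print(f"radius after 5 s: {sol.y[0, -1] * 1e6:.2f} micron")
```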

Keywords: amine solvents, emissions, global climate change, simulation and modelling, aerosol generation

Procedia PDF Downloads 248
433 Non-Cytotoxic Natural Sourced Inorganic Hydroxyapatite (HAp) Scaffold Facilitate Bone-like Mechanical Support and Cell Proliferation

Authors: Sudip Mondal, Biswanath Mondal, Sudit S. Mukhopadhyay, Apurba Dey

Abstract:

Bioactive materials improve devices for a long lifespan but have mechanical limitations. Mechanical characterization is one of the most important ways to evaluate the lifespan and functionality of a scaffold material. After implantation of a scaffold material, primary-stage rejection of the scaffold occurs due to the non-biocompatible effect of the host body system. The second major problem occurs due to mechanical failure. The mechanical and biocompatibility failures of scaffold materials can be overcome by prior evaluation of the scaffold materials. In this study, chemically treated Labeo rohita scales are used for synthesizing hydroxyapatite (HAp) biomaterial. Thermo-gravimetric and differential thermal analysis (TG-DTA) is carried out to ensure thermal stability. The chemical composition and bond structure of the wet ball-milled, calcined HAp powder are characterized by Fourier Transform Infrared spectroscopy (FTIR), X-ray Diffraction (XRD), Field Emission Scanning Electron Microscopy (FE-SEM), Transmission Electron Microscopy (TEM) and Energy Dispersive X-ray (EDX) analysis. The fish-scale-derived apatite material consists of nano-sized particles with a Ca/P ratio of 1.71. Biocompatibility, through cytotoxicity evaluation and MTT assay, is assessed in MG63 osteoblast cell lines. In the cell attachment study, the cells are tightly attached to the HAp scaffolds developed in the laboratory. The results clearly suggest that the HAp material synthesized in this study does not have any cytotoxic effect, and that it has a natural binding affinity for mammalian cell lines. The synthesized HAp powder was further successfully used to develop a porous scaffold material with suitable mechanical properties of ~0.8 GPa compressive strength, ~1.10 GPa hardness and ~30-35% porosity, which is acceptable for implantation in a trauma region in an animal model. The histological analysis also supports the bio-affinity of the processed HAp biomaterials in a Wistar rat model for investigating the contact reaction and stability at the artificial or natural prosthesis interface for biomedical function. This study suggests that the naturally sourced, fish-scale-derived HAp material could be used as a suitable alternative biomaterial for tissue engineering applications in the near future.

Keywords: biomaterials, hydroxyapatite, scaffold, mechanical property, tissue engineering

Procedia PDF Downloads 443
432 The Grand Egyptian Museum as a Cultural Interface

Authors: Mahmoud Moawad Mohamed Osman

Abstract:

The Egyptian civilization was, and still is, an inspiration for many human civilizations and modern sciences, and for this reason there is still a passion for the ancient Egyptian civilization. Due to the breadth and abundance of the outputs of the ancient Egyptian civilization, many museums have been established to display and demonstrate its splendor, and among those museums is the Grand Egyptian Museum (Egypt's gift to the whole world). The idea of establishing the Grand Egyptian Museum began in the nineteen-nineties, and in 2002 the foundation stone was laid for the museum project, to be built in a privileged location overlooking the eternal pyramids of Giza. The Egyptian state then announced, under the auspices of the United Nations Educational, Scientific and Cultural Organization (UNESCO) and the International Union of Architects, an international architectural competition for the best design for the museum. The winning design, submitted by Heneghan Peng Architects of Ireland, was based on the rays of the sun extending from the tops of the three pyramids and meeting to form a conical mass, which is the Grand Egyptian Museum. The construction of the museum project began in May 2005, when the site was paved and prepared, and in 2006 the largest antiquities restoration center in the Middle East was established, dedicated to the restoration, preservation, maintenance and rehabilitation of the antiquities scheduled to be displayed in the museum halls; it was opened in 2010. The construction of the museum building, which has an area of more than 300,000 square meters, was completed during 2021 and includes a number of exhibition halls, each of which is larger than many existing museums in Egypt and the world. The museum is considered one of the most important and greatest achievements of modern Egypt. It was created to be an integrated global civilizational, cultural and entertainment edifice, and to be the first destination for everyone interested in ancient Egyptian heritage, as the largest museum in the world that tells the story of the history of ancient Egyptian civilization. It contains a large number of distinctive and unique artifacts, including the treasures of the golden king Tutankhamun, displayed in their entirety for the first time since the discovery of his tomb in November 1922, in addition to the collection of Queen Hetepheres, the mother of King Khufu, the builder of the Great Pyramid at Giza, as well as the Museum of King Khufu's Boats and various archaeological collectibles from the pre-dynastic era until the Greek and Roman eras.

Keywords: Grand Egyptian Museum, Egyptian civilization, education, museology

Procedia PDF Downloads 26
431 A Conceptual Model of the 'Driver – Highly Automated Vehicle' System

Authors: V. A. Dubovsky, V. V. Savchenko, A. A. Baryskevich

Abstract:

The current trend in the automotive industry towards automated vehicles is creating new challenges related to human factors. This occurs because the driver is increasingly relieved of the need to be constantly involved in driving the vehicle, which can negatively impact his/her situation awareness when manual control is required, and can decrease driving skills and abilities. These new problems need to be studied in order to provide road safety during the transition towards self-driving vehicles. For this purpose, it is important to develop an appropriate conceptual model of the interaction between the driver and the automated vehicle, which could serve as a theoretical basis for the development of mathematical and simulation models to explore different aspects of driver behaviour in different road situations. Well-known driver behaviour models describe the impact of different stages of the driver's cognitive process on driving performance but do not describe how the driver controls and adjusts his actions. A more complete description of the driver's cognitive process, including the evaluation of the results of his/her actions, will make it possible to model various aspects of the human factor in different road situations more accurately. This paper presents a conceptual model of the 'driver – highly automated vehicle' system based on P. K. Anokhin's theory of functional systems, which is a theoretical framework for describing internal processes in purposeful living systems based on such notions as the goal, desired and actual results of purposeful activity. A central feature of the proposed model is a dynamic coupling mechanism between the decision-making of a driver to perform a particular action and changes in road conditions due to the driver's actions. This mechanism is based on the stage-by-stage evaluation of the deviations of the actual values of the driver's action result parameters from the expected values. The overall functional structure of the highly automated vehicle in the proposed model includes a driver/vehicle/environment state analyzer to coordinate the interaction between driver and vehicle. The proposed conceptual model can be used as a framework to investigate different aspects of human factors in transitions between automated and manual driving, for future improvements in driving safety and for understanding how the driver-vehicle interface must be designed for comfort and safety. A major finding of this study is the demonstration that the theory of functional systems is promising and has the potential to describe the interaction of the driver with the vehicle and the environment.

Keywords: automated vehicle, driver behavior, human factors, human-machine system

Procedia PDF Downloads 126
430 Computation of Radiotherapy Treatment Plans Based on CT to ED Conversion Curves

Authors: B. Petrović, L. Rutonjski, M. Baucal, M. Teodorović, O. Čudić, B. Basarić

Abstract:

Radiotherapy treatment planning computers use CT data of the patient. For the computation of a treatment plan, the treatment planning system must have information on the electron densities of the tissues scanned by CT. This information is given by the conversion curve from CT number to ED (electron density), or simply the calibration curve. Every treatment planning system (TPS) has built-in default CT to ED conversion curves for the CTs of different manufacturers. However, it is always recommended to verify the CT to ED conversion curve before actual clinical use. The objective of this study was to check how well the default curve provided matches the curve actually measured on a specific CT, and how much this influences the calculation of the treatment planning computer. The examined CT scanners were from the same manufacturer, but were four different scanners from three generations. The measurements of all calibration curves were done with the dedicated CIRS 062M Electron Density Phantom. The phantom was scanned, and according to the real HU values read at the CT console computer, CT to ED conversion curves were generated for different materials, for the same tube voltage of 140 kV. Another phantom, the CIRS Thorax 002 LFC, which represents an average human torso in proportion, density and two-dimensional structure, was used for verification. The treatment planning was done on CT slices of the scanned CIRS Thorax 002 LFC phantom for selected cases. Interest points were set in the lungs and in the spinal cord, and doses were recorded in the TPS. The overall calculated treatment times for the four scanners and the default curve did not differ by more than 0.8%. The overall interest point dose in bone differed by at most 0.6%, while for single fields it was at most 2.7% (lateral field). The overall interest point dose in the lungs differed by at most 1.1%, while for single fields it was at most 2.6% (lateral field). It is known that the user should verify the CT to ED conversion curve, but developing countries often face a lack of QA equipment and often use the default data provided. We have concluded that the CT to ED curves obtained differ at certain points of the curve, generally in the region of higher densities. The influence on the treatment planning result is not significant, but it definitely does make a difference in the calculated dose.
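A minimal sketch of how a calibration curve is applied is given below: piecewise-linear interpolation of relative electron density from the CT number. The calibration points are generic placeholders, not the values measured on the CIRS 062M phantom in this study.

```python
import numpy as np

# Placeholder calibration points (HU, relative electron density) -- in practice
# these would come from scanning the electron density phantom inserts at 140 kV.
calibration = np.array([
    [-1000.0, 0.00],   # air
    [ -800.0, 0.20],   # lung (inhale)
    [ -500.0, 0.50],   # lung (exhale)
    [    0.0, 1.00],   # water
    [  240.0, 1.10],   # trabecular bone
    [  940.0, 1.50],   # dense bone
])

def hu_to_ed(hu):
    """Piecewise-linear interpolation of relative electron density from CT number."""
    return np.interp(hu, calibration[:, 0], calibration[:, 1])

print(hu_to_ed(-700.0), hu_to_ed(120.0))
```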

Keywords: computation of treatment plan, conversion curve, radiotherapy, electron density

Procedia PDF Downloads 462
429 On Cloud Computing: A Review of the Features

Authors: Assem Abdel Hamed Mousa

Abstract:

The Internet of Things probably already influences your life, and if it doesn't, it soon will, say computer scientists. Ubiquitous computing names the third wave in computing, just now beginning. First were mainframes, each shared by many people. Now we are in the personal computing era, person and machine staring uneasily at each other across the desktop. Next comes ubiquitous computing, or the age of calm technology, when technology recedes into the background of our lives. Alan Kay of Apple calls this "Third Paradigm" computing. Ubiquitous computing is essentially the term for human interaction with computers in virtually everything. Ubiquitous computing is roughly the opposite of virtual reality. Where virtual reality puts people inside a computer-generated world, ubiquitous computing forces the computer to live out here in the world with people. Virtual reality is primarily a horsepower problem; ubiquitous computing is a very difficult integration of human factors, computer science, engineering, and social sciences. The approach: activate the world. Provide hundreds of wireless computing devices per person per office, of all scales (from 1" displays to wall-sized). This has required new work in operating systems, user interfaces, networks, wireless, displays, and many other areas. We call our work "ubiquitous computing". This is different from PDAs, dynabooks, or information at your fingertips. It is invisible, everywhere computing that does not live on a personal device of any sort but is in the woodwork everywhere. The initial incarnation of ubiquitous computing was in the form of "tabs", "pads", and "boards" built at Xerox PARC in 1988-1994. Several papers describe this work, and there are web pages for the tabs and for the boards (which are a commercial product now). Ubiquitous computing will drastically reduce the cost of digital devices and tasks for the average consumer. With labor-intensive components such as processors and hard drives stored in the remote data centers powering the cloud, and with pooled resources giving individual consumers the benefits of economies of scale, consumers could pay monthly fees similar to a cable bill for services that feed into their phones.

Keywords: internet, cloud computing, ubiquitous computing, big data

Procedia PDF Downloads 369
428 3D Codes for Unsteady Interaction Problems of Continuous Mechanics in Euler Variables

Authors: M. Abuziarov

Abstract:

The designed code complex is intended for the numerical simulation of fast dynamic processes of interaction between heterogeneous media susceptible to significant deformation. The main challenges in solving such problems are associated with the construction of the numerical meshes. Currently, there are two basic approaches to this problem. One is the use of a Lagrangian or Lagrangian-Eulerian grid associated with the boundaries of the media; the second is associated with a fixed Eulerian mesh, whose boundary cells cut the boundaries of the media and require the calculation of these cut volumes. Both approaches require complex grid generators and significant time for preparing the code's data for simulation. In these codes, the problems are solved using two grids: a regular fixed grid and a mobile local Lagrangian-Eulerian grid (ALE approach) accompanying the contact and free boundaries, the surfaces of shock waves and phase transitions, and other possible features of the solutions, with mutual interpolation of the integrated parameters. For modeling both liquids and gases, as well as deformable solids, a Godunov scheme of increased accuracy is used in Lagrangian-Eulerian variables, the same for the Euler equations and for the Euler-Cauchy equations describing the deformation of the solid. The increased accuracy of the scheme is achieved by using a 3D spatial, time-dependent solution of the discontinuity problem (a 3D space-time-dependent Riemann problem solver). The same solution is used to calculate the interaction at the liquid-solid surface (the fluid-structure interaction problem). The codes do not require complex 3D mesh generators; only the surfaces of the calculated objects, as STL files created by means of engineering graphics, are given by the user, which greatly simplifies the preparation of the task and makes it convenient to use directly by the designer at the design stage. The results of test solutions and of applications related to the generation and propagation of detonation and shock waves and the loading of structures are presented.
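The sketch below is only a 1D illustration of a Godunov-type update; it replaces the exact 3D space-time Riemann solver described above with the much simpler Rusanov (local Lax-Friedrichs) approximate flux, and is meant solely to show the structure of such a step.

```python
import numpy as np

GAMMA = 1.4

def euler_flux(rho, u, p):
    E = p / (GAMMA - 1.0) + 0.5 * rho * u ** 2
    return np.array([rho * u, rho * u ** 2 + p, (E + p) * u])

def rusanov_step(rho, u, p, dx, dt):
    """One first-order Godunov-type update of the 1D Euler equations using the
    Rusanov (local Lax-Friedrichs) flux instead of an exact Riemann solver."""
    U = np.array([rho, rho * u, p / (GAMMA - 1.0) + 0.5 * rho * u ** 2])
    F = euler_flux(rho, u, p)
    c = np.sqrt(GAMMA * p / rho)                       # sound speed
    smax = np.maximum(np.abs(u[:-1]) + c[:-1], np.abs(u[1:]) + c[1:])
    # numerical flux at each interior cell face
    F_face = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * smax * (U[:, 1:] - U[:, :-1])
    U_new = U.copy()
    U_new[:, 1:-1] -= dt / dx * (F_face[:, 1:] - F_face[:, :-1])
    rho_n = U_new[0]
    u_n = U_new[1] / rho_n
    p_n = (GAMMA - 1.0) * (U_new[2] - 0.5 * rho_n * u_n ** 2)
    return rho_n, u_n, p_n

# Sod shock tube as a smoke test
n = 200
x = np.linspace(0.0, 1.0, n)
rho = np.where(x < 0.5, 1.0, 0.125)
u = np.zeros(n)
p = np.where(x < 0.5, 1.0, 0.1)
for _ in range(100):
    rho, u, p = rusanov_step(rho, u, p, dx=1.0 / n, dt=0.001)
print(rho[n // 2 - 5:n // 2 + 5].round(3))
```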

Keywords: fluid structure interaction, Riemann's solver, Euler variables, 3D codes

Procedia PDF Downloads 423
427 In silico Designing of Imidazo [4,5-b] Pyridine as a Probable Lead for Potent Decaprenyl Phosphoryl-β-D-Ribose 2′-Epimerase (DprE1) Inhibitors as Antitubercular Agents

Authors: Jineetkumar Gawad, Chandrakant Bonde

Abstract:

Tuberculosis (TB) is a major worldwide concern whose control has been exacerbated by HIV and by the rise of multidrug-resistant (MDR-TB) and extensively drug-resistant (XDR-TB) strains of Mycobacterium tuberculosis. The interest in newer and faster-acting antitubercular drugs is greater than ever. Searching for potent compounds is both a need and a challenge for researchers. Here, we tried to design a lead for the inhibition of the decaprenylphosphoryl-β-D-ribose 2′-epimerase (DprE1) enzyme. Arabinose is an essential constituent of the mycobacterial cell wall. DprE1 is a flavoenzyme that converts decaprenylphosphoryl-D-ribose into decaprenylphosphoryl-2-keto-ribose, an intermediate in the biosynthetic pathway of arabinose. Subsequently, DprE2 converts the keto-ribose into decaprenylphosphoryl-D-arabinose. We selected 23 compounds from the azaindole series for the computational study, and they were drawn using MarvinSketch. Ligands were prepared using the Maestro molecular modeling interface, Schrodinger, v10.5. Common pharmacophore hypotheses were developed by applying dataset thresholds to yield active and inactive sets of compounds. A total of 326 hypotheses were developed. On the basis of survival score, ADRRR (survival score: 5.453) was selected. The selected pharmacophore hypothesis was subjected to virtual screening, resulting in 1000 hits. The hits were prepared and docked with the protein 4KW5 (oxidoreductase inhibitor), which was downloaded in .pdb format from the RCSB Protein Data Bank. The protein was prepared using the protein preparation wizard: it was preprocessed and the workspace was analyzed using the OPLS 2005 force field. The Glide grid was generated by picking a single atom in the molecule. The prepared ligands were docked with the prepared protein 4KW5 using Glide docking. After docking, the top five compounds were selected on the basis of Glide score (5223, 5812, 0661, 0662, and 2945, with Glide docking scores of -8.928, -8.534, -8.412, -8.411, and -8.351, respectively). There were interactions between the ligands and the protein, specifically with HIS 132, LYS 418, TYR 230, and ASN 385. Pi-pi stacking was observed in a few compounds with the basic imidazo[4,5-b]pyridine ring. We had a basic azaindole ring in the parent compounds, but after Glide docking, we obtained compounds with imidazo[4,5-b]pyridine as the basic ring. That might be the new lead in the process of drug discovery.

Keywords: DprE1 inhibitors, in silico drug designing, imidazo [4, 5-b] pyridine, lead, tuberculosis

Procedia PDF Downloads 136
426 Slowness in Architecture: The Pace of Human Engagement with the Built Environment

Authors: Jaidev Tripathy

Abstract:

A human generation’s lifestyle, behaviors, habits, and actions are governed heavily by homogeneous mindsets. But the current scenario is witnessing a rapidly widening gap in this homogeneity as a result of an intervention, or rather the dominance, of the digital revolution in the human lifestyle. The current mindset of mass production, employment, multi-tasking, rapid involvement, and stiff competition to stay above the rest has led to a major shift in human consciousness. Architecture, as an entity, is being perceived differently. The screens are replacing the skies. The pace at which operation and evolution take place has increased. It is paradoxical that time seems to be moving faster despite the intention to save time. In parallel, there is an evident shift in architectural typologies spanning different generations. The architecture of today seems heavily influenced from here and there. Mass production of buildings and over-exploitation of resources, giving shape to uninspiring algorithmic designs that ambiguously cater to multiple user groups, have become a prevalent theme. Borrow-and-steal replaces influence, and the diminishing depth in today’s designs reflects a lack of understanding and connection. The digitally dominated world, perceived as an aid to connect and network, is making humans less capable of real-life interactions and understanding. It is not wrong, but it doesn’t seem right either. The engagement level between human beings and the built environment is a concern which surfaces. This leads to a question: does human engagement drive architecture, or does architecture drive human engagement? This paper attempts to relook at architecture's capacity, and its relation with pace, to influence the conscious decisions of a human being. Secondary research, supported by case examples, helps in understanding the translation of human engagement with the built environment through the physicality of architecture. The guiding theme is pace and the role of slowness in the context of human behaviors, thus bridging the widening gap between the human race and the architecture it gives shape to, and avoiding a possible future dystopian world.

Keywords: junkspace, pace, perception, slowness

Procedia PDF Downloads 96
425 Finite Element Molecular Modeling: A Structural Method for Large Deformations

Authors: A. Rezaei, M. Huisman, W. Van Paepegem

Abstract:

Atomic interactions in molecular systems are mainly studied by particle mechanics. Nevertheless, researchers have also put considerable effort into simulating them using continuum methods. In the early 2000s, simple equivalent finite element models were developed to study the mechanical properties of carbon nanotubes and graphene in composite materials. Afterward, many researchers employed similar structural simulation approaches to obtain the mechanical properties of nanostructured materials, to simplify the interface behavior of fiber-reinforced composites, and to simulate defects in carbon nanotubes or graphene sheets, etc. These structural approaches, however, are limited to small deformations due to complicated local rotational coordinates. This article proposes a method for the finite element simulation of molecular mechanics. For ease of reference, here it is called Structural Finite Element Molecular Modeling (SFEMM). The SFEMM method improves the available structural approaches for large deformations, without using any rotational degrees of freedom. Moreover, the method simulates molecular conformation, which is a big advantage over the previous approaches. Technically, this method uses nonlinear multipoint constraints to simulate the kinematics of the atomic multibody interactions. Only truss elements are employed, and the bond potentials are implemented through constitutive material models. Because the equilibrium bond length, bond angles, and bond-torsion potential energies are intrinsic material parameters, the model is independent of initial strains or stresses. In this paper, the SFEMM method has been implemented in the ABAQUS finite element software. The constraints and material behaviors are modeled through two Fortran subroutines. The method is verified for the bond stretch, bond angle, and bond torsion of carbon atoms. Furthermore, the capability of the method in the conformation simulation of molecular structures is demonstrated via a case study of a graphene sheet. Briefly, SFEMM builds up a framework that offers more flexible features than the conventional molecular finite element models, serving structural relaxation modeling and large deformations without incorporating local rotational degrees of freedom. Potentially, the method is a big step towards comprehensive molecular modeling with the finite element technique, thereby concurrently coupling an atomistic domain to a solid continuum domain within a single finite element platform.
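As an illustration of how a bond potential can play the role of a truss element's constitutive law, the sketch below evaluates the axial force of a harmonic carbon-carbon bond-stretch term; the harmonic form and the numerical constants are illustrative placeholders, not the potentials implemented in the authors' Fortran subroutines.

```python
import numpy as np

K_BOND = 469.0   # stretch constant [energy / length^2], placeholder value
R0 = 1.42        # equilibrium C-C bond length in graphene [Angstrom]

def bond_force(x_i, x_j):
    """Axial force vector on atom i from a harmonic bond potential
    U = 0.5 * k * (r - r0)^2 acting along the truss element i-j."""
    d = np.asarray(x_j) - np.asarray(x_i)
    r = np.linalg.norm(d)
    magnitude = K_BOND * (r - R0)        # dU/dr
    return magnitude * d / r             # pulls i towards j when stretched

# stretched bond: 1.50 A instead of 1.42 A
print(bond_force([0.0, 0.0, 0.0], [1.50, 0.0, 0.0]))
```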

Keywords: finite element, large deformation, molecular mechanics, structural method

Procedia PDF Downloads 135
424 The Development of the First Inter-Agency Residential Rehabilitation Service for Gambling Disorder with Complex Clinical Needs

Authors: Dragos Dragomir-Stanciu, Leon Marsh

Abstract:

Background: In response to the gaps identified in recent research in the provision of residential care to address co-occurring health needs, including mental health problems and complexities, GambleAware has facilitated the possibility of providing a new service which would extend the NGTS provision of residential rehabilitation for gambling disorder with complex and co-morbid presentations. Gordon Moody, together with Adferiad, has been successful in securing the tender for this service, and this presentation aims to introduce FOLD, the resulting model of treatment developed for the delivery of the service. Setting: As a partnership, we have come together to coproduce a model which allows us to share our clinical and industry knowledge and build on our reputations as trusted treatment providers. The presentation will outline our shared expertise in the development of a unified approach to recovery-oriented models of care, clinical governance, risk assessment and management, and aftercare and continuous recovery. We will also introduce our innovative specialist referral portal, which will offer referring partners the ability to include the service user in planning their own recovery journey. Outcomes: Our collaboration has resulted in the development of the FOLD model, which includes three agile and flexible treatment packages aimed at offering the most enhanced and comprehensive treatment in the UK, to date, for those most affected by gambling harm. The paper will offer insight into each treatment package and all the recovery model stages involved, as well as into the partnership work with NGTS providers, local mental health and social care providers and lived experience organisations that will enable us to offer support to more than 100 people a year who would otherwise get “lost in the system”. Conclusion: FOLD offers a great opportunity to develop, implement and evaluate a new, much needed, whole-person and whole-system approach to counter gambling-related harms.

Keywords: gambling treatment, partnership working, integrated care pathways, NGTS, complex needs

Procedia PDF Downloads 113
423 Enhanced Photocatalytic Activities of TiO2/Ag2O Heterojunction Nanotubes Arrays Obtained by Electrochemical Method

Authors: Magdalena Diaka, Paweł Mazierski, Joanna Żebrowska, Michał Winiarski, Tomasz Klimczuk, Adriana Zaleska-Medynska

Abstract:

In recent years, TiO2 nanotubes have been widely studied due to their unique, highly ordered array structure, unidirectional charge transfer and higher specific surface area compared to conventional TiO2 powder. These photoactive materials, in the form of a thin layer, can be activated by low-powered and low-cost irradiation sources (such as LEDs) to remove VOCs and microorganisms and to deodorize air streams. This is possible due to their direct growth on a support material and high surface area, which guarantee enhanced photon absorption together with extensive adsorption of reactant molecules on the photocatalyst surface. TiO2 nanotubes also exhibit many other attractive properties, such as potential enhancement of electron percolation pathways, light conversion, and ion diffusion at the semiconductor-electrolyte interface. Pure TiO2 nanotubes have previously been used to remove organic compounds from the gas phase as well as in the water splitting reaction. The major factors limiting the use of TiO2 nanotubes, which have not been fully overcome, are their relatively large band gap (3-3.2 eV) and the high recombination rate of photogenerated electron-hole pairs. Many different strategies have been proposed to solve this problem; however, titania nanostructures containing incorporated metal oxides such as Ag2O show very promising new optical and photocatalytic properties. Unfortunately, there is still a very limited number of reports regarding the application of TiO2/MxOy nanostructures. In the present work, we prepared TiO2/Ag2O nanotubes obtained by anodization of Ti-Ag alloys containing 5, 10 and 15 wt.% Ag. Photocatalysts prepared in this way were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), luminescence spectroscopy and UV-Vis spectroscopy. The activities of the new TiO2/Ag2O nanotubes were examined by the photocatalytic degradation of toluene in the gas phase and of phenol in the aqueous phase, using a 1000 W xenon lamp (Oriel) and light-emitting diodes (LEDs) as irradiation sources. Additionally, the efficiency of bacteria (Pseudomonas aeruginosa) removal from the gas phase was estimated. The number of surviving bacteria was determined by the serial twofold dilution microtiter plate method in Tryptic Soy Broth medium (TSB, GibcoBRL).

Keywords: photocatalysis, antibacterial properties, titania nanotubes, new TiO2/MxOy nanostructures

Procedia PDF Downloads 282
422 An Analytical Systematic Design Approach to Evaluate Ballistic Performance of Armour Grade AA7075 Aluminium Alloy Using Friction Stir Processing

Authors: Lahari Ramya P., Sudhakar I., Madhu V., Madhusudhan Reddy G., Srinivasa Rao E.

Abstract:

The selection of suitable armour materials for defense applications is crucial with respect to increasing the mobility of the systems as well as maintaining safety. Therefore, determining the material with the lowest possible areal density that successfully resists the predefined threat is required in armour design studies. A number of light metals and alloys have come to the forefront, especially as substitutes for armour-grade steels. AA5083 aluminium alloy, which fits the military standards imposed by the US Army, is the foremost nonferrous alloy considered as a possible replacement for steel to increase the mobility of armoured vehicles and enhance fuel economy. The growing need for AA5083 aluminium alloy paves the way to develop supplementary aluminium alloys that maintain the military standards. It has been observed that AA2xxx, AA6xxx and AA7xxx aluminium alloys are potential materials to supplement AA5083 aluminium alloy. Among these aluminium series alloys, AA7xxx aluminium alloy (heat treatable) possesses high strength and can compete with armour-grade steels. Earlier investigations revealed that layering of AA7xxx aluminium alloy can prevent spalling of the rear portion of the armour during ballistic impacts. Hence, the present investigation deals with the fabrication of a hard layer (made of boron carbide) on AA7075 aluminium alloy using friction stir processing, with the intention of blunting the projectile at the initial impact, with the tough backing portion (AA7xxx aluminium alloy) dissipating the residual kinetic energy. An analytical approach has been adopted to unfold the ballistic performance against the projectile. The penetration of the projectile into the armour has been resolved by strain energy model analysis. Perforation shearing areas, i.e., the interface of the projectile and the armour, are taken into account for the evaluation of penetration into the armour. The fabricated surface composites (targets) were tested as per the military standard (JIS.0108.01) in a ballistic testing tunnel at the Defence Metallurgical Research Laboratory (DMRL), Hyderabad, under standardized testing conditions. The analytical results were well validated against the experimentally obtained ones.
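The snippet below is not the strain-energy model used in the study; it is a toy shear-plugging energy balance, with placeholder numbers, that only illustrates how a perforation shear area and plate thickness enter a residual-velocity estimate.

```python
import math

def residual_velocity(m_p, v_0, tau_s, d_p, t_plate):
    """Toy shear-plugging balance: projectile kinetic energy minus the work of
    shearing a plug of diameter d_p through thickness t_plate at shear strength
    tau_s. Returns the residual velocity (0 if the plate stops the projectile)."""
    e_kin = 0.5 * m_p * v_0 ** 2
    e_shear = tau_s * math.pi * d_p * t_plate * t_plate   # shear area x thickness (rough work estimate)
    e_res = e_kin - e_shear
    return math.sqrt(2.0 * e_res / m_p) if e_res > 0 else 0.0

# placeholder numbers: 10 g projectile of 7.62 mm diameter at 830 m/s,
# 20 mm target with an effective shear strength of 250 MPa
print(residual_velocity(0.010, 830.0, 250e6, 7.62e-3, 0.020))
```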

Keywords: AA7075 aluminium alloy, friction stir processing, boron carbide, ballistic performance, target

Procedia PDF Downloads 312
421 An Efficient Hardware/Software Workflow for Multi-Cores Simulink Applications

Authors: Asma Rebaya, Kaouther Gasmi, Imen Amari, Salem Hasnaoui

Abstract:

Over the last few years, applications such as telecommunications, signal processing and digital communication with advanced features (multi-antenna, equalization, etc.) have witnessed a rapid evolution accompanied by an increase in user requirements in terms of latency, computational power, and so on. To satisfy these requirements, the use of hardware/software systems is a common solution, where the hardware is composed of multiple cores and the software is represented by models of computation, the synchronous data flow (SDF) graph for instance. Moreover, most embedded system designers use Simulink for modeling. The issue is how to simplify the generation of C code, for a multi-core platform, from an application modeled in Simulink. To overcome this problem, we propose a workflow allowing an automatic transformation from the Simulink model to the SDF graph and providing an efficient schedule that optimizes the number of cores and minimizes latency. This workflow starts from a Simulink application and a hardware architecture described in the IP-XACT language. Based on the synchronous and hierarchical behavior of both models, the Simulink block diagram is automatically transformed into an SDF graph. Once this process is successfully achieved, the scheduler calculates the optimal number of cores needed by minimizing the maximum density of the whole application. Then, a core is chosen to execute a specific graph task in a specific order and, subsequently, compatible C code is generated. In order to realize this proposal, we extend PREESM, a rapid prototyping tool, to take the Simulink model as input and to support the optimal schedule. Afterward, we compared our results to this tool's results, using a simple illustrative application. The comparison shows that our results strictly dominate the PREESM results in terms of number of cores and latency. In fact, if PREESM needs m processors and latency L, our workflow needs fewer processors and a latency L' < L.
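As an illustration of the scheduling idea (not the proposed scheduler, which works on a full SDF graph with dependencies), the sketch below greedily packs independent actor execution times onto cores and returns the smallest core count whose makespan meets a latency budget.

```python
import heapq

def greedy_schedule(task_times, n_cores):
    """Assign independent task execution times to n_cores with a greedy
    longest-processing-time heuristic; return the resulting latency (makespan)."""
    finish = [0.0] * n_cores
    heapq.heapify(finish)
    for t in sorted(task_times, reverse=True):
        earliest = heapq.heappop(finish)
        heapq.heappush(finish, earliest + t)
    return max(finish)

def minimal_cores(task_times, latency_budget):
    """Smallest core count whose greedy makespan meets the latency budget."""
    for m in range(1, len(task_times) + 1):
        if greedy_schedule(task_times, m) <= latency_budget:
            return m
    return len(task_times)

tasks = [4.0, 3.0, 3.0, 2.0, 2.0, 1.0]            # SDF actor execution times (placeholder)
print(minimal_cores(tasks, latency_budget=8.0))   # -> 2 cores
```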

Keywords: hardware/software system, latency, modeling, multi-cores platform, scheduler, SDF graph, Simulink model, workflow

Procedia PDF Downloads 253
420 Disabilities in Railways: Proposed Changes to the Design of Railway Compartments for the Inclusion of Differently Abled Persons

Authors: Bathmajaa Muralisankar

Abstract:

As much as railway station infrastructure and ticket-booking norms have been changed to facilitate use by differently abled persons, the train compartments themselves have not been made user-friendly for them. Owing to safety concerns, dependency on others for travel, and fear of isolation, differently abled people tend to avoid travelling by train. Rather than providing a dedicated compartment open only to the differently abled, including them with other passengers in the normal compartment (with the modifications proposed here) will make them feel secure and give them a better travel experience. This approach also represents the most practical way to include this group of people in mainstream society. Lowering the height of the compartment doors and providing a wider entrance with a ramp will allow easy entry for wheelchair users. Removing the first two alternate rows and the first two side seats will not only widen the passage and increase seating space but also improve the wheelchair turning radius, helping these passengers travel without depending on others. Seating arrangements may be made to accommodate their family members near them instead of isolating the differently abled in a separate compartment. According to the present ticket-booking regulations of the Indian Railways, three to four disabled persons may travel without their family, or one to two along with their family, and these numbers may be increased or reduced. To help visually challenged and hearing-impaired persons, in addition to providing special instruments, railings, and textured footpaths and flooring, the seat numbers above the seats may be rendered in metal or plastic as raised projections so that visually impaired passengers can read them by touch. Braille boards may be placed at the compartment entrance along with seat numbers in the same raised form. These seat numbers may be designed as buttons which, when pressed, announce the seat number in the applicable local language as well as English. Emergency buttons, rather than emergency chains, within easy reach of disabled passengers will also help them.

Keywords: dependency, differently abled, inclusion, mainstream society

Procedia PDF Downloads 241
419 An Advanced Automated Brain Tumor Diagnostics Approach

Authors: Berkan Ural, Arif Eser, Sinan Apaydin

Abstract:

Medical image processing has become a challenging task in recent years, and the processing of brain MRI images is one of the most difficult parts of this area. This study proposes a well-defined hybrid approach consisting of tumor detection, extraction, and analysis steps. The approach is built around a computer-aided diagnostics system for identifying and detecting tumor formation in any region of the brain, used for early prediction of brain tumors by means of advanced image processing and probabilistic neural network methods. Advanced noise-removal functions and image processing methods such as automatic segmentation and morphological operations are used to detect the brain tumor boundaries and to obtain the important feature parameters of the tumor region. All stages of the approach are implemented in MATLAB. First, the tumor is detected and the tumor area is contoured with a colored circle by the computer-aided diagnostics program. Then, the tumor is segmented and morphological operations are applied to increase the visibility of the tumor area; at the same time, the tumor area and important shape-based features are calculated. Finally, using the probabilistic neural network method and advanced classification steps, the tumor area and the tumor type are obtained. A future aim of this study is to detect the severity of lesions across brain tumor classes through advanced multi-class classification and neural network stages, and to create a user-friendly environment using a GUI in MATLAB. In the experimental part of the study, 100 images were used to train the diagnostics system and 100 out-of-sample images were used to test and verify the overall results. The preliminary results demonstrate high classification accuracy for the neural network structure. These results also motivate us to extend the framework to detect and localize tumors in other organs.
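
As a rough illustration of the segmentation and feature-extraction stages (the study itself uses MATLAB), the following Python sketch thresholds a 2-D MRI slice, cleans the mask with morphological operations, keeps the largest bright region, and computes simple shape-based features. The threshold rule and structuring elements are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
from scipy import ndimage

def segment_tumor(mri_slice):
    """Rough tumor segmentation on a 2-D grayscale MRI slice (float array):
    global threshold, morphological cleanup, then the largest bright region."""
    # Simple global threshold at mean + 2*std (a stand-in for automatic thresholding)
    thr = mri_slice.mean() + 2.0 * mri_slice.std()
    mask = mri_slice > thr
    # Morphological opening/closing to remove speckle and fill small holes
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
    mask = ndimage.binary_closing(mask, structure=np.ones((5, 5)))
    # Keep the largest connected component as the candidate tumor region
    labels, n = ndimage.label(mask)
    if n == 0:
        return np.zeros_like(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)

def shape_features(mask):
    """Simple shape-based features of the segmented region."""
    area = int(mask.sum())
    eroded = ndimage.binary_erosion(mask)
    perimeter = int(mask.sum() - eroded.sum())  # crude boundary-pixel count
    return {"area_px": area, "perimeter_px": perimeter}
```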

Keywords: image processing algorithms, magnetic resonance imaging, neural network, pattern recognition

Procedia PDF Downloads 398
418 Eco-Friendly Softener Extracted from Ricinus communis (Castor) Seeds for Organic Cotton Fabric

Authors: Fisaha Asmelash

Abstract:

The processing of textiles to achieve a desired handle is a crucial aspect of finishing technology. Depending on the chemistry used, softeners can enhance textile properties such as softness, smoothness, elasticity, hydrophilicity, antistatic behavior, and soil release. Human skin is sensitive to rough textiles, which makes softeners increasingly important. Although synthetic softeners are available, they are often expensive and can cause allergic skin reactions. This paper aims to extract a natural softener from Ricinus communis and produce an eco-friendly and user-friendly alternative, being 100% herbal and organic. Crushed Ricinus communis seeds were soaked in a mechanical oil extractor for one hour together with a 100 g cotton fabric sample. The defatted cake or residue obtained after oil extraction, also known as Ricinus communis meal, was recovered by filtering the raffinate, dried at 103 °C for four hours, and then stored under laboratory conditions for the softening process. The softener was applied directly to 100% cotton fabric by padding, and the fabric was tested for stiffness, crease recovery, and drapability. The effect of different concentrations of the finishing agent on fabric stiffness, crease recovery, and drapability was also analyzed. The results show that the change in fabric softness depends on the concentration of the finish used: as the concentration increased, bending length and drape coefficient decreased. Fabrics with a high concentration of softener showed the largest decrease in drape coefficient and stiffness, comparable to commercial softeners such as silicone. The maximum increase in crease recovery was observed for fabrics treated with Ricinus communis softener at a concentration of 30 g/L. From the results, the extracted softener proved to be effective in the treatment of 100% cotton fabric.
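
Drapability is commonly reported through the Cusick drape coefficient, i.e. the ratio of the draped shadow area to the flat specimen area after subtracting the supporting disc. The sketch below assumes the standard 30 cm specimen / 18 cm disc geometry and illustrative shadow areas; it is not based on the authors' measured values.

```python
import math

def drape_coefficient(shadow_area_cm2, specimen_diameter_cm=30.0, disc_diameter_cm=18.0):
    """Cusick drape coefficient (%): draped shadow area minus the supporting disc,
    divided by the flat specimen area minus the disc."""
    a_specimen = math.pi * (specimen_diameter_cm / 2) ** 2
    a_disc = math.pi * (disc_diameter_cm / 2) ** 2
    return 100.0 * (shadow_area_cm2 - a_disc) / (a_specimen - a_disc)

# A softened fabric drapes more, so its shadow area and coefficient drop
print(round(drape_coefficient(480.0), 1))  # stiffer, untreated fabric (illustrative)
print(round(drape_coefficient(380.0), 1))  # softened fabric (illustrative)
```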

Keywords: ricinus communis, crease recovery, drapability, softeners, stiffness

Procedia PDF Downloads 72
417 Biofilm Text Classifiers Developed Using Natural Language Processing and Unsupervised Learning Approach

Authors: Kanika Gupta, Ashok Kumar

Abstract:

Biofilms are dense, highly hydrated cell clusters that are irreversibly attached to a substratum, to an interface, or to each other, and are embedded in a self-produced gelatinous matrix composed of extracellular polymeric substances. Research in the biofilm field has become very significant, as biofilms show high mechanical resilience and resistance to antibiotic treatment and constitute a significant problem both in healthcare and in other industries related to microorganisms. The information, both stated and hidden, in the biofilm literature is growing exponentially, so it is no longer feasible for researchers and practitioners to extract and relate information from the different written resources by hand. The current work therefore proposes and discusses the use of text-mining techniques for the extraction of information from a biofilm literature corpus containing 34,306 documents. It is difficult and expensive to obtain annotated material for biomedical literature because the literature is unstructured, i.e. free text. We therefore adopted an unsupervised approach, in which no annotated training data are necessary, and used it to develop a system that classifies the text on the basis of growth and development, drug effects, radiation effects, and the classification and physiology of biofilms. A two-step structure was used: the first step extracts keywords from the biofilm literature using a metathesaurus and standard natural language processing tools such as Rapid Miner_v5.3, and the second step discovers relations between the genes extracted from the whole set of biofilm literature using pubmed.mineR_v1.0.11. We applied the unsupervised approach, i.e. the machine-learning task of inferring a function to describe hidden structure from 'unlabeled' data, to the extracted datasets to develop classifiers using the WinPython-64 bit_v3.5.4.0Qt5 and R studio_v0.99.467 packages, which automatically classify the text into the mentioned sets. The developed classifiers were tested on a large dataset of biofilm literature, which showed that the proposed unsupervised approach is promising and well suited to semi-automatic labeling of the extracted relations. All the information was stored in a relational database hosted locally on the server. The generated biofilm vocabulary and gene relations will be significant for researchers dealing with biofilm research, making their searches easy and efficient, as the keywords and genes can be directly mapped to the documents used for database development.
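
As an illustration of the unsupervised classification idea (the study itself relies on Rapid Miner, pubmed.mineR, R, and WinPython), the following sketch clusters a toy set of biofilm-related sentences using TF-IDF weighting and k-means; the corpus and cluster count are placeholders, not the authors' data or configuration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy corpus standing in for biofilm abstracts (the real corpus has 34,306 documents)
docs = [
    "biofilm growth and development on medical implants",
    "antibiotic drug effects on mature biofilms",
    "uv radiation effects on biofilm formation",
    "classification and physiology of bacterial biofilms",
]

# TF-IDF keyword weighting, then k-means clustering into topical groups
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

for doc, label in zip(docs, km.labels_):
    print(label, doc)
```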

Keywords: biofilms literature, classifiers development, text mining, unsupervised learning approach, unstructured data, relational database

Procedia PDF Downloads 152
416 MIMO Radar-Based System for Structural Health Monitoring and Geophysical Applications

Authors: Davide D’Aria, Paolo Falcone, Luigi Maggi, Aldo Cero, Giovanni Amoroso

Abstract:

The paper presents a methodology for real-time structural health monitoring and geophysical applications. The key elements of the system are a high-performance MIMO radar sensor, an optical camera, and a dedicated set of software algorithms encompassing interferometry, tomography, and photogrammetry. The MIMO radar sensor proposed in this work provides extremely high sensitivity to displacements, making the system able to react to tiny deformations (down to tens of microns) on time scales spanning from milliseconds to hours. The MIMO feature makes the system capable of providing a set of two-dimensional images of the observed scene, each mapped onto the azimuth-range directions with fine resolution in both dimensions and with an outstanding repetition rate. The back-scattered energy, which is distributed in 3D space, is projected onto a 2D plane, where each pixel has as coordinates the line-of-sight distance and the cross-range azimuthal angle. At the same time, the high-performance processing unit allows the scene to be sensed with remarkably short refresh periods (down to milliseconds), thus opening the way for combined static and dynamic structural health monitoring. Thanks to the smart TX/RX antenna array layout, the MIMO data can be processed through a tomographic approach to reconstruct the three-dimensional map of the observed scene. This 3D point cloud is then accurately mapped onto a 2D digital optical image through photogrammetric techniques, allowing for easy and straightforward interpretation of the measurements. Once the three-dimensional image is reconstructed, a 'repeat-pass' interferometric approach is exploited to provide the user with high-frequency three-dimensional motion/vibration estimates for each point of the reconstructed image. At this stage, the methodology leverages consolidated atmospheric correction algorithms to provide reliable displacement and vibration measurements.
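
The repeat-pass interferometric step ultimately maps a phase difference between acquisitions into a line-of-sight displacement via d = λΔφ/(4π). The sketch below illustrates this conversion; the Ku-band wavelength used is an assumption for illustration, not the sensor's published specification.

```python
import numpy as np

def los_displacement(phase_now_rad, phase_prev_rad, wavelength_m):
    """Line-of-sight displacement from repeat-pass interferometric phase:
    d = lambda * delta_phi / (4 * pi), with the phase difference wrapped
    to (-pi, pi] before conversion."""
    delta_phi = np.angle(np.exp(1j * (phase_now_rad - phase_prev_rad)))
    return wavelength_m * delta_phi / (4.0 * np.pi)

# Assumed Ku-band wavelength of ~17.4 mm: a 0.05 rad phase change corresponds
# to roughly 70 micrometres of motion towards the sensor
print(los_displacement(0.05, 0.0, 0.0174))
```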

Keywords: interferometry, MIMO RADAR, SAR, tomography

Procedia PDF Downloads 174
415 Forecasting Future Society to Explore Promising Security Technologies

Authors: Jeonghwan Jeon, Mintak Han, Youngjun Kim

Abstract:

Due to the rapid development of information and communication technology (ICT), a substantial transformation is currently taking place in society. As the range of intelligent technologies and services continuously expands, 'things' are becoming capable of communicating with one another and even with people. However, such an 'Internet of Things' has a technical weakness: a great amount of information transferred in real time may be widely exposed to security threats. Users' personal data are a typical example facing serious security threats. Security threats will diversify and arise more frequently as the next generation of unfamiliar technologies develops. Moreover, as society becomes increasingly complex, security vulnerabilities will increase as well. In the existing literature, a considerable number of private and public reports forecasting the future society have been published as a preliminary step toward the selection of future technologies and the establishment of strategies for competitiveness. Although there are previous studies that forecast security technology, they have focused only on technical issues and overlooked the interrelationships between security technology and social factors. Therefore, investigations of future security threats and of the security technologies able to protect people from them are required. In response, this study aims to derive potential security threats associated with the development of technology and to explore the security technologies that can protect against them. First, private and public reports that forecast the future, together with online documents from technology-related communities, are collected. By analyzing these data, future issues are extracted and categorized in terms of STEEP (Society, Technology, Economy, Environment, and Politics), as well as security. Second, the components of potential security threats are derived from the classified future issues. Then, the points at which security threats may occur – for example, a mobile payment system based on finger-scan technology – are identified. Lastly, alternatives that prevent potential security threats are proposed by matching the security threats with these points and investigating related security technologies from patent data. The proposed approach can identify ICT-related latent security menaces and provide guidelines in a 'problem – alternative' form by linking the threat points with security technologies.
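
A very simplified view of the categorization step is to match extracted future issues against keyword lists for each STEEP category plus security. The sketch below is a hypothetical illustration; the keyword lists and the example issue are not taken from the authors' taxonomy.

```python
# Minimal keyword-based categorisation of forecast issues into STEEP + security
# categories. The keyword lists are illustrative, not the authors' taxonomy.
CATEGORIES = {
    "Society":     ["aging", "urbanisation", "privacy", "lifestyle"],
    "Technology":  ["iot", "ai", "biometric", "finger scan", "5g"],
    "Economy":     ["mobile payment", "fintech", "market", "employment"],
    "Environment": ["climate", "energy", "disaster"],
    "Politics":    ["regulation", "policy", "cyber warfare"],
    "Security":    ["breach", "malware", "authentication", "leak"],
}

def categorise(issue_text):
    """Return every STEEP/security category whose keywords appear in the issue."""
    text = issue_text.lower()
    hits = [cat for cat, words in CATEGORIES.items()
            if any(w in text for w in words)]
    return hits or ["Uncategorised"]

print(categorise("Mobile payment system based on finger scan authentication"))
# -> ['Technology', 'Economy', 'Security']
```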

Keywords: future society, information and communication technology, security technology, technology forecasting

Procedia PDF Downloads 452
414 Multi-Agent System Based Solution for Operating Agile and Customizable Micro Manufacturing Systems

Authors: Dylan Santos De Pinho, Arnaud Gay De Combes, Matthieu Steuhlet, Claude Jeannerat, Nabil Ouerhani

Abstract:

The Industry 4.0 initiative was launched to address major challenges related to ever-smaller batch sizes. The end-user demand for highly customized products requires highly adaptive production systems in order to maintain shop-floor efficiency. Most of the classical software solutions that operate the manufacturing processes on a shop floor are based on rigid Manufacturing Execution Systems (MES), which are not capable of adapting the production order on the fly to changing demands or conditions. In this paper, we present a highly modular and flexible solution to orchestrate a set of production systems composed of a micro-milling machine tool, a polishing station, a cleaning station, a part-inspection station, and a rough-material store. The stations are installed according to a novel matrix configuration on a 3x3 vertical shelf. The cells of the shelf are connected through horizontal and vertical rails on which a set of shuttles circulates to transport the machined parts from one station to another. Our software solution for orchestrating the tasks of each station is based on a multi-agent system. Each station and each shuttle is operated by an autonomous agent, and all agents communicate with a central agent that holds all the information about the manufacturing order. The core innovation of this paper lies in the path planning of the shuttles, with two major objectives: 1) reduce the waiting time of the stations and thus the cycle time of the entire part, and 2) reduce disturbances such as the vibration generated by the shuttles, which strongly affects the manufacturing process and thus the quality of the final part. Simulation results show that the cycle time of the parts is reduced by up to 50% compared with MES-operated linear production lines, while the disturbance is systematically avoided for critical stations such as the milling machine tool.
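
The central agent's coordination can be pictured as a small dispatching routine: transport requests from station agents are assigned to the earliest-available shuttle, and moves touching the vibration-critical milling station are postponed while it is cutting. The sketch below is a hypothetical simplification of that idea; station names, timings, and the one-move-per-time-unit assumption are illustrative, not the authors' path-planning algorithm.

```python
import heapq

def dispatch(requests, shuttles, milling_busy_until):
    """requests: list of (ready_time, from_station, to_station)
    shuttles: dict shuttle_id -> time it becomes free
    Returns a list of (start_time, shuttle_id, from_station, to_station)."""
    plan = []
    free_at = [(t, sid) for sid, t in shuttles.items()]
    heapq.heapify(free_at)
    for ready, src, dst in sorted(requests):
        t_free, sid = heapq.heappop(free_at)
        start = max(ready, t_free)
        if "milling" in (src, dst):            # vibration-critical station
            start = max(start, milling_busy_until)
        plan.append((start, sid, src, dst))
        heapq.heappush(free_at, (start + 1.0, sid))  # assume 1 time unit per move
    return plan

print(dispatch([(0.0, "store", "milling"), (0.5, "polishing", "inspection")],
               {"shuttle_A": 0.0, "shuttle_B": 0.2}, milling_busy_until=2.0))
```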

Keywords: multi-agent systems, micro-manufacturing, flexible manufacturing, transfer systems

Procedia PDF Downloads 119
413 21st Century Business Dynamics: Acting Local and Thinking Global through eXtensible Business Reporting Language (XBRL)

Authors: Samuel Faboyede, Obiamaka Nwobu, Samuel Fakile, Dickson Mukoro

Abstract:

In the present dynamic business environment of corporate governance and regulation, financial reporting is an inevitable and extremely significant process for every business enterprise. Financial elements such as annual reports, quarterly reports, ad-hoc filings, and other statutory/regulatory reports provide vital information to investors and regulators, and establish trust and rapport between the internal and external stakeholders of an organization. Investors today are very demanding and place great emphasis on the authenticity, accuracy, and reliability of financial data. For many companies, the Internet plays a key role in communicating business information, internally to management and externally to stakeholders. Despite the high prominence attached to external reporting, it is a disconnected process in most companies, which generate their external financial documents manually, resulting in a high degree of error and prolonged cycle times. Chief Executive Officers and Chief Financial Officers are increasingly susceptible to endorsing error-laden reports, filing reports late, and failing to comply with regulatory acts. There is a lack of a common platform for managing the sensitive information, internal and external, contained in financial reports. The Internet financial reporting language known as eXtensible Business Reporting Language (XBRL) continues to develop in the face of these challenges and has now reached the point where much of its promised benefit is available. This paper looks at the emergence of this revolutionary twenty-first-century language of digital reporting. It posits that the world is on the brink of an Internet revolution that will redefine the 'business reporting' paradigm. The new Internet technology, eXtensible Business Reporting Language (XBRL), is already being deployed and used across the world. The paper finds that XBRL is an eXtensible Markup Language (XML) based information format that places self-describing tags around discrete pieces of business information. Once tags are assigned, it is possible to extract only the desired information, rather than having to download or print an entire document. XBRL is platform-independent: it works on any current or recent operating system and on any computer, and it interfaces with virtually any software. The paper concludes that corporate stakeholders and the government cannot afford to ignore XBRL. It therefore recommends that all must act locally and think globally now by adopting XBRL, which is changing the face of worldwide business reporting.
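
To make the tagging idea concrete, the following sketch builds and reads a deliberately simplified XBRL-style fragment in which each financial fact carries self-describing context and unit attributes. The tag names, values, and structure are illustrative assumptions, not a schema-valid XBRL instance document.

```python
import xml.etree.ElementTree as ET

# A simplified XBRL-style fragment: each business fact is wrapped in a
# self-describing tag with context and unit references (illustrative only).
instance = """
<xbrl>
  <context id="FY2015"><period><endDate>2015-12-31</endDate></period></context>
  <Revenue contextRef="FY2015" unitRef="NGN" decimals="0">5200000000</Revenue>
  <NetIncome contextRef="FY2015" unitRef="NGN" decimals="0">640000000</NetIncome>
</xbrl>
"""

root = ET.fromstring(instance)
# Because every fact is individually tagged, a consumer can extract just the
# items it needs instead of downloading or printing the whole report.
for fact in ("Revenue", "NetIncome"):
    element = root.find(fact)
    print(fact, element.get("contextRef"), element.text)
```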

Keywords: XBRL, financial reporting, internet, internal and external reports

Procedia PDF Downloads 265