Search results for: software transactional memory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5817

4437 Numerical Modelling and Soil-structure Interaction Analysis of Rigid Ballast-less and Flexible Ballast-based High-speed Rail Track-embankments Using Software

Authors: Tokirhusen Iqbalbhai Shaikh, M. V. Shah

Abstract:

With an increase in travel demand and a reduction in travel time, high-speed rail (HSR) has been introduced in India. Simplified 3-D finite element modelling is necessary to predict the stability and deformation characteristics of railway embankments and soil-structure interaction behaviour under high-speed design requirements for Indian soil conditions. The objective of this study is to analyse rigid ballast-less and flexible ballast-based high-speed rail track embankments for the various critical conditions to which they are subjected, viz. the static condition, a moving train, sudden brake application, and derailment, using PLAXIS 3D software. The input parameters for the analysis are soil type, thickness of the relevant strata, unit weight, Young’s modulus, Poisson’s ratio, undrained cohesion, friction angle, dilatancy angle, modulus of subgrade reaction, design speed, and other relevant anticipated data. Eurocode 1, IRS-004(D), IS 1343, IRS specifications, the California high-speed rail technical specifications, and the NHSRCL feasibility report will be followed in this study.

Keywords: soil structure interaction, high speed rail, numerical modelling, PLAXIS3D

Procedia PDF Downloads 97
4436 Traditional Drawing, BIM and Erudite Design Process

Authors: Maryam Kalkatechi

Abstract:

Nowadays, parametric design, scientific analysis, and digital fabrication are dominant, and many architectural practices are increasingly seeking to incorporate advanced digital software and fabrication in their projects. The dissertation research resulted in a proposal for an erudite design process that combines digital and practical aspects within a strong methodological frame. The digital aspects are the progressive advancements in algorithm design and simulation software. These aspects have helped firms develop more holistic concepts at the early stage and maintain collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to embed construction and architectural knowledge within the algorithm to produce successful design processes. The erudite design process also involves ongoing improvements in applying the new method of 3D printing in construction. This is achieved through 'data-sketches'. The term 'data-sketch' was coined by the author in the recently completed dissertation; it accommodates the architect's decisions on the algorithm. This paper introduces the erudite design process and its components and summarizes the application of this process in the development of the '3D printed construction unit'. The paper contributes to bridging academia and practice through advanced technology by presenting a design process that transfers dominance from the tool to the learned architect and encourages innovation in design processes.

Keywords: erudite, data-sketch, algorithm design in architecture, design process

Procedia PDF Downloads 261
4435 Accurate Position Electromagnetic Sensor Using Data Acquisition System

Authors: Z. Ezzouine, A. Nakheli

Abstract:

This paper presents a high-precision position electromagnetic sensor system (HPESS) that is applicable to moving-object detection. The authors have developed a high-performance position sensor prototype dedicated to student laboratories. The challenge was to obtain a highly accurate, real-time sensor able to calculate position, length or displacement. An electromagnetic solution based on a two-coil induction principle was adopted. The HPESS converts mechanical motion into an electrical signal through direct contact, and the output signal can then be fed to an electronic circuit. The voltage output change from the sensor is captured by a data acquisition system using LabVIEW software, and the displacement of the moving object is determined. The measured data are transmitted to a PC in real time via a DAQ (NI USB-6281). This paper also describes the data acquisition analysis and the conditioning card developed specifically for sensor signal monitoring. The data are then recorded and viewed through a user interface written in National Instruments LabVIEW software. Online displays of the time and voltage of the sensor signal provide a user-friendly data acquisition interface. The sensor provides an uncomplicated, accurate, reliable, inexpensive transducer for highly sophisticated control systems.
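
As an illustration of the acquisition step, the following is a minimal Python sketch using the nidaqmx package (the study used a LabVIEW interface; the device name, channel, sampling rate and calibration constants here are assumptions for illustration only):

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    # assumption: the NI USB-6281 is registered as "Dev1" and the sensor's
    # conditioned output is wired to analog input ai0
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0", min_val=-10.0, max_val=10.0)
        task.timing.cfg_samp_clk_timing(rate=1000,
                                        sample_mode=AcquisitionType.FINITE,
                                        samps_per_chan=1000)
        voltages = task.read(number_of_samples_per_channel=1000)

    # an experimentally determined calibration curve would then map each
    # voltage sample to a displacement (k and v0 below are illustrative)
    k, v0 = 2.5, 0.1  # mm per volt, offset in volts
    displacements = [k * (v - v0) for v in voltages]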

Keywords: electromagnetic sensor, accuracy, data acquisition, position measurement

Procedia PDF Downloads 273
4434 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation

Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk

Abstract:

The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition and security are among the possible fields of application. In all these fields, the amount of collected data is increasing quickly, and with this increase, computation speed becomes the critical factor. Data reduction is one solution to this problem, and removing redundancy in rough sets can be achieved with the reduct. Many algorithms for generating the reduct have been developed, but most of them are software implementations only and therefore carry many limitations: a microprocessor uses a fixed word length and consumes considerable time both fetching and processing instructions and data, so software-based implementations are relatively slow. Hardware systems do not have these limitations and can process the data faster than software. A reduct is a subset of the condition attributes that preserves the discernibility of the objects; for a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes: none of its elements can be removed without affecting the classification power of all condition attributes, and every reduct contains all the attributes from the core. In this paper, a hardware implementation of a two-stage greedy algorithm for finding one reduct is presented. The decision table is used as input, and the output of the algorithm is a superreduct, i.e., a reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are most frequent in the decision table. The algorithm described above has two disadvantages: i) it generates a superreduct instead of a reduct; ii) the additional first stage may be unnecessary if the core is empty. For systems focused on fast computation of the reduct, however, the first disadvantage is not the key problem, and the core calculation can be achieved with a combinational logic block, adding relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called a 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit to control the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC, and the execution times of the reduct calculation in hardware and in software were compared. The results show an increase in the speed of data processing.
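
The two-stage flow can be illustrated with a short Python sketch (the paper's implementations are in FPGA logic and C; the names are illustrative, and stage two is interpreted here as counting attribute frequencies over the not-yet-discerned entries):

    from itertools import combinations
    from collections import Counter

    def discernibility_entries(rows, attrs, decision):
        # one entry per pair of objects with different decisions:
        # the set of condition attributes on which the pair differs
        entries = []
        for (i, a), (j, b) in combinations(enumerate(rows), 2):
            if decision[i] != decision[j]:
                diff = frozenset(k for k in attrs if a[k] != b[k])
                if diff:
                    entries.append(diff)
        return entries

    def two_stage_superreduct(rows, attrs, decision):
        entries = discernibility_entries(rows, attrs, decision)
        # stage 1: the core is every attribute appearing as a singleton
        # entry (that attribute alone discerns some pair of objects)
        reduct = {next(iter(e)) for e in entries if len(e) == 1}
        uncovered = [e for e in entries if not (e & reduct)]
        # stage 2: greedily add the most frequent attribute among the
        # entries not yet discerned, until every pair is covered
        while uncovered:
            counts = Counter(a for e in uncovered for a in e)
            best = counts.most_common(1)[0][0]
            reduct.add(best)
            uncovered = [e for e in uncovered if best not in e]
        return reduct

    # toy decision table: three condition attributes, binary decision
    rows = [{"a": 0, "b": 1, "c": 0}, {"a": 1, "b": 1, "c": 0},
            {"a": 0, "b": 0, "c": 1}, {"a": 1, "b": 0, "c": 1}]
    print(two_stage_superreduct(rows, ["a", "b", "c"], [0, 1, 0, 1]))  # {'a'}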

Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set

Procedia PDF Downloads 205
4433 Hansen Solubility Parameters, Quality by Design Tool for Developing Green Nanoemulsion to Eliminate Sulfamethoxazole from Contaminated Water

Authors: Afzal Hussain, Mohammad A. Altamimi, Syed Sarim Imam, Mudassar Shahid, Osamah Abdulrahman Alnemer

Abstract:

Exhaustive application of sulfamethoxazole (SUX) has become a global threat to human health due to water contamination from diverse sources. This study addressed the combined application of Hansen solubility parameters (HSPiP software) and the Quality by Design tool for developing various green nanoemulsions. The HSPiP program assisted in screening suitable excipients based on Hansen solubility parameters and experimental solubility data. Various green nanoemulsions were prepared and characterized for globular size, size distribution, zeta potential, and removal efficiency. Design Expert (DoE) software further helped to identify the critical factors with a direct impact on percent removal efficiency, size, and viscosity. Morphology was visualized under transmission electron microscopy (TEM). Finally, the treated water was studied to negate the presence of the tested drug employing the ICP-OES (inductively coupled plasma optical emission spectroscopy) technique and HPLC (high-performance liquid chromatography). Results showed that HSPiP predicted a biocompatible lipid, a safe surfactant (lecithin), and propylene glycol (PG). The experimental solubility of the drug in the predicted excipients was quite convincing and vindicated the prediction. Various green nanoemulsions were fabricated and evaluated for in vitro performance. Globular size (100-300 nm), PDI (0.1-0.5), zeta potential (~25 mV), and removal efficiency (%RE = 70-98%) were found to be in an acceptable range for deciding the input factors and their levels in DoE. The experimental design tool assisted in identifying the most critical variables controlling %RE and the optimized nanoemulsion content under the set constraints. Dispersion time was varied from 5-30 min. Finally, ICP-OES and HPLC techniques corroborated the absence of SUX in the treated water. Thus, the strategy is simple, economical, selective, and efficient.

Keywords: quality by design, sulfamethoxazole, green nanoemulsion, water treatment, ICP-OES, Hansen program (HSPiP software)

Procedia PDF Downloads 61
4432 Scientific Linux Cluster for BIG-DATA Analysis (SLBD): A Case of Fayoum University

Authors: Hassan S. Hussein, Rania A. Abul Seoud, Amr M. Refaat

Abstract:

Scientific researchers face challenges in the analysis of very large data sets, which are growing at a noticeable rate in today's and tomorrow's technologies. Hadoop and Spark are software frameworks developed to address this problem, and the Hadoop framework is suitable for many different hardware platforms. In this research, a Scientific Linux cluster for Big Data analysis (SLBD) is presented. SLBD runs open-source software with large computational capacity on a high-performance cluster infrastructure. SLBD is composed of one cluster containing identical, commodity-grade computers interconnected via a small LAN. SLBD consists of a fast switch and Gigabit Ethernet cards which connect four nodes. Cloudera Manager is used to configure and manage the Apache Hadoop stack. Hadoop is a framework that allows storing and processing big data across the cluster using the MapReduce algorithm. The MapReduce algorithm divides the task into smaller tasks, which are assigned to the network nodes; the algorithm then collects the results and forms the final result dataset. The SLBD clustering system allows fast and efficient processing of the large amounts of data resulting from different applications. SLBD also provides high performance, high throughput, high availability, expandability and cluster scalability.
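
The MapReduce flow described above can be mirrored in plain Python for illustration; Hadoop distributes these phases across the cluster nodes, whereas this single-process word-count sketch only illustrates the logic:

    from collections import defaultdict

    def map_phase(chunk):
        # map: emit (word, 1) pairs from one chunk of the input
        return [(word, 1) for word in chunk.split()]

    def shuffle_phase(mapped):
        # shuffle: group intermediate values by key
        groups = defaultdict(list)
        for key, value in mapped:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        # reduce: aggregate each key's values into the final dataset
        return {key: sum(values) for key, values in groups.items()}

    chunks = ["big data on the cluster", "data across the cluster"]
    mapped = [pair for chunk in chunks for pair in map_phase(chunk)]
    print(reduce_phase(shuffle_phase(mapped)))  # {'big': 1, 'data': 2, ...}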

Keywords: big data platforms, cloudera manager, Hadoop, MapReduce

Procedia PDF Downloads 345
4431 Relevance of Copyright and Trademark in the Gaming Industry

Authors: Deeksha Karunakar

Abstract:

The gaming industry is one of the biggest industries in the world. Video games are interactive works of authorship that require the execution of a computer programme on specialized hardware but also incorporate a wide variety of other artistic media, such as music, scripts, stories, video, paintings, and characters, in which the player takes an active role. Video games are therefore not made as singular, simple works but rather as collections of elements that, if they reach a certain level of originality and creativity, can each be copyrighted on their own. A video game is made up of a wide variety of parts, all of which combine to form the overall sensation that we, the players, have while playing. The entirety of the components is implemented in the form of software code, which is then translated into the game's user interface. While copyright protection is already in place for software code, the work that is produced by means of that code can also be protected by copyright. This includes the game's storyline or narrative, its characters, and even elements of the code on their own. Every sector requires an appropriate legal framework, and the gaming industry is no exception; this underlines the importance of intellectual property laws in each sector. This paper explores the beginnings of video games, the various aspects of game copyrights, and the approach of the courts, including examples from a few different cases. Although the creative arts have always been known to draw inspiration from and build upon the works of others, it has not always been simple to evaluate whether a game has been cloned. The video game business is experiencing growth it has never seen before. The majority of today's video games are both pieces of software and works of audio-visual art. Even though the existing legal framework does not have a clause specifically addressing video games, there are clearly a great many alternative means by which this protection can be granted. This paper demonstrates the importance of copyright and trademark laws in the gaming industry and its regulations with the help of relevant case law, utilizing a doctrinal methodology to support its findings. The aim of the paper is to raise awareness of the applicability of intellectual property laws in the gaming industry and of how the justice system is evolving to adapt to such new industries. Furthermore, it provides in-depth knowledge of the relationship between the two.

Keywords: copyright, DMCA, gaming industry, trademark, WIPO

Procedia PDF Downloads 53
4430 In Silico Screening, Identification and Validation of Cryptosporidium hominis Hypothetical Protein and Virtual Screening of Inhibitors as Therapeutics

Authors: Arpit Kumar Shrivastava, Subrat Kumar, Rajani Kanta Mohapatra, Priyadarshi Soumyaranjan Sahu

Abstract:

Computational approaches to predict the structure, function and other biological characteristics of proteins are becoming more common in comparison to traditional methods in drug discovery. Cryptosporidiosis is a major zoonotic diarrheal disease, particularly in children, which is caused primarily by Cryptosporidium hominis and Cryptosporidium parvum. Currently, there are no vaccines for cryptosporidiosis, and the recommended drugs are not effective. With the availability of the complete genome sequence of C. hominis, new targets have been recognized for the development of effective and better drugs and/or vaccines. We identified a unique hypothetical epitopic protein in the C. hominis genome through BLASTP analysis. A 3D model of the hypothetical protein was generated using the I-TASSER server through a threading methodology, and the quality of the model was validated through a Ramachandran plot by the PROCHECK server. Functional annotation of the hypothetical protein through the DALI server revealed structural similarity with human Transportin 3, and phylogenetic analysis likewise showed that the C. hominis hypothetical protein (CUV04613) is closely related to the human Transportin 3 protein. The 3D protein model was further subjected to a virtual screening study with inhibitors from the ZINC database using DOCK Blaster software. The docking study reported N-(3-chlorobenzyl) ethane-1,2-diamine as the best inhibitor in terms of docking score, and docking analysis elucidated that Leu 525, Ile 526, Glu 528 and Glu 529 are critical residues for ligand-receptor interactions. A molecular dynamics simulation was performed with GROMACS software over 10 ns to assess the reliability of the binding pose of the inhibitor-protein complex. Trajectories were analyzed at 2.5 ns intervals, in which H-bonds with LEU-525 and GLY-530 were significantly present. Furthermore, antigenic determinants of the protein were determined with the help of DNASTAR software. Our findings show great potential for providing insights into the development of new drug(s) or vaccine(s) for the control as well as prevention of cryptosporidiosis among humans and animals.

Keywords: cryptosporidium hominis, hypothetical protein, molecular docking, molecular dynamics simulation

Procedia PDF Downloads 351
4429 Formex Algebra Adaptation into Parametric Design Tools: Dome Structures

Authors: Réka Sárközi, Péter Iványi, Attila B. Széll

Abstract:

The aim of this paper is to present the adaptation of the dome construction tool of formex algebra to the parametric design software Grasshopper. Formex algebra is a mathematical system, primarily used for planning structural systems such as truss-grid domes and vaults, together with the programming language Formian. The goal of the research is to allow architects to plan truss-grid structures easily with parametric design tools based on the versatile formex algebra mathematical system. To produce regular structures, coordinate-system transformations are used, and the dome structures are defined in a spherical coordinate system. Owing to the abilities of the parametric design software, it is possible to apply further modifications to the structures and obtain special forms. The paper covers the basic dome types as well as additional dome-based structures using special coordinate-system solutions based on spherical coordinate systems. It also covers additional structural possibilities such as making double-layer grids in all geometric forms. The adaptation of formex algebra and the parametric workflow of Grasshopper together give the possibility of quick and easy design and optimization of special truss-grid domes.
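
The coordinate-system idea can be sketched in a few lines of Python: define the grid in spherical coordinates and transform each node to Cartesian space (in Grasshopper this would sit inside a scripted component; the parameters below are illustrative, not taken from the paper):

    import math

    def dome_nodes(radius, rings, segments, sweep=math.pi / 2):
        # nodes of a regular truss-grid dome, defined in spherical
        # coordinates (r, phi, theta) and transformed to Cartesian (x, y, z)
        nodes = [(0.0, 0.0, radius)]  # single node at the apex
        for i in range(1, rings + 1):
            phi = sweep * i / rings                  # polar angle from apex
            for j in range(segments):
                theta = 2 * math.pi * j / segments   # azimuth
                nodes.append((radius * math.sin(phi) * math.cos(theta),
                              radius * math.sin(phi) * math.sin(theta),
                              radius * math.cos(phi)))
        return nodes

    # a quarter-sphere dome with 8 rings and 24 segments per ring
    points = dome_nodes(radius=10.0, rings=8, segments=24)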

Keywords: parametric design, structural morphology, space structures, spherical coordinate system

Procedia PDF Downloads 238
4428 Chemical vs Visual Perception in Food Choice Ability of Octopus vulgaris (Cuvier, 1797)

Authors: Al Sayed Al Soudy, Valeria Maselli, Gianluca Polese, Anna Di Cosmo

Abstract:

Cephalopods are considered a model organism with a rich behavioural repertoire. Sophisticated behaviours have been widely studied and described in different species such as Octopus vulgaris, which has evolved the largest and most complex nervous system among invertebrates. In O. vulgaris, cognitive abilities in problem-solving tasks and learning abilities are associated with long-term memory and spatial memory, mediated by highly developed sensory organs. These animals are equipped with sophisticated eyes able to discriminate colours even with a single photoreceptor type, a vestibular system, a 'lateral line analogue', a primitive 'hearing' system and olfactory organs. They can recognize chemical cues either through direct contact with odour sources using the suckers or at a distance through the olfactory organs. Cephalopods are able to detect widespread waterborne molecules with the olfactory organs. However, many volatile odorant molecules are insoluble or have very low solubility in water and must be perceived by direct contact. O. vulgaris, equipped with many chemosensory neurons located in its suckers, exhibits a peculiar behaviour that can be provocatively described as 'smell by touch'. The aim of this study is to establish the priority given to chemical vs. visual perception in food choice. Materials and methods: Three different types of food (anchovies, clams, and mussels) were used, and all sessions were recorded with a digital camera. During the acclimatization period, octopuses were exposed to the three types of food to test their natural food preferences. Later, to verify whether the food preference was maintained, food was provided in transparent screw-jars with pierced lids, allowing both visual and chemical recognition of the food inside. Subsequently, we alternately tested octopuses with food in sealed transparent screw-jars and food in opaque screw-jars with pierced lids. As a control, we used opaque sealed jars with the same lid colour to verify a random choice among food types. Results and discussion: During the acclimatization period, O. vulgaris showed a higher preference for anchovies (60%), followed by clams (30%), then mussels (10%). After acclimatization, with the transparent and pierced screw-jars, the octopuses' food choices were split 50-50 between anchovies and clams, avoiding mussels. Later, guided by the visual sense alone (transparent but unpierced jars), their food preference was 100% anchovies. With pierced but opaque jars, anchovies were the first food choice in 100% of cases, with clams as a second choice (33.3%). With no possibility of selecting food either by vision or by chemoreception, the results were 20% anchovies, 20% clams, and 60% mussels. We conclude that O. vulgaris uses both the chemical and the visual sense in an integrative way in food choice, but if one of them is excluded, it appears clear that its food preference relies on the chemical sense more than on visual perception.

Keywords: food choice, Octopus vulgaris, olfaction, sensory organs, visual sense

Procedia PDF Downloads 201
4427 The Relationship between Knowledge Management Processes and Strategic Thinking at the Organization Level

Authors: Bahman Ghaderi, Hedayat Hosseini, Parviz Kafche

Abstract:

The role of knowledge management processes in achieving the strategic goals of organizations is crucial. To this end, the relationship between knowledge management processes and the different aspects of strategic thinking (followed by long-term organizational planning) should be understood. This research examines the relationship between each of the five knowledge management processes (creation, storage, transfer, audit, and deployment) and each dimension of strategic thinking (vision, creativity, systematic thinking, communication and analysis) in one of the major sectors of the food industry in Iran. In this research, knowledge management and its dimensions (knowledge acquisition, knowledge storage, knowledge transfer, knowledge auditing, and finally knowledge utilization) are considered as independent variables, and strategic thinking and its dimensions (creativity, systematic thinking, vision, strategic analysis, and strategic communication) are considered as the dependent variable. The statistical population of this study consisted of 245 managers and employees of the Minoo Food Industrial Group in Tehran. A simple random sampling method was used, and data were collected by a questionnaire designed by the research team. Data were analyzed using SPSS 21 software, and LISREL software was used for calculating and drawing the models and graphs. Among the factors investigated in the present study, knowledge storage, at 0.78, had the largest effect, and knowledge transfer, at 0.62, had the smallest effect on knowledge management and thus on strategic thinking.

Keywords: knowledge management, strategic thinking, knowledge management processes, food industry

Procedia PDF Downloads 156
4426 Application of Systems Engineering Tools and Methods to Improve Healthcare Delivery Inside the Emergency Department of a Mid-Size Hospital

Authors: Mohamed Elshal, Hazim El-Mounayri, Omar El-Mounayri

Abstract:

The emergency department (ED) can be considered a complex system of interacting entities: patients, human resources, software and hardware systems, interfaces, and other systems. This paper presents research on implementing a detailed Systems Engineering (SE) approach in a mid-size hospital in central Indiana. The methodology is applied by 'The Initiative for Product Lifecycle Innovation (IPLI)' institution at Indiana University to study and solve the crowding problem with the aim of increasing patient throughput and enhancing the treatment experience; the nature of the crowding problem therefore needs to be investigated together with all the other problems that lead to it. The SE methods presented are workflow analysis and systems modeling, where SE tools such as Microsoft Visio are used to construct a group of system-level diagrams that demonstrate: the patient's workflow, documentation and communication flow, data systems, human resources workflow and requirements, the leadership involved, and the integration between the ED's different systems. The ultimate goal is managing the process through the implementation of an executable model built with commercial software tools, which will identify bottlenecks, improve documentation flow, and help make the process faster.

Keywords: systems modeling, ED operation, workflow modeling, systems analysis

Procedia PDF Downloads 167
4425 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity

Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish

Abstract:

Stack Overflow is a popular community question and answer portal which is used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official software programming languages documentation. While tools have looked to aid developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through many possible answers to their questions, and this extends the development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers by using various modeling techniques. However, less interest is dedicated to examining the performance and quality of typically used modeling methods, and especially in relation to models’ and features’ complexity. Such insights could be of practical significance to the many practitioners that use Stack Overflow. This study examines the performance and quality of various modeling methods that are used for predicting acceptable answers on Stack Overflow, drawn from 2014, 2015 and 2016. Our findings reveal significant differences in models’ performance and quality given the type of features and complexity of models used. Researchers examining classifiers’ performance and quality and features’ complexity may leverage these findings in selecting suitable techniques when developing prediction models.
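
As a hedged illustration of the kind of prediction model examined (a random forest over answer features, matching one of the paper's keywords), the sketch below trains on synthetic stand-in data; the feature set, labels and thresholds are assumptions for illustration, not the study's data set:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    rng = np.random.default_rng(0)
    n = 1000
    # hypothetical per-answer features: length, code snippets,
    # answerer reputation, minutes until the answer was posted
    X = np.column_stack([
        rng.integers(20, 2000, n),
        rng.integers(0, 5, n),
        rng.integers(1, 50000, n),
        rng.exponential(60, n),
    ])
    y = (X[:, 2] > 10000).astype(int)  # toy stand-in for "accepted"

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))
    # feature importances hint at which (here invented) features drive acceptance
    print(dict(zip(["length", "code", "reputation", "delay"],
                   model.feature_importances_)))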

Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow

Procedia PDF Downloads 121
4424 A Study on the Nostalgia Contents Analysis of Hometown Alumni in the Online Community

Authors: Heejin Yun, Juanjuan Zang

Abstract:

This study aims to analyze the text terms posted on an online community of people from the same hometown and to understand the topics and trends of nostalgia composed online. For this purpose, the study collected 144 writings that natives of Yeongjong Island, Incheon, South Korea had posted on an online community, and analyzed their association relations. As a result, the online community texts show that defining nostalgia simply as 'a mind longing for one's hometown' is not a sufficient explanation. Second, texts composed online tend toward abstraction rather than individuals' personal stories. Through literature research and association rules concerning nostalgia, this study identified the relationships with the most critical and closest mutual associations among the terms that constitute nostalgia. The result of this study summarizes the core terms and emotions related to nostalgia.
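
Association-rule mining of this kind can be sketched with the mlxtend package; a minimal sketch, assuming the writings have been reduced to per-posting term sets (the terms below are invented stand-ins, not the study's data):

    import pandas as pd
    from mlxtend.preprocessing import TransactionEncoder
    from mlxtend.frequent_patterns import apriori, association_rules

    # each posting reduced to the set of terms it contains (toy data)
    posts = [
        ["hometown", "sea", "childhood"],
        ["hometown", "ferry", "sea"],
        ["school", "childhood", "hometown"],
        ["ferry", "sea", "hometown"],
    ]

    te = TransactionEncoder()
    onehot = te.fit(posts).transform(posts)
    df = pd.DataFrame(onehot, columns=te.columns_)

    # frequent term sets, then rules ranked by confidence
    frequent = apriori(df, min_support=0.5, use_colnames=True)
    rules = association_rules(frequent, metric="confidence", min_threshold=0.7)
    print(rules[["antecedents", "consequents", "support", "confidence"]])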

Keywords: nostalgia, cultural memory, data mining, association rule

Procedia PDF Downloads 216
4423 Characterizing the Rectification Process for Designing Scoliosis Braces: Towards Digital Brace Design

Authors: Inigo Sanz-Pena, Shanika Arachchi, Dilani Dhammika, Sanjaya Mallikarachchi, Jeewantha S. Bandula, Alison H. McGregor, Nicolas Newell

Abstract:

The use of orthotic braces for adolescent idiopathic scoliosis (AIS) patients is the most common non-surgical treatment to prevent deformity progression. The traditional method to create an orthotic brace involves casting the patient's torso to obtain a representative geometry, which is then rectified by an orthotist to the desired geometry of the brace. Recent improvements in 3D scanning technologies, rectification software, CNC, and additive manufacturing processes have made it possible to complement, or in some cases replace, manual methods with digital approaches. However, the rectification process remains dependent on the orthotist's skills. The rectification process therefore needs to be carefully characterized to ensure that braces designed through a digital workflow are as efficient as those created using a manual process. The aim of this study is to compare 3D scans of patients with AIS against 3D scans of both pre- and post-rectified casts that have been manually shaped by an orthotist. Six AIS patients were recruited from the Ragama Rehabilitation Clinic, Colombo, Sri Lanka. All patients were between 10 and 15 years old, were skeletally immature (Risser grade 0-3), and had Cobb angles between 20-45°. Seven spherical markers were placed at key anatomical locations on each patient's torso and on the pre- and post-rectified molds so that distances could be reliably measured. 3D scans were obtained of 1) the patient's torso and pelvis, 2) the patient's pre-rectification plaster mold, and 3) the patient's post-rectification plaster mold using a Structure Sensor Mark II 3D scanner (Occipital Inc., USA). 3D stick body models were created for each scan to represent the distances between anatomical landmarks, and these were used to analyze the changes in position and orientation of the anatomical landmarks between scans using the open-source software Blender. 3D surface deviation maps represented volume differences between the scans using the open-source software CloudCompare. The 3D stick body models showed changes in the position and orientation of thorax anatomical landmarks between the patient and the post-rectification scans for all patients. Anatomical landmark position and volume differences were seen between the 3D scans of the patients' torsos and the pre-rectified molds. Between the pre- and post-rectified molds, material removal was consistently seen on the anterior side of the thorax and the lateral areas below the ribcage. Volume differences were seen in areas where the orthotist planned to place pressure pads (usually at the trochanter on the side to which the lumbar curve was tilted (trochanter pad), at the lumbar apical vertebra (lumbar pad), on the rib connected to the apical vertebrae at the mid-axillary line (thoracic pad), and on the ribs corresponding to the upper thoracic vertebra (axillary extension pad)). The rectification process requires the skill and experience of an orthotist; however, this study demonstrates that the brace shape, location, and volume of material removed from the pre-rectification mold can be characterized and quantified. Results from this study can be fed into software to accelerate the brace design process and take steps towards an automated digital rectification process.
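
The scan-to-scan surface deviation analysis can be illustrated with a nearest-neighbour (cloud-to-cloud) distance computation; a minimal Python sketch with synthetic point clouds standing in for the CloudCompare workflow used in the study:

    import numpy as np
    from scipy.spatial import cKDTree

    def surface_deviation(reference, compared):
        # for every point of the compared scan, the distance to the
        # nearest point of the reference scan (cloud-to-cloud distance)
        tree = cKDTree(reference)
        distances, _ = tree.query(compared)
        return distances

    # toy stand-ins for two 3D scans (N x 3 arrays of points)
    rng = np.random.default_rng(1)
    pre_mold = rng.normal(size=(5000, 3))
    post_mold = pre_mold + rng.normal(scale=0.01, size=(5000, 3))

    dev = surface_deviation(pre_mold, post_mold)
    print(f"mean deviation: {dev.mean():.4f}, max: {dev.max():.4f}")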

Keywords: additive manufacturing, orthotics, scoliosis brace design, sculpting software, spinal deformity

Procedia PDF Downloads 133
4422 Language Learning Strategies of Chinese Students at Suan Sunandha Rajabhat University in Thailand

Authors: Gunniga Anugkakul, Suwaree Yordchim

Abstract:

The objectives were to study the language learning strategies (LLSs) employed by Chinese students and the frequency with which they used them, and to examine the relationship between the use of LLSs and gender. The Strategy Inventory for Language Learning (SILL) by Oxford was administered to thirty-six Chinese students at Suan Sunandha Rajabhat University in Thailand. The data obtained were analyzed using descriptive statistics and chi-square tests. Three useful findings emerged on the use of LLSs reported by the Chinese students. First, the students used overall LLSs at a high level. Second, among the six strategy groups, they employed the compensation strategy most frequently and the memory strategy least frequently. Third, the results also revealed that gender had a significant effect on the students' use of overall LLSs.
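
The chi-square test of the LLS-gender relationship can be sketched as follows; the contingency counts below are invented for illustration and do not reproduce the study's data:

    import numpy as np
    from scipy.stats import chi2_contingency

    # hypothetical contingency table: rows = gender, columns = level of
    # overall strategy use (low / medium / high)
    table = np.array([
        [3, 8, 7],   # male
        [2, 6, 10],  # female
    ])
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.3f}, p = {p:.3f}, dof = {dof}")
    # p < 0.05 would indicate a significant association between
    # gender and reported overall strategy use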

Keywords: English language, language learning strategy, Chinese students, compensation strategy

Procedia PDF Downloads 663
4421 An Application of E-Learning Technology for Students with Deafness and Hearing Impairment

Authors: Eyup Bayram Guzel

Abstract:

There has been growing awareness that technology offers unique and promising advantages, providing up-to-date educational materials, promoting teaching and learning, and supplying new strategies for building an enhanced communication environment for people with disabilities and, as this study concentrates on, for students with deafness and hearing impairments. Creating an e-learning environment where teachers and students work in collaboration to develop better educational outcomes is the foremost reason for conducting this research. This study examined the perspectives of special education teachers regarding an application of e-learning software called Multimedia Builder with students with deafness and hearing impairments. Initial and follow-up interviews were conducted with 15 special education teachers within the scope of a qualitative case study. A grounded approach was used to analyse and interpret the data. The results revealed that the Multimedia Builder software was influential on reading, sign language and vocabulary improvements, on computer and ICT usage development and on audio-visual learning achievements for students with deafness and hearing impairments. The implications of the study encourage ways of using e-learning tools and strategies to promote unique and comprehensive learning experiences for the targeted students and their teachers.

Keywords: e-learning, special education, deafness and hearing impairment, computer-ICT usage

Procedia PDF Downloads 424
4420 Development of an Automatic Calibration Framework for Hydrologic Modelling Using Approximate Bayesian Computation

Authors: A. Chowdhury, P. Egodawatta, J. M. McGree, A. Goonetilleke

Abstract:

Hydrologic models are increasingly used as tools to predict stormwater quantity and quality from urban catchments. However, due to a range of practical issues, most models produce gross errors in simulating complex hydraulic and hydrologic systems, and difficulty in finding a robust approach for model calibration is one of the main issues. Though automatic calibration techniques are available, they are rarely used in common commercial hydraulic and hydrologic modelling software, e.g., MIKE URBAN. This is partly due to the need for a large number of parameters and large datasets in the calibration process. To overcome this practical issue, a framework for automatic calibration of a hydrologic model was developed on the R platform and is presented in this paper. The model was developed based on the time-area conceptualization. Four calibration parameters (initial loss, reduction factor, time of concentration and time-lag) were considered as the primary set of parameters. Using these parameters, automatic calibration was performed using Approximate Bayesian Computation (ABC). ABC is a simulation-based technique for performing Bayesian inference when the likelihood is intractable or computationally expensive to compute. To test its performance and usefulness, the technique was used to simulate three small catchments on the Gold Coast. For comparison, simulation outcomes for the same three catchments from the commercial modelling software MIKE URBAN were used. The graphical comparison shows strong agreement of the MIKE URBAN results with the upper and lower 95% credible intervals of the posterior predictions obtained via ABC. Statistical validation of the posterior runoff predictions using the coefficient of determination (CD), root mean square error (RMSE) and maximum error (ME) was found reasonable for the three study catchments. The main benefit of using ABC over MIKE URBAN is that ABC provides a posterior distribution for the runoff flow prediction, so the associated uncertainty in predictions can be obtained, whereas MIKE URBAN provides just a point estimate. Based on the results of the analysis, the developed ABC framework appears to perform well for automatic calibration.
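
The ABC rejection idea can be sketched generically in Python (the study's framework runs on the R platform with a time-area model; the toy rainfall-runoff model, priors and tolerance below are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(42)

    def runoff_model(rain, initial_loss, reduction_factor):
        # toy stand-in for a time-area rainfall-runoff model:
        # subtract an initial loss, then scale by a runoff coefficient
        effective = np.clip(rain - initial_loss, 0.0, None)
        return reduction_factor * effective

    # observed hydrograph (here: synthetic truth plus noise)
    rain = rng.gamma(2.0, 3.0, size=50)
    observed = runoff_model(rain, 1.5, 0.6) + rng.normal(0, 0.2, size=50)

    def distance(simulated, observed):
        return np.sqrt(np.mean((simulated - observed) ** 2))  # RMSE

    # ABC rejection: sample from the priors, keep the parameter draws whose
    # simulated output lies within a tolerance of the observations
    accepted = []
    tolerance = 0.5
    for _ in range(20000):
        il = rng.uniform(0.0, 5.0)   # prior: initial loss
        rf = rng.uniform(0.0, 1.0)   # prior: reduction factor
        if distance(runoff_model(rain, il, rf), observed) < tolerance:
            accepted.append((il, rf))

    posterior = np.array(accepted)  # approximate posterior sample
    print(posterior.shape[0], "accepted draws")
    print("posterior means:", posterior.mean(axis=0))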

Keywords: automatic calibration framework, approximate bayesian computation, hydrologic and hydraulic modelling, MIKE URBAN software, R platform

Procedia PDF Downloads 287
4419 Use of Large Eddy Simulations Model to Simulate the Flow of Heavy Oil-Water-Air through Pipe

Authors: Salim Al Jadidi, Shian Gao, Shivananda Moolya

Abstract:

A Computational Fluid Dynamics (CFD) technique coupled with a Sub-Grid-Scale (SGS) model is used to study the behaviour of heavy oil-water-air flow in a horizontal pipe using ANSYS Fluent CFD software. A technique suitable for the transport of water-lubricated heavy viscous oil in a horizontal pipe is the Core Annular Flow (CAF) technique. The present study focuses on the numerical study of CAF using Large Eddy Simulation (LES). The basic objective is to gain basic knowledge of the flow behaviour of heavy oil in turbulent CAF through a conventional horizontal pipe. This work also examines the success and applicability of LES. The simulation of heavy oil-water-air three-phase flow and of two-phase heavy oil-water flow in a conventional horizontal pipe is performed using ANSYS Fluent 16.2 software. The three-phase heavy oil-water-air flow in the selected pipe is affected by gravity, and it is also observed from the results that the air phase and the variation in temperature impact the behaviour of the annular stream and the pressure drop. Some results obtained during the study were validated against experiments and simulations from the literature, and they show reasonably good agreement.

Keywords: computational fluid dynamics, gravity, heavy viscous oil, three-phase flow

Procedia PDF Downloads 63
4418 Design and Application of NFC-Based Identity and Access Management in Cloud Services

Authors: Shin-Jer Yang, Kai-Tai Yang

Abstract:

In response to a changing world and the fast growth of the Internet, more and more enterprises are replacing web-based services with cloud-based ones. Multi-tenancy technology is becoming more important, especially with Software as a Service (SaaS). This in turn leads to a greater focus on the application of Identity and Access Management (IAM). Conventional Near-Field Communication (NFC) based verification relies on a computer browser and a card reader to access an NFC tag; this type of verification supports neither mobile-device login nor user-based access management functions. This study designs an NFC-based third-party cloud identity and access management scheme (NFC-IAM) addressing this shortcoming. Data from simulation tests analyzed with Key Performance Indicators (KPIs) suggest that the NFC-IAM not only takes less time for identity identification but also cuts two-factor authentication time by 80% and improves verification accuracy to 99.9% or better. In functional performance analyses, the NFC-IAM performed better in scalability and portability. The NFC-IAM app and back-end system, to be developed and deployed on mobile devices, support IAM features and also offer users a more user-friendly experience and stronger security protection. In the future, the NFC-IAM can be employed in different environments, including identification for mobile payment systems and permission management for remote equipment monitoring, among other applications.

Keywords: cloud service, multi-tenancy, NFC, IAM, mobile device

Procedia PDF Downloads 419
4417 A New Class of Conjugate Gradient Methods Based on a Modified Search Direction for Unconstrained Optimization

Authors: Belloufi Mohammed, Sellami Badreddine

Abstract:

Conjugate gradient methods have played a special role in solving large-scale optimization problems due to the simplicity of their iteration, their convergence properties and their low memory requirements. In this work, we propose a new class of conjugate gradient methods which ensures sufficient descent. Moreover, we propose a new search direction combined with the Wolfe line search technique for solving unconstrained optimization problems; a global convergence result for general functions is established provided that the line search satisfies the Wolfe conditions. Our numerical experiments indicate that the proposed methods are preferable and in general superior to the classical conjugate gradient methods in terms of efficiency and robustness.
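
For illustration, a generic nonlinear conjugate gradient iteration with a Wolfe line search might look as follows in Python; this sketch uses the classical Fletcher-Reeves coefficient, not the authors' modified search direction:

    import numpy as np
    from scipy.optimize import line_search

    def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=500):
        # nonlinear conjugate gradient with a Wolfe line search
        x = x0.astype(float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            alpha = line_search(f, grad, x, d)[0]  # satisfies Wolfe conditions
            if alpha is None:          # line search failed: restart with
                d = -g                 # the steepest-descent direction
                alpha = line_search(f, grad, x, d)[0]
                if alpha is None:
                    break
            x = x + alpha * d
            g_new = grad(x)
            beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
            d = -g_new + beta * d
            g = g_new
        return x

    # usage: minimize the Rosenbrock function from a standard start point
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(nonlinear_cg(f, grad, np.array([-1.2, 1.0])))  # approaches (1, 1)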

Keywords: unconstrained optimization, conjugate gradient method, sufficient descent property, numerical comparisons

Procedia PDF Downloads 387
4416 The Involvement of the Homing Receptors CCR7 and CD62L in the Pathogenesis of Graft-Versus-Host Disease

Authors: Federico Herrera, Valle Gomez García de Soria, Itxaso Portero Sainz, Carlos Fernández Arandojo, Mercedes Royg, Ana Marcos Jimenez, Anna Kreutzman, Cecilia Muñoz Calleja

Abstract:

Introduction: Graft-versus-host disease (GVHD) remains the major complication associated with allogeneic stem cell transplantation (SCT). The pathogenesis involves the migration of donor naïve T-cells into the recipient's secondary lymphoid organs. Two molecules are important in this process: CD62L and CCR7, which are characteristically expressed on naïve/central memory T-cells. With this background, we aimed to study the influence of CCR7 and CD62L on donor lymphocytes in the development and severity of GVHD. Material and methods: This single-centre study included 98 donor-recipient pairs. Samples were collected prospectively from the apheresis product and phenotyped by flow cytometry. CCR7 and CD62L expression in CD4+ and CD8+ T-cells was compared between patients who developed acute (n=40) or chronic GVHD (n=33) and those who did not (n=38). Results: The patients who developed acute GVHD were transplanted with a higher percentage of CCR7+CD4+ T-cells (p=0.05) compared to the no-GVHD group. These results were confirmed when the patients were divided into grades according to the severity of the disease: the more severe the disease, the higher the percentage of CCR7+CD4+ T-cells. Conversely, chronic GVHD patients received a higher percentage of CCR7+CD8+ T-cells (p=0.02) in comparison to those who did not develop the complication. These data were also confirmed when patients were subdivided by disease severity. A multivariable analysis confirmed that the percentage of CCR7+CD4+ T-cells is a predictive factor for acute GVHD, whereas the percentage of CCR7+CD8+ T-cells is a predictive factor for chronic GVHD. In vitro functional assays (migration and activation assays) supported the idea that CCR7+ T-cells are involved in the development of GVHD. As low levels of CD62L expression were detected in all apheresis products, we tested the hypothesis that CD62L is shed during the apheresis procedure. Comparing CD62L surface levels on T-cells from the same donor immediately before collection of the apheresis product and in the final apheresis product, we found that the procedure down-regulated CD62L on both CD4+ and CD8+ T-cells (p=0.008). Interestingly, when CD62L levels were analysed at days 30 or 60 after engraftment, they recovered to baseline (p=0.008). However, when the relation between CD62L expression and the development of GVHD was investigated in recipient samples after engraftment, no differences were observed between patients with GVHD and those who did not develop the disease. Discussion: Our prospective study indicates that CCR7+ T-cells from the donor, which include naïve and central memory T-cells, contain the alloreactive cells with a high ability to mediate GVHD (in terms of both migration and activation). We therefore suggest that the proportion and functional properties of CCR7+CD4+ and CCR7+CD8+ T-cells in the apheresis product could act as predictive biomarkers of acute and chronic GVHD, respectively. Importantly, our study shows that CD62L is lost during apheresis and is therefore not a reliable biomarker for the development of GVHD.

Keywords: CCR7, CD62L, GVHD, SCT

Procedia PDF Downloads 273
4415 Knowledge Management in the Interactive Portal for Decision Makers on InKOM Example

Authors: K. Marciniak, M. Owoc

Abstract:

Managers as decision-makers in different sectors should be supported in an efficient and increasingly sophisticated way. A huge number of software tools have been developed for such users, ranging from simple registration of data from the business area, typical for the operational level of management, up to intelligent techniques delivering knowledge for the tactical and strategic levels of management. It is a big challenge for software developers to create intelligent management dashboards that support different decisions. In more advanced solutions there is even an option to select the intelligent techniques useful for managers in a particular decision-making phase in order to deliver a valid knowledge base. Such a tool, called the Intelligent Dashboard for SME Managers (InKOM), has been prepared in the Business Intelligence framework of Teta products. The aim of the paper is to present the solutions assumed for InKOM concerning the management of stored knowledge bases offered to business managers. The paper is organized as follows. After a short introduction concerning the research context, the support of managers via information systems is discussed and the InKOM platform is presented. In the crucial part of the paper, the process of knowledge transformation and validation is demonstrated. We focus on potential and real ways of acquiring, storing and validating knowledge bases. This allows for the formulation of conclusions that are interesting from a knowledge engineering point of view.

Keywords: business intelligence, decision support systems, knowledge management, knowledge transformation, knowledge validation, managerial systems

Procedia PDF Downloads 499
4414 Piql Preservation Services - A Holistic Approach to Digital Long-Term Preservation

Authors: Alexander Rych

Abstract:

Piql Preservation Services ("Piql") is a turnkey solution designed for secure, migration-free long-term preservation of digital data. Piql sets an open standard for long-term preservation for the future. It consists of the equipment and processes needed for writing and retrieving digital data. Exponentially growing amounts of data demand logistically effective and cost-effective processes. Digital storage media (hard disks, magnetic tape) exhibit limited lifetimes, and repetitive data migration to overcome the rapid obsolescence of hardware and software bears an accelerated risk of data loss, corruption or even manipulation and adds significant recurring costs for hardware and software investments. Piql stores any kind of data, in digital as well as analog form, securely for 500 years. The medium that provides this is a film reel. Using photosensitive film on a polyester base, a very stable material known for its immutability over hundreds of years, secure and cost-effective long-term preservation can be provided. The film reel itself is stored in packaging capable of protecting the optical storage medium. These components have undergone extensive testing to ensure longevity of up to 500 years. In addition to its durability, film is a true WORM (write once, read many) medium and is therefore resistant to editing or manipulation. Being able to store any form of data on the film makes Piql a superior solution for long-term preservation: paper documents, images, video or audio sequences can all be preserved in their native file structure. In order to restore the encoded digital data, only a film scanner, a digital camera or any appropriate optical reading device will be needed in the future. Every film reel includes an index section describing the data saved on the film. It also contains a content section carrying metadata, enabling users in the future to rebuild software in order to read and decode the digital information.

Keywords: digital data, long-term preservation, migration-free, photosensitive film

Procedia PDF Downloads 381
4413 Assessment of Rock Masses Performance as a Support of Lined Rock Cavern for Isothermal Compressed Air Energy Storage

Authors: Vathna Suy, Ki-Il Song

Abstract:

In order to store highly pressurized gas, such as for isothermal compressed air energy storage, Lined Rock Caverns (LRC) are constructed underground and supported by layers of concrete, steel and rock masses. This study aims to numerically investigate the performance of the rock masses that serve as support for a Lined Rock Cavern subjected to high cyclic pressure loadings. The FLAC3D finite difference software is used for the simulation, since the software can effectively model the behaviour of the concrete lining and steel plate with its built-in structural elements. Cyclic pressure loadings are applied to the inner surface of the cavern and are then transmitted to the concrete, the steel and eventually the surrounding rock masses. Changes in stress and strain are constantly monitored throughout the loading operations. The results at various monitoring locations are then extracted and analyzed to assess the response of the rock masses, specifically their ability to absorb energy during the loadings induced by the changes of cyclic pressure inside the cavern. By analyzing the obtained stress-strain data and taking into account the behaviour of the materials under the effect of strain-dependency, conclusions on the performance of rock masses subjected to high cyclic loading conditions are drawn.

Keywords: cyclic loading, FLAC3D, lined rock cavern (LRC), strain-dependency

Procedia PDF Downloads 233
4412 Visitor Discourses of European Holocaust Heritage: A Netnography

Authors: Craig Wight

Abstract:

This presentation will identify the key findings from a recent netnographic discourse analysis of social media content generated in response to visits to three iconic European Holocaust heritage sites: Anne Frank's House in Amsterdam, the Netherlands; the Auschwitz-Birkenau Memorial and Museum in Poland; and the Jewish Museum in Berlin, Germany. Four major discourses are identified under the headings of Holocaust heritage as social memory, reactions to Holocaust heritage, obligation and ritual, and transgressive visitor behaviour. Together, these discourses frame the values, existential anxieties, emotions, priorities, and expectations of visitors. The findings will interest those involved in the planning and management of Holocaust heritage for tourism purposes, since they provide unique access to an archive of unmediated visitor feedback on European Holocaust heritage experiences.

Keywords: Foucault, European Holocaust heritage, discourse analysis, netnography, social media, dark tourism

Procedia PDF Downloads 125
4411 Emotion Regulation and Executive Functioning Scale for Children and Adolescents (REMEX): Scale Development

Authors: Cristina Costescu, Carmen David, Adrian Roșan

Abstract:

Executive functions (EF) and emotion regulation strategies are processes that allow individuals to function in an adaptive, goal-oriented way, which is essential for success in daily living activities, at school, and in social contexts. The Emotion Regulation and Executive Functioning Scale for Children and Adolescents (REMEX) is an empirically based tool (based on the model of EF developed by Diamond) for evaluating significant dimensions of child and adolescent EF and emotion regulation strategies, mainly in school contexts. The instrument measures the following dimensions: working memory, inhibition, cognitive flexibility, executive attention, planning, emotional control, and emotion regulation strategies. Building the instrument involved not only a top-down process, as we selected the content in accordance with prominent models of EF, but also a bottom-up one, as we were able to identify valid contexts in which EF and ER are put to use. For the construction of the instrument, we conducted three focus groups with teachers and other professionals, since the aim was to develop an accurate, objective, and ecological instrument. We used the focus group method in order to address each dimension and to yield a bank of items to be tested further. Each dimension is addressed through a task that the examiner applies and through several items derived from the main task. For the validation of the instrument, we plan to use item response theory (IRT), also known as latent trait theory, which attempts to explain the relationship between latent traits (unobservable cognitive processes) and their manifestations (i.e., observed outcomes, responses, or performance). REMEX is an ecological scale that integrates the current scientific understanding of emotion regulation and EF, is directly applicable to school contexts, and can be very useful for developing intervention protocols. We plan to test its convergent validity against the Childhood Executive Functioning Inventory (CHEXI) and the Emotion Dysregulation Inventory (EDI), and its divergent validity between a group of typically developing children and children with neurodevelopmental disorders aged between 6 and 9 years old. In a previous pilot study, we enrolled a sample of 40 children with autism spectrum disorder or attention-deficit/hyperactivity disorder aged 6 to 12 years old and applied the above-mentioned scales (CHEXI and EDI). Our results showed that deficits in planning, behaviour regulation, inhibition, and working memory predict high levels of emotional reactivity, leading to emotional and behavioural problems. Considering these previous results, we expect our findings to provide support for the validity and reliability of the REMEX as an ecological instrument for assessing emotion regulation and EF in children, and for key features of its use in intervention protocols.

Keywords: executive functions, emotion regulation, children, item response theory, focus group

Procedia PDF Downloads 87
4410 Effect of Exercise and Mindfulness on Cognitive and Psycho-Emotional Functioning in Children with ADHD

Authors: Hannah Bigelow, Marcus D. Gottlieb, Michelle Ogrodnik, Jeffrey D. Graham, Barbara Fenesi

Abstract:

Attention Deficit Hyperactivity Disorder (ADHD) is one of the most common neurodevelopmental disorders, affecting approximately 6% of children worldwide. ADHD is characterized by a combination of persistent deficits, including impaired inhibitory control, working memory and task-switching. Many children with ADHD also have comorbid mental health issues such as anxiety and depression. There are several treatment options to manage ADHD impairments, including drug and behavioural management therapy, but they all have drawbacks, such as worsening mood disturbances or being inaccessible to certain demographics. Both physical exercise and mindfulness meditation serve as alternative options that may help mitigate ADHD symptoms. Although there is extensive support for the benefits of long-term physical exercise or mindfulness meditation programs, there is insufficient research investigating how acute bouts (i.e., single, short bouts) can help children with ADHD. Thus, the current study aimed to understand how single, short bouts of exercise and mindfulness meditation impact executive functioning and psycho-emotional well-being in children with ADHD, and to directly compare the efficacy of these two interventions. The study used a pre-post-test, within-subjects design to assess the effects of a 10-minute bout of moderate-intensity exercise versus a 10-minute bout of mindfulness meditation (versus 10 minutes of a reading control) on the executive functioning and psycho-emotional well-being of 16 children and youth with ADHD aged 10-14 (male=11; White=80%). Participants completed all three interventions: 10 minutes of exercise, 10 minutes of mindfulness meditation, and 10 minutes of reading (control). Executive functioning (inhibitory control, working memory, task-switching) and psycho-emotional well-being (mood, self-efficacy) were assessed before and after each intervention. Mindfulness meditation promoted executive functioning, while exercise enhanced positive mood and self-efficacy. Critically, this work demonstrates that a single, short mindfulness meditation session can promote inhibitory control among children with ADHD. This is especially important for children with ADHD, as inhibitory control deficits are among the most pervasive challenges that they face. Furthermore, the current study provides preliminary evidence for the benefit of acute exercise in promoting positive mood and general self-efficacy for children and youth with ADHD. These results may increase the accessibility of acute exercise for children with ADHD, providing guardians and teachers a feasible option of incorporating just 10 minutes of exercise to assist children emotionally. In summary, this research supports the use of acute exercise and mindfulness meditation for varying aspects of executive functioning and psycho-emotional well-being in children and youth with ADHD. This work offers important insight into how behavioural interventions could be personalized according to a child's needs.

Keywords: attention-deficit hyperactivity disorder (ADHD), acute exercise, mindfulness meditation, executive functioning, psycho-emotional well-being

Procedia PDF Downloads 115
4409 Evaluation of the Factors Affecting Violence Against Women (Case Study: Couples Referring to Family Counseling Centers in Tehran)

Authors: Hassan Manouchehri

Abstract:

The present study aimed to identify and evaluate the factors affecting violence against women. The statistical population included all couples referring to family counseling centers in Tehran due to domestic violence during the past year. A total of 305 people were selected as the statistical sample using simple random sampling and Cochran's formula for an unlimited population. A researcher-made questionnaire comprising 110 items was used for data collection. The face and content validity of the questionnaire were confirmed by 30 experts, and its reliability was above 0.7 for all studied variables in a preliminary test with 30 subjects, which is acceptable. To analyze the data, descriptive statistical methods were applied with SPSS software version 22, and inferential statistics were used for structural equation modeling in SmartPLS software version 2. Evaluation of the theoretical framework and of domestic and foreign studies indicated that, in general, four main factors, namely cultural and social factors, economic factors, legal factors, and medical factors, underlie violence against women. In addition, the structural equation modeling findings indicated that cultural and social factors, economic factors, legal factors, and medical factors all affect violence against women.

Keywords: violence against women, cultural and social factors, economic factors, legal factors, medical factors

Procedia PDF Downloads 123
4408 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution

Authors: Najrullah Khan, Athar Ali Khan

Abstract:

The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone generalized exponential (TPGE) distribution. A real survival data set is used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made. R and JAGS code has been provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data, with emphasis placed on the modeling of the data and the interpretation of the results. Crucial to this is an understanding of the nature of the incomplete or 'censored' data encountered. Analytic approximation and simulation tools are covered here, but most of the emphasis is on Markov chain based Monte Carlo methods, including the independent Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among the various optimization algorithms, the trust region method was found to be the best. In this paper, the TPGE model is also used to analyze lifetime data in the Bayesian paradigm. Results are evaluated on the above-mentioned real survival data set, and the analytic approximation and simulation methods are implemented using software packages. It is clear from our findings that the simulation tools provide better results than those obtained by asymptotic approximation.
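
The independent Metropolis algorithm mentioned above can be sketched generically in Python; the target below is a toy Gamma posterior, not the TPGE survival model, and the proposal distribution is an arbitrary illustrative choice:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)

    # toy target: a posterior proportional to a Gamma(3, 1) density
    log_target = lambda x: stats.gamma.logpdf(x, a=3.0)

    # independent Metropolis: proposals come from a fixed distribution q,
    # independent of the current state of the chain
    proposal = stats.expon(scale=3.0)

    x = 1.0
    samples = []
    for _ in range(20000):
        y = proposal.rvs(random_state=rng)
        # acceptance ratio uses target and proposal densities at both points
        log_alpha = (log_target(y) - log_target(x)
                     + proposal.logpdf(x) - proposal.logpdf(y))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        samples.append(x)

    # discard burn-in; the Gamma(3, 1) target has mean 3
    print("posterior mean estimate:", np.mean(samples[2000:]))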

Keywords: Bayesian Inference, JAGS, Laplace Approximation, LaplacesDemon, posterior, R Software, simulation

Procedia PDF Downloads 511