Search results for: data combining
23916 The Necessity to Standardize Procedures of Providing Engineering Geological Data for Designing Road and Railway Tunneling Projects
Authors: Atefeh Saljooghi Khoshkar, Jafar Hassanpour
Abstract:
One of the main problems at the design stage of many tunneling projects is the lack of an appropriate standard for providing engineering geological data in a predefined format. This is particularly evident in highway and railroad tunnel projects, in which there are a number of tunnels and different professional teams involved. In this regard, comprehensive software needs to be designed using the accepted methods in order to help engineering geologists prepare standard reports that contain sufficient input data for the design stage. To meet this need, applied software has been designed using macro capabilities and the Visual Basic for Applications (VBA) programming language in Microsoft Excel. In this software, all of the engineering geological input data required for designing different parts of tunnels, such as discontinuity properties, rock mass strength parameters, rock mass classification systems, boreability classification, the penetration rate, and so forth, can be calculated and reported in a standard format.
Keywords: engineering geology, rock mass classification, rock mechanics, tunnel
Procedia PDF Downloads 79
23915 Self-Assembly of TaC@Ta Core-Shell-Like Nanocomposite Film via Solid-State Dewetting: Toward Superior Wear and Corrosion Resistance
Authors: Ping Ren, Mao Wen, Kan Zhang, Weitao Zheng
Abstract:
The improvement of comprehensive properties, including hardness, toughness, wear, and corrosion resistance, in transition metal carbide/nitride (TMC(N)) films, and especially avoiding the trade-off between hardness and toughness, is strongly required for various applications. Although incorporating a ductile metal (DM) phase into the TMC(N) via thermally induced phase separation has emerged as an effective approach to toughen TMC(N)-based films, the DM is limited to a few soft ductile metals (i.e., Cu, Ag, Au) immiscible with the TMC(N). Moreover, hardness is highly sensitive to the soft DM content and can be significantly worsened. Hence, a novel preparation method should be attempted to broaden the DM selection and assemble a more ordered nanocomposite structure for improving the comprehensive properties. Here, we provide a new strategy, activating solid-state dewetting during layered deposition, to accomplish the self-assembly of an ordered TaC@Ta core-shell-like nanocomposite film consisting of TaC nanocrystals encapsulated in thin pseudocrystal Ta tissue. This results in superhardness (~45.1 GPa), dominated by the Orowan strengthening mechanism, and high toughness, attributed to the indenter-induced phase transformation from pseudocrystal to body-centered cubic Ta, together with drastically enhanced wear and corrosion resistance. Furthermore, the very thin pseudocrystal Ta encapsulation layer (~1.5 nm) in the TaC@Ta core-shell-like structure helps promote the formation of a lubricious TaOₓ Magnéli phase during sliding, thereby further lowering the coefficient of friction. Apparently, solid-state dewetting may provide a new route to construct ordered TMC(N)@TM core-shell-like nanocomposites combining superhardness, high toughness, low friction, and superior wear and corrosion resistance.
Keywords: corrosion, nanocomposite film, solid-state dewetting, tribology
Procedia PDF Downloads 134
23914 The Underestimate of the Annual Maximum Rainfall Depths Due to Coarse Time Resolution Data
Authors: Renato Morbidelli, Carla Saltalippi, Alessia Flammini, Tommaso Picciafuoco, Corrado Corradini
Abstract:
A considerable part of the rainfall data used in hydrological practice is available in aggregated form within constant time intervals. This can produce undesirable effects, like the underestimation of the annual maximum rainfall depth, Hd, associated with a given duration, d, which is the basic quantity in the development of rainfall depth-duration-frequency relationships and in determining whether climate change is affecting extreme event intensities and frequencies. The errors in the evaluation of Hd from data characterized by a coarse temporal aggregation, ta, and a procedure to reduce the non-homogeneity of the Hd series are investigated here. Our results indicate that: 1) in the worst conditions, for d=ta, the estimate of a single Hd value can be affected by an underestimation error of up to 50%, while the average underestimation error for a series with at least 15-20 Hd values is less than or equal to 16.7%; 2) the underestimation errors follow an exponential probability density function; 3) each very long time series of Hd contains many underestimated values; 4) relationships between the non-dimensional ratio ta/d and the average underestimate of Hd, derived from continuous rainfall data observed at many stations in Central Italy, may overcome this issue; 5) these equations should make it possible to improve the Hd estimates and the associated depth-duration-frequency curves, at least in areas with similar climatic conditions.
Keywords: central Italy, extreme events, rainfall data, underestimation errors
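To make the aggregation effect concrete, here is a minimal sketch (with synthetic rainfall, not the Central Italy records) comparing the true Hd from a moving window with the apparent Hd from data pre-aggregated into fixed blocks with ta = d:

import numpy as np

rng = np.random.default_rng(0)
rain_5min = rng.exponential(scale=0.05, size=365 * 24 * 12)  # mm per 5-min step

def hd_sliding(rain, window):
    """True Hd for duration d = window steps: maximum over a moving window."""
    csum = np.concatenate(([0.0], np.cumsum(rain)))
    return np.max(csum[window:] - csum[:-window])

def hd_aggregated(rain, window):
    """Apparent Hd when data come pre-aggregated into fixed blocks (ta = d)."""
    n = len(rain) // window * window
    return np.max(rain[:n].reshape(-1, window).sum(axis=1))

d = 12  # 1-hour duration expressed in 5-minute steps
true_hd = hd_sliding(rain_5min, d)
coarse_hd = hd_aggregated(rain_5min, d)
print(f"underestimation error: {100 * (1 - coarse_hd / true_hd):.1f}%")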
Procedia PDF Downloads 189
23913 The Effect of the Combination of Methotrexate Nanoparticles and TiO2 on Breast Cancer
Authors: Nusaiba Al-Nemrawi, Belal Al-Husein
Abstract:
Methotrexate (MTX) is a stoichiometric inhibitor of dihydrofolate reductase, which is essential for DNA synthesis. MTX is a chemotherapeutic agent used for treating many types of cancer cells. However, cells' resistance to MTX is very common, and its pharmacokinetic behavior is highly problematic. To improve the delivery of MTX within tumor cells, we propose the encapsulation of antitumor drugs in nanoparticulate systems. Chitosan (CS) is a naturally occurring polymer that is biocompatible, biodegradable, non-toxic, cationic, and bioadhesive. CS nanoparticles (CS-NPs) have been used as drug carriers for targeted delivery. Titanium dioxide (TiO2) is a natural mineral oxide used in biomaterials due to its high stability and antimicrobial and anticorrosive properties. TiO2 has shown potential as a tumor suppressor. In this study, a new formulation of MTX loaded in CS NPs (CS-MTX NPs) and coated with titanium dioxide (TiO2) was prepared. The mean particle size, zeta potential, and polydispersity index were measured. The interaction between CS NPs and TiO2 NPs was confirmed using FTIR and XRD. CS-MTX NPs were studied in vitro using the tumor cell line MCF-7 (human breast cancer). The results showed that the CS-MTX NPs have a size of around 169 nm and, when coated with TiO2, a size that varies with the ratio of CS-MTX to TiO2 used in the preparation. All NPs (uncoated and coated) carried positive charges and were monodisperse. The entrapment efficiency was around 65%. Both FTIR and XRD proved that TiO2 interacted with CS-MTX NPs. The in vitro drug release was controlled and sustained over days. Finally, the in vitro study using the tumor cell line MCF-7 suggested that combining nanomaterials with anticancer drugs, as in CS-MTX NPs, may be more effective than free MTX for cancer treatment. In conclusion, the combination of CS-MTX NPs and TiO2 NPs showed excellent time-dependent in vitro antitumor behavior and can therefore be employed as a promising anticancer agent to attain efficient results towards MCF-7 cells.
Keywords: methotrexate, titanium dioxide, chitosan nanoparticles, cancer
Procedia PDF Downloads 93
23912 Objective Evaluation on Medical Image Compression Using Wavelet Transformation
Authors: Amhimmid Mohammed Saffour, Mustafa Mohamed Abdullah
Abstract:
The use of computers for handling image data in healthcare is growing. However, the amount of data produced by modern image generating techniques is vast. This data might be a problem from a storage point of view or when it is sent over a network. This paper uses the wavelet transform technique for medical image compression. A MATLAB program was designed to evaluate the medical image storage and transmission time problem at Sebha Medical Center, Libya. In this paper, three different computed tomography images, of the abdomen, brain, and chest, were selected and compressed using the wavelet transform. An objective evaluation was performed to measure the quality of the compressed images. In this evaluation, the results show that the Peak Signal to Noise Ratio (PSNR), which indicates the quality of the compressed image, ranges from 25.89 dB to 34.35 dB for abdomen images, 23.26 dB to 33.3 dB for brain images, and 25.5 dB to 36.11 dB for chest images. These values show that a compression ratio of nearly 30:1 is acceptable.
Keywords: medical image, MATLAB, image compression, wavelets, objective evaluation
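As an illustration of the approach, a minimal sketch follows (in Python with PyWavelets rather than the authors' MATLAB program, with a random stand-in array in place of a real CT slice) of wavelet compression by coefficient thresholding followed by PSNR evaluation:

import numpy as np
import pywt

def compress_psnr(image, wavelet="db4", level=3, keep=0.05):
    """Zero all but the largest `keep` fraction of wavelet coefficients,
    reconstruct, and return the PSNR (dB) of the compressed image."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    arr[np.abs(arr) < np.quantile(np.abs(arr), 1 - keep)] = 0.0
    rec = pywt.waverec2(pywt.array_to_coeffs(arr, slices, output_format="wavedec2"),
                        wavelet)[:image.shape[0], :image.shape[1]]
    mse = np.mean((image.astype(float) - rec) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)  # assumes an 8-bit intensity range

ct_slice = np.random.randint(0, 256, (256, 256)).astype(float)  # stand-in image
print(f"PSNR: {compress_psnr(ct_slice):.2f} dB")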
Procedia PDF Downloads 284
23911 Eli-Twist Spun Yarn: An Alternative to Conventional Sewing Thread
Authors: Sujit Kumar Sinha, Madan Lal Regar
Abstract:
Sewing thread plays an important role in the transformation of a two-dimensional fabric into a three-dimensional garment. The interaction of the sewing thread with the fabric at the seam influences not only the appearance of a garment but also its performance. Careful selection of the sewing thread and associated parameters can only help in improvement. Over the years, ring spinning has dominated the yarn market. In the pursuit of improvement, alternative technologies have been developed to challenge its dominance, but no real challenge has been posed by any of the developed spinning systems. The Eli-Twist spinning system can be a new method of yarn manufacture, providing a product with improved mechanical and physical properties with respect to conventional ring spun yarn. The system, patented by Suessen, has gained considerable attention in recent times. The process produces a two-ply compact yarn with improved fiber utilization, yielding a novel structure that combines all the advantages of condensing and doubling. In the present study, sewing threads of three different counts each, from cotton, polyester, and a polyester/cotton (50/50) blend, were produced on ring and Eli-Twist systems. A twist multiplier of 4.2 was used to produce all the yarns. Hairiness, tensile strength, and coefficient of friction were compared with those of conventional ring yarn. Eli-Twist yarn showed better frictional characteristics, better tensile strength, and less hairiness. The performance of the Eli-Twist sewing thread was also found to be better than that of the conventional 2-ply sewing thread, as estimated through the seam strength, seam elongation, and seam efficiency of the sewn fabric. Eli-Twist sewing thread thus showed less friction, less hairiness, and higher tensile strength, and resulted in better seam characteristics than conventional 2-ply sewing thread.
Keywords: ring spun yarn, Eli-Twist yarn, sewing thread, seam strength, seam elongation, seam efficiency
Procedia PDF Downloads 196
23910 Understanding the Qualitative Nature of Product Reviews by Integrating Text Processing Algorithm and Usability Feature Extraction
Authors: Cherry Yieng Siang Ling, Joong Hee Lee, Myung Hwan Yun
Abstract:
Product usability has become a basic requirement from the consumer's perspective, and failing this requirement keeps the customer from using the product. Identifying usability issues by analyzing the quantitative and qualitative data collected from usability testing and evaluation activities aids the process of product design, yet the lack of studies on analysis methodologies for qualitative text data in the usability field limits the potential of these data for more useful applications. Meanwhile, the possibility of analyzing qualitative text data has grown with the rapid development of data analysis fields such as natural language processing, which enables computers to understand human language, and machine learning, which provides predictive models and clustering tools. Therefore, this research aims to study the applicability of a text processing algorithm to the analysis of qualitative text data collected from usability activities. This research utilized datasets collected from an LG neckband headset usability experiment, consisting of headset survey text data, subject data, and product physical data. The analysis procedure, integrated with the text processing algorithm, includes mapping comments into a vector space, labeling them with the subject and product physical feature data, and clustering to validate the result of comment vector clustering. The result shows 'volume and music control button' as the usability feature that matches best with one cluster of comment vectors, whose centroid comments emphasize button positions, while the centroid comments of the other cluster emphasize button interface issues. When the volume and music control buttons were designed separately, the participants experienced less confusion, and thus the comments mentioned only the buttons' positions. When the volume and music control buttons were designed as a single button, the participants experienced interface issues regarding the buttons, such as the operating methods of functions and confusion between the functions' buttons. The relevance of the cluster centroid comments to the extracted feature demonstrates the capability of text processing algorithms to analyze qualitative text data from usability testing and evaluations.
Keywords: usability, qualitative data, text-processing algorithm, natural language processing
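The vectorize-cluster-inspect pipeline described above can be sketched as follows (a minimal illustration with hypothetical comments; TF-IDF and k-means stand in for the study's unspecified vectorizer and clustering tool):

import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

comments = [
    "volume button is hard to reach with one hand",
    "music control button position feels awkward",
    "single button confuses volume and track functions",
    "button interface mixes up play and volume modes",
]
X = TfidfVectorizer().fit_transform(comments)           # comments -> vector space
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

for k in range(2):
    # The comment nearest to a centroid characterizes that cluster.
    dists = np.linalg.norm(X.toarray() - km.cluster_centers_[k], axis=1)
    print(f"cluster {k} centroid comment: {comments[int(np.argmin(dists))]}")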
Procedia PDF Downloads 283
23909 Differentiation between Different Rangeland Sites Using Principal Component Analysis in Semi-Arid Areas of Sudan
Authors: Nancy Ibrahim Abdalla, Abdelaziz Karamalla Gaiballa
Abstract:
Rangelands in semi-arid areas provide a good source of feed for huge numbers of animals and serve environmental, economic, and social functions; therefore, these areas are considered economically very important for the pastoral sector in Sudan. This paper investigates means of differentiating between rangeland sites according to soil type using principal component analysis, to assist in monitoring and assessment. Three rangeland sites were identified in the study area: a flat sandy site, a sand dune site, and a hard clay site. Principal component analysis (PCA) was used to reduce the number of factors needed to distinguish between rangeland sites and to produce a new set of data containing the most useful spectral information for satellite image processing. It was performed using selected types of data (two vegetation indices, topographic data, and vegetation surface reflectance within three bands of MODIS data). The PCA indicated a relatively high correspondence between vegetation and soil in the total variance of the data set. The results showed that principal component analysis with the selected variables revealed clear differences, reflected in the variance and eigenvalues, and that it can be used for differentiation between range sites.
Keywords: principal component analysis, PCA, rangeland sites, semi-arid areas, soil types
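A minimal sketch of the described use of PCA (with random stand-in values for the vegetation indices, topographic data, and MODIS band reflectances) might look like this:

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = sampled plots; columns = two vegetation indices, elevation, and
# reflectance in three MODIS bands (random stand-ins for the real data).
X = np.random.default_rng(1).normal(size=(300, 6))
Xs = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(Xs)
print("explained variance ratio:", pca.explained_variance_ratio_)
scores = pca.transform(Xs)  # site separation is inspected in this 2-D space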
Procedia PDF Downloads 184
23908 The Use of Optical-Radar Remotely-Sensed Data for Characterizing Geomorphic, Structural and Hydrologic Features and Modeling Groundwater Prospective Zones in Arid Zones
Authors: Mohamed Abdelkareem
Abstract:
Remote sensing data contribute to predicting prospective areas for water resources. Here, microwave and multispectral data are integrated with climatic, hydrologic, and geological data. In this article, Sentinel-2, Landsat-8 Operational Land Imager (OLI), Shuttle Radar Topography Mission (SRTM), Tropical Rainfall Measuring Mission (TRMM), and Advanced Land Observing Satellite (ALOS) Phased Array Type L-band Synthetic Aperture Radar (PALSAR) data were utilized to identify the geological, hydrologic, and structural features of Wadi Asyuti, a defunct tributary of the Nile basin in the eastern Sahara. The image transformation of Sentinel-2 and Landsat-8 data allowed the different varieties of rock units to be characterized. Integration of microwave remotely-sensed data and GIS techniques provided information on the physical characteristics of catchments and rainfall zones, which play a crucial role in mapping groundwater prospective zones. Fused Landsat-8 OLI and ALOS/PALSAR data highlighted structural elements that are difficult to reveal using optical data alone. Lineament extraction and interpretation indicated that the area is clearly shaped by a NE-SW graben cut by a NW-SE trend. Such structures allowed the accumulation of thick sediments in the downstream area. Processing of recent OLI data acquired on March 15, 2014, verified the flood potential maps and offered the opportunity to extract the extent of the flooding zone of the recent flash flood event (March 9, 2014), as well as revealing infiltration characteristics. Several layers, including geology, slope, topography, drainage density, lineament density, soil characteristics, rainfall, and morphometric characteristics, were combined after assigning a weight to each using a GIS-based knowledge-driven approach. The results revealed that the predicted groundwater potential zones (GPZs) can be arranged into six distinctive groups depending on their groundwater probability: very low, low, moderate, high, very high, and excellent. Field and well data validated the delineated zones.
Keywords: GIS, remote sensing, groundwater, Egypt
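The knowledge-driven overlay step can be sketched as follows (a minimal illustration; the weights and rasters are placeholders, not the paper's calibrated values):

import numpy as np

shape = (100, 100)  # raster grid
rng = np.random.default_rng(2)
layers = {  # normalized suitability scores (0-1) per thematic layer
    "geology": rng.random(shape), "slope": rng.random(shape),
    "drainage_density": rng.random(shape), "lineament_density": rng.random(shape),
    "rainfall": rng.random(shape), "soil": rng.random(shape),
}
weights = {"geology": 0.25, "slope": 0.15, "drainage_density": 0.15,
           "lineament_density": 0.20, "rainfall": 0.15, "soil": 0.10}

# Weighted sum of the thematic layers gives the groundwater-potential surface.
gpz = sum(weights[name] * raster for name, raster in layers.items())
# Classify into the six groups named in the abstract.
classes = np.digitize(gpz, np.quantile(gpz, [1/6, 2/6, 3/6, 4/6, 5/6]))
labels = ["very low", "low", "moderate", "high", "very high", "excellent"]
print({labels[k]: int((classes == k).sum()) for k in range(6)})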
Procedia PDF Downloads 96
23907 Intelligent Production Machine
Authors: A. Şahinoğlu, R. Gürbüz, A. Güllü, M. Karhan
Abstract:
This study aims at production machines that automatically perceive cutting data and alter cutting parameters. The two most important parameters to be checked in the machine control unit are the feed rate and the speed, and the aim is to control these parameters through the sounds of the machine. The features of the optimum sound are introduced to the computer. During the process, real-time data are received and converted into numerical values by Matlab software. According to these values, the feed rate and speed are decreased or increased at a certain rate until the optimum sound is acquired, and the cutting process is then carried out with the optimum cutting parameters. During chip removal, the features of the cutting tools, the kind of material being cut, the cutting parameters, and the machine used affect various process variables. Instead of measuring parameters such as temperature, vibration, and tool wear that emerge during the cutting process, detailed analysis of the sound emitted during cutting provides a much easier and more economical way to detect the various data involved in the cutting process. The relation between cutting parameters and sound is being identified.
Keywords: cutting process, sound processing, intelligent lathe, sound analysis
Procedia PDF Downloads 332
23906 The Effectiveness and Accuracy of the Schulte Holt IOL Toric Calculator Processor in Comparison to Manually Input Data into the Barrett Toric IOL Calculator
Authors: Gabrielle Holt
Abstract:
This paper seeks to demonstrate the efficacy of the Schulte Holt IOL Toric Calculator Processor (Schulte Holt ITCP). The study was completed by manually inputting data into the Barrett Toric Calculator and recording the number of minutes taken to complete the toric calculations, the number of errors identified during completion, and distractions during completion. These data are then compared to the number of minutes taken by the Schulte Holt ITCP, also using the Barrett method, as well as the number of errors identified in the Schulte Holt ITCP. The data clearly demonstrate a momentous advantage for the Schulte Holt ITCP, which notably reduces the time spent doing toric calculations as well as the number of errors. With the ever-growing number of cataract surgeries taking place around the world and waitlists increasing, the Schulte Holt IOL Toric Calculator Processor may well demonstrate a way forward to increase the availability of ophthalmologists and ophthalmic staff while maintaining patient safety.
Keywords: toric, toric lenses, ophthalmology, cataract surgery, toric calculations, Barrett
Procedia PDF Downloads 91
23905 Change Point Detection Using Random Matrix Theory with Application to Frailty in Elderly Individuals
Authors: Malika Kharouf, Aly Chkeir, Khac Tuan Huynh
Abstract:
Detecting change points in time series data is a challenging problem, especially in scenarios where there is limited prior knowledge regarding the data's distribution and the nature of the transitions. We present a method designed for detecting changes in the covariance structure of high-dimensional time series data, where the number of variables closely matches the data length. Our objective is to achieve unbiased test statistic estimation under the null hypothesis. We delve into the utilization of Random Matrix Theory to analyze the behavior of our test statistic within a high-dimensional context. Specifically, we illustrate that our test statistic converges pointwise to a normal distribution under the null hypothesis. To assess the effectiveness of our proposed approach, we conduct evaluations on a simulated dataset. Furthermore, we employ our method to examine changes aimed at detecting frailty in the elderly.
Keywords: change point detection, hypothesis tests, random matrix theory, frailty in elderly
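For intuition, the following is a minimal sketch of covariance change point scanning on simulated data; it uses a naive Frobenius-norm contrast rather than the authors' RMT-based test statistic:

import numpy as np

def cov_change_scores(X, min_seg=30):
    """X: (T, p) series; contrast between pre/post sample covariances."""
    T = len(X)
    scores = np.full(T, np.nan)
    for t in range(min_seg, T - min_seg):
        c1 = np.cov(X[:t], rowvar=False)
        c2 = np.cov(X[t:], rowvar=False)
        scores[t] = np.linalg.norm(c1 - c2, ord="fro")
    return scores

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(size=(200, 10)),  # unit-covariance segment
               rng.normal(size=(200, 10)) @ np.diag(np.linspace(1, 3, 10))])
print("estimated change point:", int(np.nanargmax(cov_change_scores(X))))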
Procedia PDF Downloads 51
23904 Main Cause of Children's Deaths in Indigenous Wayuu Community from Department of La Guajira: A Research Developed through Data Mining Use
Authors: Isaura Esther Solano Núñez, David Suarez
Abstract:
The main purpose of this research is to discover what causes death in children of the Wayuu community and to analyze those results in depth in order to take corrective measures to properly control infant mortality. We consider it important to determine the reasons for early death in this specific population, since it is the most vulnerable to high-risk environmental conditions. In this way, the government, through the competent authorities, may develop prevention policies and the right measures to avoid an increase in this tragic outcome. The methodology used in this investigation is data mining, which consists of collecting and examining large amounts of data to produce new and valuable information. Through this technique, it has been possible to determine that the child population is dying mostly from malnutrition. In short, this technique has been very useful in this study; it has allowed us to transform large amounts of information into a conclusive and important statement, which has made it easier to take appropriate steps to resolve a particular situation.
Keywords: malnutrition, data mining, analytical, descriptive, population, Wayuu, indigenous
Procedia PDF Downloads 159
23903 Application of the Mobile Phone for Occupational Self-Inspection Program in Small-Scale Industries
Authors: Jia-Sin Li, Ying-Fang Wang, Cheing-Tong Yan
Abstract:
In this study, an integrated approach based on Google Spreadsheet and QR codes, which are free internet resources, was used to improve the inspection procedure. A mobile phone application (App) was also designed to combine with a web page to create an automatic checklist, providing a new integrated inspection management information system. Following a client-server model, the client App was developed for the Android mobile OS, and the back end is a web server. The system can set up App accounts, including authorization data, and store checklist documents on the website. A QR code is generated from each checklist document URL and then printed and pasted on the corresponding machine. The user scans the QR code with the App and fills in the checklist at the factory. Meanwhile, the checklist data are sent to the server, which not only saves the filled-in data but also executes the related functions and charts. The system also enables auditors and supervisors to facilitate the prevention of and response to hazards, as well as immediate report data checks. Finally, statistics and professional analysis are performed using inspection records and other relevant data, not only to improve the reliability and integrity of inspection operations and equipment loss control, but also to increase plant safety and personnel performance. Therefore, it is suggested that the traditional paper-based inspection method could be replaced by the App, which promotes industrial safety and reduces human error.
Keywords: checklist, Google Spreadsheet, App, self-inspection
Procedia PDF Downloads 117
23902 Project Management and International Development: Competencies for International Assignment
Authors: M. P. Leroux, C. Coulombe
Abstract:
Projects are popular vehicles through which international aid is delivered in developing countries. To achieve their objectives, many northern organizations develop projects with local partner organizations in developing countries through technical assistance projects. International aid and international development projects have long been criticized for poor results, although billions are spent every year. Little empirical research in the field of project management has focused on knowledge transfer in the international development context. This paper focuses particularly on the personal dimensions of international assignees participating in projects with local team members in the host country. We propose to explore the possible links with a human resource management perspective in order to shed light on the under-researched problem of knowledge transfer in development cooperation projects. The process leading to capacity building being complex, involving multiple dimensions, and far from linear, we propose here to assess whether traditional research on expatriates in multinational corporations pertains to the field of project management in developing countries. The following question is addressed: in the context of international development project cooperation, what personal determinants should the selection process focus on when looking to fill a technical assistance position in a developing country? To answer that question, we first reviewed the literature on expatriates in the context of inter-organizational knowledge transfer. Second, we proposed a theoretical framework combining perspectives from development studies and management to explore whether parallels can be drawn between traditional international assignments and technical assistance project assignments in developing countries. We conducted an exploratory study using case studies from technical assistance initiatives led in Haiti. Data were collected from multiple sources following qualitative research methods. Direct observations in the field were allowed by the local leaders of six organizations; individual interviews with present and past international assignees, individual interviews with local team members, and focus groups were organized in order to triangulate the information collected. Contrary to empirical research on knowledge transfer in multinational corporations, the results tend to show that technical expertise ranks well behind many other characteristics. The results tend to show the importance of soft skills as a prerequisite for success in projects where local teams have to collaborate. More importantly, international assignees who spoke of knowledge sharing instead of knowledge transfer seemed to feel more satisfied at the end of their mandate than the others. Reciprocally, local team members who perceived that they had participated in a project with an expatriate looking to share, rather than transfer, knowledge tended to describe the results of the project in more positive terms than the others. The results obtained from this exploratory study open the way for a promising research agenda in the field of project management. It emphasizes the urgent need to achieve a better understanding of the complex set of soft skills that project managers or project chiefs would benefit from developing, in particular, the ability to absorb knowledge and the willingness to share one's knowledge.
Keywords: international assignee, international project cooperation, knowledge transfer, soft skills
Procedia PDF Downloads 141
23901 Meet Automotive Software Safety and Security Standards Expectations More Quickly
Authors: Jean-François Pouilly
Abstract:
This study addresses the growing complexity of embedded systems and the critical need for secure, reliable software. Traditional cybersecurity testing methods, often conducted late in the development cycle, struggle to keep pace. This talk explores how formal methods, integrated with advanced analysis tools, empower C/C++ developers to: 1) proactively address vulnerabilities and bugs, using formal methods and abstract interpretation techniques to identify potential weaknesses early in the development process, reducing the reliance on penetration and fuzz testing in later stages; 2) streamline development by focusing on the bugs that matter: with close to no false positives and flaws caught earlier, the need for rework and retesting is minimized, leading to faster development cycles, improved efficiency, and cost savings; 3) enhance software dependability: combining static analysis using abstract interpretation, with full context sensitivity and hardware memory awareness, allows a more comprehensive understanding of potential vulnerabilities, leading to more dependable and secure software. This approach aligns with industry best practices (ISO 26262 or ISO 21434) and empowers C/C++ developers to deliver robust, secure embedded systems that meet the demands of today's and tomorrow's applications. We will illustrate this approach with the TrustInSoft analyzer to show how it accelerates verification for complex cases, reduces user fatigue, and improves developer efficiency, cost-effectiveness, and software cybersecurity. In summary, integrating formal methods and sound analyzers enhances software reliability and cybersecurity, streamlining development in an increasingly complex environment.
Keywords: safety, cybersecurity, ISO 26262, ISO 21434, formal methods
Procedia PDF Downloads 17
23900 Rheological Properties of Polymer Systems in Magnetic Field
Authors: T. S. Soliman, A. G. Galyas, E. V. Rusinova, S. A. Vshivkov
Abstract:
Liquid crystals, combining the properties of a liquid and an anisotropic crystalline substance, play an important role in science and engineering. The molecules of cellulose and its derivatives have a rigid helical conformation, stabilized by intramolecular hydrogen bonds. Therefore, the macromolecules of these polymers are capable of ordering upon dissolution and form liquid crystals of the cholesteric type. Phase diagrams of solutions of some cellulose derivatives are known. However, little is known about the effect of a magnetic field on the viscosity of polymer solutions. The systems hydroxypropyl cellulose (HPC)–ethanol, HPC–ethylene glycol, HPC–DMAA, HPC–DMF, ethyl cellulose (EC)–ethanol, and EC–DMF were studied in the presence and absence of a magnetic field. The solution viscosity was determined on a Rheotest RN 4.1 rheometer. The effect of a magnetic field on the solution properties was studied with the use of two magnets, which induce magnetic field lines directed perpendicular and parallel to the rotational axis of the rotor. Application of the magnetic field is shown to be accompanied by an increase in the additional assembly of macromolecules, as is evident from a gain in the radii of the light-scattering particles. In the presence of a magnetic field, the long chains of macromolecules are oriented in parallel with the field lines. Such an orientation is associated with the molecular diamagnetic anisotropy of the macromolecules. As a result, supramolecular particles are formed, especially in the vicinity of the liquid crystalline phase transition region. The magnetic field leads to an increase in the viscosity of the solutions. The results were used to plot the concentration dependence of η/η0, where η and η0 are the viscosities of solutions in the presence and absence of a magnetic field, respectively. In this case, the values of viscosity corresponding to low shear rates were chosen, because the concentration dependence of viscosity at low shear rates is typical for anisotropic systems. In the investigated composition range, the values of η/η0 are described by a curve with a maximum.
Keywords: rheology, liquid crystals, magnetic field, cellulose ethers
Procedia PDF Downloads 347
23899 Industry 4.0 and Supply Chain Integration: Case of Tunisian Industrial Companies
Authors: Rym Ghariani, Ghada Soltane, Younes Boujelbene
Abstract:
Industry 4.0, a set of emerging smart and digital technologies, has been the main focus of operations management researchers and practitioners in recent years. The objective of this research paper is to study the impact of Industry 4.0 on supply chain integration (SCI) in Tunisian industrial companies. A conceptual model was designed to study the relationship between Industry 4.0 technologies and supply chain integration. This model contains three explanatory variables (big data, Internet of Things, and robotics) and one explained variable (supply chain integration). In order to answer our research questions and investigate the research hypotheses, principal component analysis and discriminant analysis were performed using SPSS 26 software. The results reveal that Industry 4.0 (big data, Internet of Things, and robotics) has a statistically significant positive impact on supply chain integration. Interestingly, big data has a greater positive impact on supply chain integration than the Internet of Things and robotics.
Keywords: Industry 4.0 (I4.0), big data, internet of things, robotics, supply chain integration
Procedia PDF Downloads 57
23898 Analysing Competitive Advantage of IoT and Data Analytics in Smart City Context
Authors: Petra Hofmann, Dana Koniel, Jussi Luukkanen, Walter Nieminen, Lea Hannola, Ilkka Donoghue
Abstract:
The Covid-19 pandemic forced people to isolate and become physically less connected. The pandemic has not only reshaped people's behaviours and needs but also accelerated digital transformation (DT). DT of cities has become an imperative, with the outlook of converting them into smart cities in the future. Embedding digital infrastructure and smart city initiatives as part of the normal design, construction, and operation of cities provides a unique opportunity to improve the connection between people. The Internet of Things (IoT) is an emerging technology and one of the drivers of DT. It has disrupted many industries by introducing different services and business models, and IoT solutions are being applied in multiple fields, including smart cities. As IoT and data are fundamentally linked together, IoT solutions can only create value if the data generated by the IoT devices are analysed properly. By extracting relevant conclusions and actionable insights using established techniques, data analytics contributes significantly to the growth and success of IoT applications and investments. Companies must grasp DT and be prepared to redesign their offerings and business models to remain competitive in today's marketplace. As there are many IoT solutions available today, the amount of data is tremendous. The challenge for companies is to understand which solutions to focus on, how to prioritise, and which data to use to differentiate themselves from the competition. This paper explains how IoT and data analytics can impact competitive advantage and how companies should approach IoT and data analytics to translate them into concrete offerings and solutions in the smart city context. The study was carried out as qualitative, literature-based research. A case study is provided to validate the preservation of a company's competitive advantage through smart city solutions. The results of the research contribution provide insights into the different factors and considerations related to creating competitive advantage through IoT and data analytics deployment in the smart city context. Furthermore, this paper proposes a framework that merges the factors and considerations with examples of offerings and solutions in smart cities. The data collected through IoT devices, and the intelligent use of them, can create competitive advantage for companies operating in the smart city business. Companies should take into consideration the five forces of competition that shape industries and pay attention to the technological, organisational, and external contexts that define the factors for consideration of competitive advantage in the field of IoT and data analytics. Companies that can utilise these key assets in their businesses will most likely conquer the markets and have a strong foothold in the smart city business.
Keywords: data analytics, smart cities, competitive advantage, internet of things
Procedia PDF Downloads 133
23897 Best Season for Seismic Survey in Zaria Area, Nigeria: Data Quality and Implications
Authors: Ibe O. Stephen, Egwuonwu N. Gabriel
Abstract:
Variations in seismic P-wave velocity and depth resolution resulting from variations in subsurface water saturation were investigated in this study, in order to determine the season of the year that gives the most reliable P-wave velocity and depth resolution of the subsurface in the Zaria Area, Nigeria. A 2D seismic refraction tomography technique involving an ABEM Terraloc MK6 seismograph was used to collect data across a borehole with a standard log, with the centre of the spread situated at the borehole site. Using the same parameters, this procedure was repeated along the same spread at least once a month for at least eight months a year for four years. The timing of each survey depended on when there was significant variation in the rainfall data. The seismic data collected were tomographically inverted. The results suggested that the average P-wave velocity ranges of the subsurface in the area are generally higher when the ground is wet than when it is dry. The results also suggested that an overburden about 9.0 m thick, a weathered basement about 14.0 m thick, and a fractured basement at a depth of about 23.0 m best fitted the borehole log. This best fit was consistently obtained between March and May, when the average total rainfall in the area was about 44.8 mm. The results also showed that the velocity ranges in both dry and wet formations fall within the standard ranges provided in the literature. In terms of velocity, this study has not clearly distinguished the quality of the seismic data obtained when the subsurface was dry from that of the data collected when the subsurface was wet. It was concluded that, for more detailed and reliable seismic studies in the Zaria Area and its environs with similar climatic conditions, surveys are best conducted between March and May. The most reliable seismic data for depth resolution are most likely obtainable in the area between March and May.
Keywords: best season, variations in depth resolution, variations in P-wave velocity, variations in subsurface water saturation, Zaria area
Procedia PDF Downloads 287
23895 Quick Sequential Search Algorithm Used to Decode High-Frequency Matrices
Authors: Mohammed M. Siddeq, Mohammed H. Rasheed, Omar M. Salih, Marcos A. Rodrigues
Abstract:
This research proposes a data encoding and decoding method based on the Matrix Minimization algorithm. The algorithm is applied to high-frequency coefficients for compression/encoding. It starts by converting every three coefficients to a single value, based on three different keys. The decoding/decompression uses a search method, called the Quick Sequential Search (QSS) decoding algorithm, presented in this research and based on sequential search, to recover the exact coefficients. In the next step, the decoded data are saved in an auxiliary array. The basic idea behind the auxiliary array is to save all possible decoded coefficients; this is because another algorithm, such as conventional sequential search, could retrieve the encoded/compressed data independently of the proposed algorithm. The experimental results showed that the proposed decoding algorithm retrieves original data faster than conventional sequential search algorithms.
Keywords: matrix minimization algorithm, decoding sequential search algorithm, image compression, DCT, DWT
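The core encode/decode idea can be sketched as follows (a minimal illustration with toy keys chosen so that every triple maps to a distinct value; the actual key selection and coefficient domain of the published algorithm are not reproduced here):

import itertools

K1, K2, K3 = 1, 21, 441        # toy keys: each triple maps to a unique value
DOMAIN = range(-10, 11)        # admissible quantized coefficient values

def encode(a, b, c):
    return K1 * a + K2 * b + K3 * c  # three coefficients -> one value

def qss_decode(value):
    # Sequential search over candidate triples until the encoding matches.
    for a, b, c in itertools.product(DOMAIN, repeat=3):
        if encode(a, b, c) == value:
            return a, b, c
    raise ValueError("no matching triple")

print(qss_decode(encode(3, -7, 5)))  # -> (3, -7, 5)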
Procedia PDF Downloads 148
23894 Structuring and Visualizing Healthcare Claims Data Using Systems Architecture Methodology
Authors: Inas S. Khayal, Weiping Zhou, Jonathan Skinner
Abstract:
Healthcare delivery systems around the world are in crisis. The need to improve health outcomes while decreasing healthcare costs has led to an imminent call to action to transform the healthcare delivery system. While bioinformatics and biomedical engineering have primarily focused on biological-level data and biomedical technology, there is clear evidence of the importance of the delivery of care for patient outcomes. Classic singular decomposition approaches from reductionist science are not capable of explaining complex systems. Approaches and methods from systems science and systems engineering are utilized to structure healthcare delivery system data. Specifically, systems architecture is used to develop a multi-scale and multi-dimensional characterization of the healthcare delivery system, defined here as the Healthcare Delivery System Knowledge Base. This paper is the first to contribute a new method of structuring and visualizing a multi-dimensional and multi-scale healthcare delivery system using systems architecture in order to better understand healthcare delivery.
Keywords: health informatics, systems thinking, systems architecture, healthcare delivery system, data analytics
Procedia PDF Downloads 346
23893 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering
Authors: Emiel Caron
Abstract:
Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about the scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references might be used to connect patent databases with bibliographic databases, e.g., to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references: they are often ambiguous, unstructured, and incomplete. Moreover, a complete bibliographic reference is stored in a single attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules, in combination with string similarity measures, are used to detect pairs of records that are potential duplicates. Owing to the scoring, different rules can be combined to join scientific references, i.e., the rules reinforce each other. The scores are based on expert knowledge and initial method evaluation. After the scoring, pairs of scientific references that score above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large golden set of highly cited papers, shows on average 99% precision and 95% recall. The method is therefore accurate but careful, i.e., it weighs precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g., in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as the Web of Science or Scopus.
Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics
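A minimal sketch of the two stages, rule-based pair scoring and single-linkage clustering via connected components, is shown below (the rule, weights, and threshold are illustrative, not the calibrated ones):

import re
from difflib import SequenceMatcher

refs = ["Smith J, Nature, 1999, vol 401",
        "J. Smith (1999) Nature 401",
        "Doe A, Science, 2005, vol 310"]

def year(ref):
    m = re.search(r"\b(?:19|20)\d{2}\b", ref)  # metadata via regular expression
    return m.group() if m else None

def pair_score(r1, r2):
    s = 0.5 * SequenceMatcher(None, r1.lower(), r2.lower()).ratio()
    if year(r1) is not None and year(r1) == year(r2):
        s += 0.5  # rule: matching publication year reinforces the string score
    return s

# Single-linkage clustering: union-find over pairs above the threshold.
parent = list(range(len(refs)))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

THRESHOLD = 0.6
for i in range(len(refs)):
    for j in range(i + 1, len(refs)):
        if pair_score(refs[i], refs[j]) >= THRESHOLD:
            parent[find(i)] = find(j)

clusters = {}
for i, r in enumerate(refs):
    clusters.setdefault(find(i), []).append(r)
print(list(clusters.values()))  # expected: the first two references cluster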
Procedia PDF Downloads 193
23892 Mobile-Assisted Language Learning (MALL) Applications for Interactive and Engaging Classrooms: APPsolutely!
Authors: Ajda Osifo, Amanda Radwan
Abstract:
Mobile-assisted language learning (MALL), or m-learning, defined as learning with mobile devices that can be used in any place equipped with unbroken transmission signals, has created new opportunities and challenges for education. It has introduced a new learning model combining new types of mobile devices and wireless communication services and technologies with teaching and learning. Recent advancements in the mobile world, such as Apple iOS devices (iPhone, iPod Touch, and iPad), Android devices, and other smartphone devices and environments (such as Windows Phone 7 and BlackBerry), have allowed learning to be more flexible inside and outside the classroom, making the learning experience unique, adaptable, and tailored to each user. Creativity, learner autonomy, collaboration, and the digital practices of language learners are encouraged, and innovative pedagogical applications of such practices in classroom contexts, like the flipped classroom, are enhanced. These developments are gradually being embedded in daily life, and they also seem to be heralding a sustainable move to paperless classrooms. Since mobile technologies are increasingly viewed as a main platform for delivery, we as educators need to design our activities, materials, and learning environments in such a way as to ensure that learners are engaged and feel comfortable. For the purposes of our session, several core MALL applications that work on the Apple iPad/iPhone will be explored; the rationale and steps needed to successfully implement these applications will be discussed, and student examples will be showcased. The focus of the session will be on the following points: 1) our current pedagogical approach, 2) the rationale and several core MALL apps, 3) possible challenges for teachers and learners, and 4) future implications. This session is aimed at instructors who are interested in integrating MALL apps into their own classroom planning.
Keywords: MALL, educational technology, iPads, apps
Procedia PDF Downloads 393
23891 Development of Hybrid Materials Combining Biomass as Fique Fibers with Metal-Organic Frameworks, and Their Potential as Mercury Adsorbents
Authors: Karen G. Bastidas Gomez, Hugo R. Zea Ramirez, Manuel F. Ribeiro Pereira, Cesar A. Sierra Avila, Juan A. Clavijo Morales
Abstract:
The contamination of water sources with heavy metals such as mercury is a persistent environmental problem with a high impact on the environment and human health. In countries such as Colombia, mercury contamination due to mining has reached levels much higher than the world average. This work proposes the use of fique fibers as adsorbents for mercury removal. The evaluation of the material was carried out under five different conditions (raw, pretreated by organosolv, functionalized by TEMPO oxidation, functionalized fiber plus MOF-199, and functionalized fiber plus MOF-199-SH). All the materials were characterized using FTIR, SEM, EDX, XRD, and TGA. The mercury removal tests were performed at room pressure and temperature and at pH = 7 for all material presentations, followed by atomic absorption spectroscopy. The high cellulose content of fique is the main particularity of this lignocellulosic biomass, since the degree of oxidation depends on the number of surface hydroxyl groups capable of oxidizing into carboxylic acids, a functional group capable of increasing ion exchange with mercury in solution. It was also expected that the impregnation of the MOF would increase mercury removal; however, it was found that the functionalized fique achieved the greatest removal percentage, 81.33%, against 44% for the fique with MOF-199 and 72% for the fique with MOF-199-SH. The pretreated and raw fibers showed 74% and 56% removal, respectively, which indicates that fique does not require considerable modification of its structure to achieve good performance. Even so, the functionalized fiber increases the removal percentage considerably compared to the pretreated fique, which suggests that functionalization is a feasible procedure for improving the removal percentage. In addition, this procedure follows a green approach, since the reagents involved have low environmental impact and the contribution to the remediation of natural resources is high.
Keywords: biomass, nanotechnology, science materials, wastewater treatment
Procedia PDF Downloads 116
23890 A Human Centered Design of an Exoskeleton Using Multibody Simulation
Authors: Sebastian Kölbl, Thomas Reitmaier, Mathias Hartmann
Abstract:
Trial-and-error approaches to adapting wearable support structures to human physiology are time consuming and elaborate. During preliminary design, however, the focus lies on understanding the interaction between the exoskeleton and the human body in terms of forces and moments, namely body mechanics. For the study at hand, a multibody simulation approach has been enhanced to evaluate actual forces and moments in a human dummy model with and without a digital mock-up of an active exoskeleton. To this end, different motion data have been gathered and processed to perform a musculoskeletal analysis. The motion data are ground reaction forces, electromyography (EMG) data, and human motion data recorded with a marker-based motion capture system. Based on the experimental data, the response of the human dummy model has been calibrated. Subsequently, the scalable human dummy model, in conjunction with the motion data, is connected with the exoskeleton structure. The results of the human-machine interaction (HMI) simulation platform are, in particular, the resulting contact forces and human joint forces, to be compared with admissible values with regard to human physiology. Furthermore, the platform provides feedback for the sizing of the exoskeleton structure in terms of the resulting interface forces (stress justification) and the effect of its compliance. A stepwise approach for the setup and validation of the modeling strategy is presented, and the potential for a more time- and cost-effective development of wearable support structures is outlined.
Keywords: assistive devices, ergonomic design, inverse dynamics, inverse kinematics, multibody simulation
Procedia PDF Downloads 160
23889 Evaluation of the Nursing Management Course in Undergraduate Nursing Programs of State Universities in Turkey
Authors: Oznur Ispir, Oya Celebi Cakiroglu, Esengul Elibol, Emine Ceribas, Gizem Acikgoz, Hande Yesilbas, Merve Tarhan
Abstract:
This study was conducted to evaluate the academic staff teaching the 'Nursing Management' course in the undergraduate nursing programs of state universities in Turkey and to assess the current content of the course. The design of the study is descriptive. The population of the study consists of seventy-eight undergraduate nursing programs at state universities in Turkey. A questionnaire/survey prepared by the researchers was used as the data collection tool. The data were obtained by screening the content of the websites of the nursing education programs between March and May 2016. Descriptive statistics were used to analyze the data. The research indicated that 58% of the undergraduate nursing programs from which data were derived were part of a school of health, 81% of the academic staff had graduated from undergraduate nursing programs, 40% worked as lecturers, and 37% specialized in a field other than nursing. The research also implied that the above-mentioned course was included in 98% of the programs from which it was possible to obtain data. The full name of the course was 'Nursing Management' in 95% of the programs, and 98% stated that the course was compulsory. Theory and application hours were 3.13 and 2.91, respectively. Moreover, the content of the course was not shared in 65% of the programs reviewed. This study demonstrated that the experience and expertise of the academic staff teaching the 'Nursing Management' course were not sufficient in the management area, and that the schedule and content of the course were not sufficient, although many nursing education programs provided the course. Comparison between the curricula of the course revealed significant differences.
Keywords: nursing, nursing management, nursing management course, undergraduate program
Procedia PDF Downloads 357
23888 The DAQ Debugger for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
In general, state-of-the-art data acquisition systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency, and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based data acquisition system (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to be able to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating the source of the problem, and then either correcting the problem or determining a way to work around it. It provides a layer that can be easily integrated into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information necessary for a deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework
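The signal-handling idea can be illustrated compactly. The following is a minimal sketch in Python on a Unix-like system (not the iFDAQ's actual C++/Qt implementation; the report file name and the choice of SIGUSR1 are hypothetical) of installing handlers that write a report with a stack trace whenever a signal arrives, without stopping the process:

import datetime
import faulthandler
import signal
import traceback

faulthandler.enable()  # dump tracebacks on fatal signals (SIGSEGV, SIGABRT, ...)

def report(signum, frame):
    # Append a report with the current stack for later investigation;
    # the process keeps running after the handler returns.
    with open("daq_debug_report.txt", "a") as f:
        f.write(f"--- {datetime.datetime.now()} signal {signum} ---\n")
        traceback.print_stack(frame, file=f)

signal.signal(signal.SIGUSR1, report)  # catchable signal (Unix-only here)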
Procedia PDF Downloads 283
23887 Q-Map: Clinical Concept Mining from Clinical Documents
Authors: Sheikh Shams Azam, Manoj Raju, Venkatesh Pagidimarri, Vamsi Kasivajjala
Abstract:
Over the past decade, there has been a steep rise in data-driven analysis in major areas of medicine, such as clinical decision support systems, survival analysis, patient similarity analysis, and image analytics. Most of the data in the field are well structured and available in numerical or categorical formats, which can be used for experiments directly. But on the opposite end of the spectrum, there exists a wide expanse of data that is intractable for direct analysis owing to its unstructured nature; it can be found in the form of discharge summaries, clinical notes, and procedural notes, which are in human-written narrative format and have neither a relational model nor a standard grammatical structure. An important step in utilizing these texts for such studies is to transform and process the data to retrieve structured information from the haystack of irrelevant data using information retrieval and data mining techniques. To address this problem, the authors present Q-Map in this paper, a simple yet robust system that can sift through massive datasets with unregulated formats to retrieve structured information aggressively and efficiently. It is backed by an effective mining technique based on a string matching algorithm indexed on curated knowledge sources, which is both fast and configurable. The authors also briefly examine its comparative performance with MetaMap, one of the most reputed tools for medical concept retrieval, and present the advantages the former displays over the latter.
Keywords: information retrieval, unified medical language system, syntax based analysis, natural language processing, medical informatics
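A minimal sketch of dictionary-indexed string matching of this kind (with a toy concept index standing in for curated sources such as the UMLS; the surface forms and identifiers below are illustrative) could look like:

import re

# Curated knowledge source: surface form -> concept identifier.
concept_index = {
    "myocardial infarction": "C0027051",
    "heart attack": "C0027051",
    "diabetes mellitus": "C0011849",
}

def extract_concepts(note):
    """Scan an unstructured note for indexed surface forms."""
    found = []
    text = note.lower()
    for surface, cui in concept_index.items():
        for m in re.finditer(r"\b" + re.escape(surface) + r"\b", text):
            found.append((surface, cui, m.start()))
    return found

note = "Pt admitted after heart attack; hx of diabetes mellitus."
print(extract_concepts(note))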
Procedia PDF Downloads 133
23887 Developing Logistics Indices for Turkey as an Indicator of Economic Activity
Authors: Gizem İntepe, Eti Mizrahi
Abstract:
Investment and financing decisions are influenced by various economic features. Detailed analysis should be conducted in order to make such decisions, not only by companies but also by governments. Such analysis can be conducted either at the company level or on a sectoral basis to reduce risks and to maximize profits. Sectoral disaggregation, caused by seasonality effects, subventions, or data advantages and disadvantages, may appear in sectors behaving parallel to the BIST (Borsa Istanbul stock exchange) index. The proposed logistics indices could serve market needs as a decision parameter on a sectoral basis and also help in forecasting changes in import and export volumes. They are also indicators of logistic activity, which is itself a sign of economic mobility at the national level. Publicly available data from the Ministry of Transport, Maritime Affairs and Communications and the Turkish Statistical Institute are utilized to obtain five logistics indices, namely the exLogistic, imLogistic, fLogistic, dLogistic, and cLogistic indices. Then, the efficiency and reliability of these indices are tested.
Keywords: economic activity, export trade data, import trade data, logistics indices
Procedia PDF Downloads 335