Search results for: raw complex data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28321

26881 Evaluation of Cellulase and Xylanase Production by Micrococcus Sp. Isolated from Decaying Lignocellulosic Biomass Obtained from Alice Environment in the Eastern Cape of South Africa

Authors: Z. Mmango, U. Nwodo, L. V. Mabinya, A. I. Okoh

Abstract:

Cellulose and hemicellulose account for a large portion of the world's plant biomass. In nature, these polysaccharides are intertwined, forming complex materials that require multiple, expensive treatment processes to free the raw materials trapped in the matrix. Enzymatic degradation remains the preferred technique, as it is inexpensive and eco-friendly. However, the insufficiency of existing enzyme battery systems in degrading the lignocellulosic complex motivates the search for effective degrading enzymes from bacterial isolates obtained from uncommon environments. This study evaluated actinomycetes isolated from sawdust samples collected from beneath a wood factory. Cellulase and xylanase production was screened by culturing the organisms on carboxymethyl cellulose agar and birchwood xylan. A halo zone indicating lignocellulose utilization was produced by an isolate identified through its 16S rRNA gene as Micrococcus luteus. The optimum conditions for the production of cellulase and xylanase were an incubation temperature of 25 °C, fermentation medium pH of 5 and 10, agitation speeds of 50 and 200 rpm, and fermentation incubation times of 96 and 84 h, respectively. The high cellulase and xylanase activities obtained from this isolate portend industrial relevance.

Keywords: carboxyl methyl cellulose, birchwood xylan, optimization, cellulase, xylanase, micrococcus, DNS method

Procedia PDF Downloads 347
26880 FCNN-MR: A Parallel Instance Selection Method Based on Fast Condensed Nearest Neighbor Rule

Authors: Lu Si, Jie Yu, Shasha Li, Jun Ma, Lei Luo, Qingbo Wu, Yongqi Ma, Zhengji Liu

Abstract:

Instance selection (IS) techniques are used to reduce data size and thereby improve the performance of data mining methods. Recently, to process very large data sets, several proposed methods divide the training set into disjoint subsets and apply IS algorithms independently to each subset. In this paper, we analyze the limitations of these methods and give our viewpoint on how to divide and conquer in the IS procedure. Then, based on the fast condensed nearest neighbor (FCNN) rule, we propose an instance selection method for large data sets built on the MapReduce framework. Besides ensuring prediction accuracy and a good reduction rate, it has two desirable properties: first, it reduces the workload on the aggregation node; second, and most important, it produces the same result as the sequential version, which other parallel methods cannot achieve. We evaluate the performance of FCNN-MR on one small data set and two large data sets. The experimental results show that it is effective and practical.
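
To make the underlying idea concrete, the following Python sketch implements a greedy condensed-nearest-neighbor selection in the spirit of the FCNN rule; it is an illustrative, sequential simplification assuming 1-NN classification, not the authors' parallel MapReduce implementation.

```python
import numpy as np

def fcnn_subset(X, y):
    """Greedy condensed-nearest-neighbor selection (FCNN-style sketch).

    Starts from one representative per class and repeatedly adds a training
    point that the current subset misclassifies, until the subset classifies
    all of X correctly under 1-NN.
    """
    selected = []
    # seed with the point closest to each class centroid
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        centroid = X[idx].mean(axis=0)
        selected.append(idx[np.argmin(np.linalg.norm(X[idx] - centroid, axis=1))])
    selected = list(dict.fromkeys(selected))

    while True:
        S = np.array(selected)
        # 1-NN prediction of every point using only the selected subset
        d = np.linalg.norm(X[:, None, :] - X[S][None, :, :], axis=2)
        pred = y[S[np.argmin(d, axis=1)]]
        wrong = [i for i in np.where(pred != y)[0] if i not in selected]
        if not wrong:
            return sorted(selected)
        # add the misclassified point closest to the subset (simplified rule)
        dmin = d[wrong].min(axis=1)
        selected.append(wrong[int(np.argmin(dmin))])

X = np.random.default_rng(0).normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
subset = fcnn_subset(X, y)
print(len(subset), "of", len(X), "instances retained")
```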

Keywords: instance selection, data reduction, MapReduce, kNN

Procedia PDF Downloads 250
26879 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking

Authors: Trevor Toy, Josef Langerman

Abstract:

Around a quarter of the world's data is generated by the financial sector, with global non-cash transactions estimated at 708.5 billion over the period beginning in 2018. With Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders so that they can openly, legitimately and consensually share the data required to enable it. Integration and sharing of anonymised transactional data are still operated in silos and centralised among the large corporate entities in the ecosystem that have the resources to do so. Smaller fintechs generating data, and businesses looking to consume data, are largely excluded from the process. There is therefore a growing demand for accessible transactional data for analytical purposes and to support the rapid global adoption of Open Banking. The following research provides a solution framework that aims to deliver a secure decentralised marketplace for 1) data providers to list their transactional data, 2) data consumers to find and access that data, and 3) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to themselves. The platform also provides an integrated system for downstream transaction-related data from merchants, enriching the data product available to build a comprehensive view of a data subject's spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers. This core component of the platform is developed on a decentralised blockchain contract, with a market layer that manages the transaction, user, pricing, payment, tagging, contract, control, and lineage features that pertain to user interactions on the platform. One of the platform's key features is enabling the participation and management of personal data by the individuals from whom the data is being generated. A proof of concept of this framework was developed on the Ethereum blockchain, where an individual can securely manage access to their own personal data and to that individual's identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour in correlation with key demographic information. This platform solution can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and open banking.

Keywords: big data markets, open banking, blockchain, personal data management

Procedia PDF Downloads 70
26878 Chaos Analysis of a 3D Finance System and Generalized Synchronization for N-Dimension

Authors: Muhammad Fiaz

Abstract:

This article studies complex features, such as zero-Hopf bifurcation, chaos, and synchronization, of the integer- and fractional-order versions of a new 3D finance system. The trusted tools of averaging theory and the active control method are used to investigate zero-Hopf bifurcation and synchronization, respectively. The novelty of the paper lies in answering the question of whether it is possible to find a chaotic system that can be synchronized with any other system of the same dimension. Based on different examples, we develop the claim that if a pair of master and slave chaotic dynamical systems is synchronized by selecting a suitable gain matrix with special conditions, then the master system can be synchronized with any chaotic dynamical system of the same dimension. With the help of this study, we develop generalized theorems for the synchronization of n-dimensional dynamical systems, for the integer as well as the fractional versions. It is proposed that this investigation will contribute to the control of dynamical systems, since a suitable gain matrix with special conditions is enough to synchronize the system under consideration with any other chaotic system of the same dimension. Chaotic properties of the fractional version of the new finance system are also analyzed at fractional order q = 0.87. Simulation results are provided, where required, to support the analytical study.
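
As a rough illustration of active-control synchronization (the paper's new system and its parameters are not reproduced here, so a widely studied 3D finance-type system with illustrative parameter values stands in), the following Python sketch drives a slave copy of the system onto a master trajectory using a diagonal gain matrix K:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative 3D finance-type system (a stand-in, not necessarily the
# paper's new system); a, b, c are example parameters only.
a, b, c = 0.9, 0.2, 1.2

def finance(state):
    x, y, z = state
    return np.array([z + (y - a) * x,
                     1.0 - b * y - x ** 2,
                     -x - c * z])

def coupled(t, s, K):
    """Master-slave pair with active control u = f(master) - f(slave) - K e."""
    m, sl = s[:3], s[3:]
    e = sl - m
    u = finance(m) - finance(sl) - K @ e   # cancels the nonlinearity, adds -K e
    return np.concatenate([finance(m), finance(sl) + u])

K = np.diag([2.0, 2.0, 2.0])               # gain matrix (positive definite)
s0 = np.array([1.0, 2.0, 0.5, -1.0, 0.3, 2.0])
sol = solve_ivp(coupled, (0, 40), s0, args=(K,), max_step=0.01)
err = np.abs(sol.y[3:] - sol.y[:3]).max(axis=0)
print("final synchronization error:", err[-1])   # decays toward zero
```

With this control, the error obeys de/dt = -K e and decays exponentially for any positive-definite K, which mirrors the role of the gain-matrix conditions described in the abstract.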

Keywords: complex analysis, chaos, generalized synchronization, control dynamics, fractional order analysis

Procedia PDF Downloads 60
26877 Experimental Evaluation of Succinct Ternary Tree

Authors: Dmitriy Kuptsov

Abstract:

Tree data structures, such as binary or, in general, k-ary trees, are essential in computer science. The applications of these data structures range from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large volumes of random access memory, limiting their applicability in certain solutions. In these cases, a more advanced representation of the data structure is essential. In this paper, we present the design of a compact version of the ternary tree data structure and report the results of an experimental evaluation using the static dictionary problem. We compare these results with those for binary and regular ternary trees. The evaluation shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree and, in certain configurations, shows performance comparable to regular ternary trees. We evaluated the performance of the algorithms on both 32- and 64-bit operating systems.
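
For orientation, here is a minimal pointer-based ternary search tree for the static dictionary problem in Python; the succinct design evaluated in the paper replaces these explicit child pointers with bit-vector encodings, which is where the memory savings come from, so this sketch only shows the baseline structure being compacted.

```python
class TSTNode:
    __slots__ = ("ch", "lo", "eq", "hi", "is_word")
    def __init__(self, ch):
        self.ch, self.lo, self.eq, self.hi, self.is_word = ch, None, None, None, False

def insert(node, word, i=0):
    ch = word[i]
    if node is None:
        node = TSTNode(ch)
    if ch < node.ch:
        node.lo = insert(node.lo, word, i)
    elif ch > node.ch:
        node.hi = insert(node.hi, word, i)
    elif i + 1 < len(word):
        node.eq = insert(node.eq, word, i + 1)
    else:
        node.is_word = True
    return node

def contains(node, word, i=0):
    while node is not None:
        ch = word[i]
        if ch < node.ch:
            node = node.lo
        elif ch > node.ch:
            node = node.hi
        elif i + 1 < len(word):
            node, i = node.eq, i + 1
        else:
            return node.is_word
    return False

root = None
for w in ["cat", "car", "dog"]:
    root = insert(root, w)
print(contains(root, "car"), contains(root, "cow"))  # True False
```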

Keywords: algorithms, data structures, succinct ternary tree, performance evaluation

Procedia PDF Downloads 156
26876 Application of RayMan Model in Quantifying the Impacts of the Built Environment and Surface Properties on Surrounding Temperature

Authors: Maryam Karimi, Rouzbeh Nazari

Abstract:

Introduction: Understanding thermal distribution in the micro-urban climate has become necessary for urban planners and designers due to the impact of complex micro-scale features of the Urban Heat Island (UHI) on the built environment and public health. Hence, understanding the interrelation between urban components and thermal patterns can assist planners in the proper addition of vegetation to the built environment, which can minimize the UHI impact. To characterize the need for urban green infrastructure (UGI) through better urban planning, this study proposes the use of the RayMan model to measure the impact of air quality and increased temperature based on urban morphology in selected metropolitan cities. This project measures the impact of the built environment for urban and regional planning using human-biometeorological evaluations (mean radiant temperature, Tmrt). Methods: We utilized the RayMan model to estimate Tmrt in an urban environment, incorporating the location and height of buildings and trees, as a supplemental tool in urban planning and street design. The estimated Tmrt value is compared with existing surface and air temperature data to find the actual temperature felt by pedestrians. Results: Our current results suggest a strong relationship between the sky-view factor (SVF) and increased surface temperature in megacities based on current urban morphology. Conclusion: This study will help quantify the impacts of the built environment and surface properties on the surrounding temperature, identify priority urban neighborhoods by analyzing Tmrt and air quality data at the pedestrian level, and characterize the cooling potential of urban green infrastructure.

Keywords: built environment, urban planning, urban cooling, extreme heat

Procedia PDF Downloads 117
26875 Scenario Based Reaction Time Analysis for Seafarers

Authors: Umut Tac, Leyla Tavacioglu, Pelin Bolat

Abstract:

The human factor has been one of the elements that cause vulnerabilities which can result in accidents in maritime transportation. When the roots of human-factor-based accidents are analyzed, gaps in cognitive abilities (reaction time, attention, memory…) emerge as the main reasons for vulnerabilities in the complex environment of maritime systems. Thus, cognitive processes in maritime systems have become an important subject that should be investigated comprehensively. At this point, neurocognitive tests such as reaction time analysis tests have been used as coherent tools that enable us to make valid assessments of cognitive status. In this respect, the aim of this study is to evaluate the reaction time (response time or latency) of seafarers with respect to their occupational experience and age. For this study, reaction times for different maneuvers were recorded while the participants performed a sea voyage in a simulator running a predefined scenario. After collecting the reaction time data, a statistical analysis was carried out to understand the relation between occupational experience and cognitive abilities.

Keywords: cognitive abilities, human factor, neurocognitive test battery, reaction time

Procedia PDF Downloads 297
26874 Grisotti Flap as Treatment for Central Tumors of the Breast

Authors: R. Pardo, P. Menendez, MA Gil-Olarte, S. Sanchez, E. García, R. Quintana, J. Martín

Abstract:

Introduction: Within oncoplastic breast techniques, there is increased interest in immediate partial breast reconstruction. The volume resected is greater than that of conventional conservative techniques. Central tumours of the breast have classically been treated with a mastectomy, with regard to oncological safety and the cosmetic secondary effects of wide central resection of the nipple and the breast tissue beneath. Oncological results for central quadrantectomy show a recurrence rate, disease-free period and survival identical to mastectomy. The Grisotti flap is an oncoplastic surgical technique that allows the surgeon to perform a safe central quadrantectomy with excellent cosmetic results. Material and methods: The Grisotti flap is a glandular cutaneous advancement-rotation flap that can fill the defect in the central portion of the excised breast. If the inferior border is affected by tumour and further surgery is decided upon at the multidisciplinary team meeting, it will be necessary to perform a mastectomy. All patients who underwent surgery with a Grisotti flap since 2009 were reviewed, obtaining the following data: age, histopathological diagnosis, size, operating time, volume of tissue resected, postoperative admission time, re-excisions due to positive margins affected by tumour, wound dehiscence, complications and recurrence. Analysis and results of sentinel node biopsy were also obtained. Results: Twelve patients underwent surgery between 2009 and 2015. The mean age was 54 years (34-67). All had a preoperative diagnosis of infiltrative ductal carcinoma of less than 2 cm. Diagnosis was made with ultrasound, mammography or both; magnetic resonance was used in 5 cases. No patients had a preoperative positive axilla after ultrasound exploration. Mean operating time was 104 minutes (84-130). Postoperative stay was 24 hours. Mean volume resected was 159 cc (70-286). In one patient the surgical border was affected by tumour, and a further procedure with resection of the affected border was performed as ambulatory surgery. The sentinel node biopsy was positive for micrometastasis in only two cases. In one case, lymphadenectomy was performed in 2009. In the other, treated in 2015, no lymphadenectomy was performed, as the patient had a favourable histopathological prognosis and the multidisciplinary team meeting agreed that lymphadenectomy was not required. No recurrence has been diagnosed in any of the patients who underwent surgery, and they are all disease-free at present. Conclusions: Conservative surgery for retroareolar central tumours of the breast results in good local control of the disease with free surgical borders, including resection of the nipple-areola complex and pectoralis major muscle fascia. Reconstructive surgery with the inferior Grisotti flap adequately fills the defect after central quadrantectomy, with creation of a new cutaneous disc where a new nipple-areola complex is reconstructed with a local flap or micropigmentation. This avoids the need for contralateral symmetrization. Sentinel node biopsy can be performed without added morbidity. When feasible, the Grisotti flap avoids the need for a skin-sparing mastectomy for central breast tumours, which would require the use of an expander, prosthesis or myocutaneous flap, with all the complications of a more complex operation.

Keywords: Grisotti flap, oncoplastic surgery, central tumours, breast

Procedia PDF Downloads 335
26873 Causal Estimation for the Left-Truncation Adjusted Time-Varying Covariates under the Semiparametric Transformation Models of a Survival Time

Authors: Yemane Hailu Fissuh, Zhongzhan Zhang

Abstract:

In biomedical research and randomized clinical trials, the outcomes of most common interest are time-to-event, so-called survival, data. The importance of robust models in this context lies in comparing the effects of randomly controlled experimental groups in a way that carries a sense of causality. Causal estimation is the scientific concept of comparing the pragmatic effect of treatments conditional on the given covariates, rather than assessing the simple association of response and predictors. Hence, a causal-effect-based semiparametric transformation model is proposed to estimate the effect of treatment in the presence of possibly time-varying covariates. Due to its high flexibility and robustness, the semiparametric transformation model applied in this paper has received considerable attention for estimating causal effects when modeling left-truncated and right-censored survival data. Despite its wide applications and popularity, maximum likelihood estimation is quite complex and burdensome for estimating the unknown parameters and the unspecified transformation function in the presence of possibly time-varying covariates. Thus, to ease this complexity, we propose modified estimating equations. After outlining the estimation procedures, the consistency and asymptotic properties of the estimators are derived, and the finite-sample performance of the proposed model is illustrated via simulation studies and the Stanford heart transplant real-data example. To sum up, the bias due to left truncation is adjusted by estimating the density function of the truncation variable, which is also incorporated into the model as a covariate in order to relax the independence assumption between failure time and truncation time. Moreover, the expectation-maximization (EM) algorithm is described for the iterative estimation of the unknown parameters and the unspecified transformation function. In addition, the causal effect is derived as the ratio of the cumulative hazard functions of the active and passive experiments, after adjusting for the bias introduced into the model by the truncation variable.
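
For readers unfamiliar with the model class, semiparametric transformation models are typically written in one of the following general forms (this is the standard formulation, not necessarily the authors' exact specification with left truncation):

\[ H(T) = -\beta^{\top} Z + \varepsilon, \]

where \(H(\cdot)\) is an unspecified, strictly increasing transformation, \(\beta\) is the regression parameter, and \(\varepsilon\) has a known distribution (an extreme-value error recovers the proportional hazards model, a standard logistic error the proportional odds model). With possibly time-varying covariates, the model is often stated through the cumulative hazard,

\[ \Lambda(t \mid Z) = G\!\left( \int_{0}^{t} e^{\beta^{\top} Z(s)}\, d\Lambda_{0}(s) \right), \]

with \(G\) a known increasing transformation and \(\Lambda_{0}\) an unspecified baseline cumulative hazard.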

Keywords: causal estimation, EM algorithm, semiparametric transformation models, time-to-event outcomes, time-varying covariate

Procedia PDF Downloads 120
26872 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement

Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti

Abstract:

Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning. Data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. We then establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
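
The link between quantile prediction and SLA-aware provisioning can be sketched as follows in Python; the synthetic demand series, the features, and the 95th-percentile target are illustrative assumptions, not the paper's data or model choice.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Illustrative only: synthetic per-interval CPU demand with a daily cycle.
rng = np.random.default_rng(0)
t = np.arange(2000)
demand = 50 + 20 * np.sin(2 * np.pi * t / 288) + rng.gamma(2.0, 4.0, size=t.size)
X = np.column_stack([np.sin(2 * np.pi * t / 288), np.cos(2 * np.pi * t / 288)])

# Provision for the 95th percentile of demand rather than the mean,
# so the SLA is met while idle capacity (wasted energy) stays bounded.
q95 = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, demand)
provisioned = q95.predict(X)
print("SLA coverage:", np.mean(demand <= provisioned))  # roughly 0.95
```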

Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing

Procedia PDF Downloads 103
26871 Structuring Paraphrases: The Impact Sentence Complexity Has on Key Leader Engagements

Authors: Meaghan Bowman

Abstract:

Soldiers are taught about the importance of effective communication with repetition of the phrase, “Communication is key.” They receive training in preparing for, and carrying out, interactions between foreign and domestic leaders to gain crucial information about a mission. These interactions are known as Key Leader Engagements (KLEs). For the training of KLEs, doctrine mandates the skills needed to conduct these “engagements” such as how to: behave appropriately, identify key leaders, and employ effective strategies. Army officers in training learn how to confront leaders, what information to gain, and how to ask questions respectfully. Unfortunately, soldiers rarely learn how to formulate questions optimally. Since less complex questions are easier to understand, we hypothesize that semantic complexity affects content understanding, and that age and education levels may have an effect on one’s ability to form paraphrases and judge their quality. In this study, we looked at paraphrases of queries as well as judgments of both the paraphrases’ naturalness and their semantic similarity to the query. Queries were divided into three complexity categories based on the number of relations (the first number) and the number of knowledge graph edges (the second number). Two crowd-sourced tasks were completed by Amazon volunteer participants, also known as turkers, to answer the research questions: (i) Are more complex queries harder to paraphrase and judge and (ii) Do age and education level affect the ability to understand complex queries. We ran statistical tests as follows: MANOVA for query understanding and two-way ANOVA to understand the relationship between query complexity and education and age. A probe of the number of given-level queries selected for paraphrasing by crowd-sourced workers in seven age ranges yielded promising results. We found significant evidence that age plays a role and marginally significant evidence that education level plays a role. These preliminary tests, with output p-values of 0.0002 and 0.068, respectively, suggest the importance of content understanding in a communication skill set. This basic ability to communicate, which may differ by age and education, permits reproduction and quality assessment and is crucial in training soldiers for effective participation in KLEs.

Keywords: engagement, key leader, paraphrasing, query complexity, understanding

Procedia PDF Downloads 157
26870 Spatial Patterns of Urban Expansion in Kuwait City between 1989 and 2001

Authors: Saad Algharib, Jay Lee

Abstract:

Urbanization is a complex phenomenon that occurs as a city develops from one form to another. In other words, it is the process by which land use/land cover changes from rural to urban. Since oil exploration began, Kuwait City has been growing rapidly due to urbanization and population growth, through both natural growth and inward immigration. The main objective of this study is to detect changes in urban land use/land cover and to examine the changing spatial patterns of urban growth in and around Kuwait City between 1989 and 2001. In addition, this study evaluates the spatial patterns of the detected changes and how they relate to the spatial configuration of the city. Remote sensing and geographic information systems have recently become very useful and important tools in urban studies, because their integration allows analysts and planners to detect, monitor and analyze urban growth in a region effectively. Moreover, planners and users can predict future trends of growth in urban areas with remotely sensed and GIS data, because these data can be updated effectively at the required precision levels. In order to identify the new urban areas between 1989 and 2001, the study uses satellite images of the study area and remote sensing techniques for classifying these images. An unsupervised classification method was applied to classify the images into land use and land cover data layers. After the unsupervised classification, a GIS overlay function was applied to the classified images to detect the locations and patterns of the new urban areas that developed during the study period. GIS was also utilized to evaluate the distribution of the spatial patterns; for example, Moran's index was applied to all data inputs to examine the distribution of urban growth. Furthermore, this study assesses whether the spatial patterns and processes of these changes take place in a random fashion or with certain identifiable trends. The results indicate that the urban area expanded by 10 percentage points during the study period, from 32.4% in 1989 to 42.4% in 2001. The results also revealed that the largest increase in urban area occurred between the major highways beyond the fourth ring road from the center of Kuwait City. Moreover, the spatial distribution of urban growth occurred in a clustered manner.
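
The clustering statistic mentioned above, global Moran's I, can be computed directly from zone values and a neighbourhood matrix; the following Python sketch uses a toy four-zone example purely for illustration.

```python
import numpy as np

def morans_i(values, W):
    """Global Moran's I for a 1D array of zone values and a spatial
    weights matrix W (W[i, j] > 0 when zones i and j are neighbours)."""
    x = np.asarray(values, dtype=float)
    z = x - x.mean()
    n = x.size
    s0 = W.sum()
    return (n / s0) * (z @ W @ z) / (z @ z)

# Toy example: fraction of urban land in four adjacent zones (rook contiguity).
urban_share = [0.30, 0.35, 0.60, 0.65]
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(round(morans_i(urban_share, W), 3))  # positive value -> clustering
```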

Keywords: geographic information systems, remote sensing, urbanization, urban growth

Procedia PDF Downloads 168
26869 Promoting Authenticity in Employer Brands to Address the Global-Local Problem in Complex Organisations: The Case of a Developing Country

Authors: Saud Al Taj

Abstract:

Employer branding is considered a useful tool for addressing the global-local problem facing complex organisations that have operations scattered across the globe and at the same time face the challenge of dealing with local environments. Despite being an established field of study within the Western developed world, there is little empirical evidence concerning the relevance of employer branding to global companies that operate in under-developed economies. This paper fills this gap by gaining rich insight into the implementation of employer branding programs in a foreign multinational operating in Pakistan and dealing with the global-local problem. The study is qualitative in nature and employs semi-structured and focus group interviews with senior/middle managers and local frontline employees to examine the phenomenon in the case organisation in depth. Findings suggest that authenticity is required in employer brands to enable them to respond to local needs, thereby leading to the resolution of the global-local problem. However, signaling theory is key to the development of authentic employer brands, as it stresses the need to establish an efficient and effective signaling environment wherein signals travel in both directions (from signal designers to receivers and back) and assist firms facing the global-local problem. The paper also identifies future avenues of research for the employer branding field.

Keywords: authenticity, counter-signals, employer branding, global-local problem, signaling theory

Procedia PDF Downloads 361
26868 Prosperous Digital Image Watermarking Approach by Using DCT-DWT

Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar

Abstract:

Every day, tons of data are embedded in digital media or distributed over the internet. The data are distributed in such a way that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information, known as a watermark, into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image or text impressed onto paper that provides evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on implementing watermarking in images. The main consideration for any watermarking scheme is its robustness to various attacks.
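
A minimal embedding step of a combined DWT-DCT scheme might look like the Python sketch below (using PyWavelets and SciPy); the choice of the LL subband, the coefficient band, and the strength alpha are illustrative assumptions rather than the authors' exact scheme.

```python
import numpy as np
import pywt
from scipy.fft import dctn, idctn

def embed_watermark(host, wm_bits, alpha=10.0):
    """Hedged sketch of DWT+DCT embedding: the watermark bits are added to
    DCT coefficients of the LL subband of a one-level Haar DWT."""
    cA, (cH, cV, cD) = pywt.dwt2(host.astype(float), "haar")
    C = dctn(cA, norm="ortho")
    # spread the bits over a band of non-DC coefficients
    rows, cols = np.unravel_index(np.arange(8, 8 + wm_bits.size), C.shape)
    C[rows, cols] += alpha * (2 * wm_bits - 1)      # map {0,1} -> {-1,+1}
    cA_marked = idctn(C, norm="ortho")
    return pywt.idwt2((cA_marked, (cH, cV, cD)), "haar")

host = np.random.default_rng(1).integers(0, 256, (128, 128)).astype(float)
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])
marked = embed_watermark(host, bits)
print("mean absolute change:", np.abs(marked - host).mean())
```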

Keywords: watermarking, digital, DCT-DWT, security

Procedia PDF Downloads 417
26867 Comparison of the Results of a Parkinson’s Holter Monitor with Patient Diaries, in Real Conditions of Use: A Sub-Analysis of the MoMoPa-EC Clinical Trial

Authors: Alejandro Rodríguez-Molinero, Carlos Pérez-López, Jorge Hernández-Vara, Àngels Bayes-Rusiñol, Juan Carlos Martínez-Castrillo, David A. Pérez-Martínez

Abstract:

Background: Monitoring motor symptoms in Parkinson's patients is often a complex and time-consuming task for clinicians, as Hauser diaries are often poorly completed by patients. Recently, new automatic devices (a Parkinson's holter: STAT-ON®) have been developed that are capable of monitoring patients' motor fluctuations. The MoMoPa-EC clinical trial (NCT04176302) investigates which of the two methods produces better clinical results. In this sub-analysis, the concordance between the two methods is analyzed. Methods: The MoMoPa-EC clinical trial will include 164 patients with moderate-severe Parkinson's disease and at least two hours a day in the Off state. At the time of recruitment, all of them completed a seven-day motor fluctuation diary at home (Hauser diary) while wearing the Parkinson's holter. This sub-analysis includes 71 patients with complete data for the purpose of this comparison. The intraclass correlation coefficient between the patient diary entries and the Parkinson's holter data was calculated for time in On, time in Off, and time with dyskinesias. Results: The intraclass correlation coefficient of the two methods was 0.57 (95% CI: 0.3 to 0.74) for daily time in Off (%), 0.48 (95% CI: 0.14 to 0.68) for daily time in On (%), and 0.37 (95% CI: -0.04 to 0.62) for daily time with dyskinesias (%). Conclusions: The two methods have moderate agreement with each other. We will have to wait for the results of the MoMoPa-EC project to estimate which of them has the greatest clinical benefit. Acknowledgment: This work is supported by AbbVie S.L.U., the Instituto de Salud Carlos III [DTS17/00195], and the European Fund for Regional Development, 'A way to make Europe'.
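
For readers who want to reproduce this kind of agreement analysis on their own data, the intraclass correlation can be computed in a few lines of Python; the pingouin call below is one common way to do it, and the data frame contents are made-up illustrative values, not trial data.

```python
import pandas as pd
import pingouin as pg

# Illustrative data frame: percentage of the day in Off, per patient,
# measured once by the diary and once by the sensor (values are made up).
df = pd.DataFrame({
    "patient": [1, 1, 2, 2, 3, 3, 4, 4],
    "method":  ["diary", "holter"] * 4,
    "off_pct": [30, 34, 12, 18, 45, 40, 25, 22],
})
icc = pg.intraclass_corr(data=df, targets="patient", raters="method",
                         ratings="off_pct")
print(icc[["Type", "ICC", "CI95%"]])
```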

Keywords: Parkinson, sensor, motor fluctuations, dyskinesia

Procedia PDF Downloads 224
26866 Machine Learning Data Architecture

Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap

Abstract:

Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and vending of model output into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort in building the other components of the architecture, which we do not believe is the best use of scientists' bandwidth. We propose a system architecture, created using AWS services, that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists' work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.

Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning

Procedia PDF Downloads 59
26865 Gaze Patterns of Skilled and Unskilled Sight Readers Focusing on the Cognitive Processes Involved in Reading Key and Time Signatures

Authors: J. F. Viljoen, Catherine Foxcroft

Abstract:

Expert sight readers rely on their ability to recognize patterns in scores, their inner hearing, and their prediction skills in order to perform complex sight reading exercises. They also have the ability to observe deviations from expected patterns in musical scores. This increases the eye-hand span (reading ahead of the point of playing), allowing them to process the elements in the score. The study aims to investigate the gaze patterns of expert and non-expert sight readers, focusing on key and time signatures. Twenty musicians were tasked with playing twelve sight reading examples composed for one hand and five examples composed for two hands on a piano keyboard. These examples were composed in different keys and time signatures and included accidentals and changes of time signature to test this theory. Results showed that the experts fixated more often, and for longer, on key and time signatures as well as on deviations in the examples for two hands than the non-expert group. The inverse was true for the examples for one hand, where expert sight readers showed fewer and shorter fixations on key and time signatures as well as deviations. This seems to suggest that experts focus more on key and time signatures, as well as deviations, in complex scores to facilitate sight reading. The examples written for one hand appeared to be too easy for the expert sight readers, compromising the gaze patterns.

Keywords: cognition, eye tracking, musical notation, sight reading

Procedia PDF Downloads 136
26864 A Case Study on the Long-Term Stability Monitoring of Underground Powerhouse Complex Using Geotechnical Instrumentation

Authors: Sudhakar Kadiyala, Sripad R. Naik

Abstract:

A large cavern in the Bhutan Himalayas has been monitored since the construction period; the behavior of the cavern has now been monitored for the last 16 years. Instrumentation includes measurement of the convergence of the high walls by geodetic monitoring and of the load on the support systems with load cells and instrumented bolts. Analysis of the instrumentation results showed that, during the construction period, the convergence of the cavern varied from 181 to 233 mm in the unit bay area, with a maximum convergence rate of 2.80 mm/day. During the operational period, the total convergence observed was in the range of 21 to 45 mm over a period of 11.30 years, with convergence rates of 0.005 to 0.011 mm/day. During the last five years, there were no instances of high tensile stress recorded by the instrumented bolts, and loads on the rock bolts have shown a stabilization trend at most locations. This paper discusses in detail the results of long-term monitoring using the geotechnical instruments and how the data are being used in a 3D numerical model to confirm the stability of the cavern.

Keywords: convergence, displacements, geodetic monitoring, long-term stability

Procedia PDF Downloads 177
26863 A Comparison of Image Data Representations for Local Stereo Matching

Authors: André Smith, Amr Abdel-Dayem

Abstract:

The stereo matching problem, while having been studied for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how well these data representations reduce the cost of the correct correspondence relative to other possible matches.
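
As a point of reference for what a local cost function does, the Python sketch below builds a disparity map from a sum-of-absolute-differences (SAD) cost over grayscale intensities; it is a deliberately minimal baseline, and swapping the input representation (for example, different colour channels or spaces) is exactly the kind of variation the paper evaluates.

```python
import numpy as np

def disparity_sad(left, right, max_disp=16, window=5):
    """Minimal local stereo matcher: for each pixel, pick the disparity whose
    sum of absolute differences (SAD) over a square window is smallest."""
    h, w = left.shape
    pad = window // 2
    costs = np.full((max_disp, h, w), np.inf)
    for d in range(max_disp):
        diff = np.abs(left[:, d:] - right[:, : w - d])        # shifted comparison
        # box-filter the absolute differences to aggregate over the window
        sad = np.pad(diff, pad, mode="edge")
        sad = np.lib.stride_tricks.sliding_window_view(sad, (window, window)).sum(axis=(2, 3))
        costs[d, :, d:] = sad
    return costs.argmin(axis=0)        # disparity map in left-image coordinates

rng = np.random.default_rng(0)
right = rng.random((60, 80))
left = np.roll(right, 4, axis=1)        # synthetic horizontal shift of 4 pixels
print(np.median(disparity_sad(left, right)))   # ~4
```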

Keywords: colour data, local stereo matching, stereo correspondence, disparity map

Procedia PDF Downloads 366
26862 Study of Motion of Impurity Ions in Poly(Vinylidene Fluoride) from View Point of Microstructure of Polymer Solid

Authors: Yuichi Anada

Abstract:

The electrical properties of a polymer solid are characterized by the dielectric relaxation phenomenon. The complex permittivity shows a strong dependence on the frequency of the external stimulation over the broad frequency range from 0.1 mHz to 10 GHz. The complex-permittivity dispersion gives us a lot of useful information about the molecular motion of polymers and the structure of polymer aggregates. However, the large dispersion of permittivity at low frequencies due to the DC conduction of impurity ions often masks the dielectric relaxation in the polymer solid. In experimental investigations, many researchers have long tried to remove the DC conduction experimentally or analytically. Our laboratory, on the other hand, chose another approach to this problem based on a reversal in thinking: we use the impurity ions responsible for DC conduction as a probe to detect the motion of polymer molecules and to investigate the structure of polymer aggregates. In addition to the complex permittivity, the electric modulus and the conductivity relaxation time are strong tools for investigating ionic motion in DC conduction. In the non-crystalline part of melt-crystallized polymers, free spaces of inhomogeneous size exist between crystallites. As the impurity ions reside in the non-crystalline part and move through these inhomogeneous free spaces, the motion of the ions reflects the microstructure of the non-crystalline part. The ionic motion of impurity ions in poly(vinylidene fluoride) (PVDF) is investigated in this study. The frequency dependence of the loss permittivity of PVDF shows a characteristic of direct current (DC) conduction below 1 kHz at 435 K. The electric modulus-frequency curve shows a dispersion with a single conductivity relaxation time, namely a Debye-type dispersion. The conductivity relaxation time obtained from this curve is 0.00003 s at 435 K. From a plot of the conductivity relaxation time of PVDF, together with other polymers, against permittivity, it was found that there are two groups of polymers: one characterized by a small conductivity relaxation time and a large permittivity, and the other by a large conductivity relaxation time and a small permittivity.
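
In the standard treatment (a reminder of the textbook relations, not a result specific to this study), the electric modulus is the inverse of the complex permittivity, and for a medium dominated by DC conduction it shows exactly the single-relaxation-time (Debye-type) dispersion described above:

\[ M^{*}(\omega) = \frac{1}{\varepsilon^{*}(\omega)} = M'(\omega) + i\,M''(\omega), \qquad M^{*}(\omega) = \frac{1}{\varepsilon_{s}}\,\frac{i\omega\tau_{\sigma}}{1 + i\omega\tau_{\sigma}}, \]

\[ M''(\omega) = \frac{1}{\varepsilon_{s}}\,\frac{\omega\tau_{\sigma}}{1 + \omega^{2}\tau_{\sigma}^{2}}, \qquad \tau_{\sigma} = \frac{\varepsilon_{0}\,\varepsilon_{s}}{\sigma_{\mathrm{DC}}}, \]

so the loss modulus peaks at \(\omega\tau_{\sigma} = 1\), which is how the conductivity relaxation time quoted in the abstract is read off the modulus-frequency curve.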

Keywords: conductivity relaxation time, electric modulus, ionic motion, permittivity, poly(vinylidene fluoride), DC conduction

Procedia PDF Downloads 167
26861 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of a high volume and a broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purpose of developing their business. The recently decentralized data management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of such techniques is to gather knowledge from every domain and from all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 424
26860 Qualitative and Quantitative Characterization of Generated Waste in Nouri Petrochemical Complex, Assaluyeh, Iran

Authors: L. Heidari, M. Jalili Ghazizade

Abstract:

In recent years, different petrochemical complexes have been established to produce aromatic compounds. Among them, the Nouri Petrochemical Complex (NPC), located in the south of Iran, is the largest producer of aromatic raw materials in the world. Environmental concerns have been raised in this region due to the different types of solid waste generated in the aromatics production process, and consequently industrial waste characterization has been thoroughly considered. The aim of this study is the qualitative and quantitative characterization of the industrial waste generated in the aromatics production process and the determination of the best method for industrial waste management. For this purpose, all industrial waste generated during the production process was identified using a checklist. Four main industrial wastes were identified: spent industrial soil, spent catalyst, spent molecular sieves and spent N-formyl morpholine (NFM) solvent. The amounts of heavy metals and organic compounds in these wastes were further measured in order to identify the nature and toxicity of these hazardous materials. The industrial wastes were then classified based on the laboratory results as well as different international lists for hazardous waste identification, such as those of the EPA, UNEP and the Basel Convention. Finally, the best disposal method is selected based on environmental, economic and technical aspects.

Keywords: aromatic compounds, industrial soil, molecular sieve, normal formyl morpholine solvent

Procedia PDF Downloads 229
26859 Satellite-Based Drought Monitoring in Korea: Methodologies and Merits

Authors: Joo-Heon Lee, Seo-Yeon Park, Chanyang Sur, Ho-Won Jang

Abstract:

Satellite-based remote sensing techniques have been widely used in drought and environmental monitoring to overcome the weaknesses of in-situ monitoring. Remote sensing offers many advantages for drought watch in terms of data accessibility, monitoring resolution and the types of available hydro-meteorological data, including environmental variables. This study focused on the applicability of satellite-imagery-based drought monitoring by applying it to historical drought events that had a huge meteorological, agricultural, and hydrological impact. Satellite-based drought indices, namely the Standardized Precipitation Index (SPI) using Tropical Rainfall Measuring Mission (TRMM) and Global Precipitation Measurement (GPM) data; the Vegetation Health Index (VHI) using MODIS-based Land Surface Temperature (LST) and Normalized Difference Vegetation Index (NDVI); and the Scaled Drought Condition Index (SDCI), were evaluated to assess their capability to handle the complex topography of the Korean peninsula. While the VHI was accurate in capturing moderate drought conditions in agricultural drought-damaged areas, the SDCI performed relatively well in hydrological drought-damaged areas. In addition, this study found correlations among the various drought indices and assessed their applicability using the Receiver Operating Characteristic (ROC) method, which will expand our understanding of the relationships between hydro-meteorological variables and drought events at the global scale. The results of this research are expected to assist decision makers in taking timely and appropriate action in order to save millions of lives in drought-damaged areas.
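
Of the indices named above, the VHI has a particularly compact definition, combining a vegetation condition index (VCI) from NDVI with a temperature condition index (TCI) from LST; the Python sketch below follows the usual Kogan formulation, with an equal weighting assumed for illustration.

```python
import numpy as np

def vegetation_health_index(ndvi, lst, alpha=0.5):
    """Kogan-style VHI from multi-year NDVI and LST stacks of shape
    (years, rows, cols); alpha = 0.5 weights vegetation and thermal
    conditions equally (the usual default)."""
    ndvi_min, ndvi_max = ndvi.min(axis=0), ndvi.max(axis=0)
    lst_min, lst_max = lst.min(axis=0), lst.max(axis=0)
    vci = 100.0 * (ndvi[-1] - ndvi_min) / (ndvi_max - ndvi_min)   # Vegetation Condition Index
    tci = 100.0 * (lst_max - lst[-1]) / (lst_max - lst_min)       # Temperature Condition Index
    return alpha * vci + (1.0 - alpha) * tci                      # low VHI -> drought stress

rng = np.random.default_rng(0)
ndvi = rng.uniform(0.1, 0.8, (10, 4, 4))   # illustrative stacks
lst = rng.uniform(280, 320, (10, 4, 4))
print(vegetation_health_index(ndvi, lst).round(1))
```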

Keywords: drought monitoring, moderate resolution imaging spectroradiometer (MODIS), remote sensing, receiver operating characteristic (ROC)

Procedia PDF Downloads 322
26858 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations

Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe

Abstract:

In its simplest form, data quality can be defined as 'fitness for use', and it is a multi-dimensional concept. Emergency Departments (EDs) require information to treat patients, and at the same time they are the primary source of information regarding accidents, injuries, emergencies, etc. They are also the starting point of various patient registries, databases and surveillance systems. This interventional study was carried out at the ED of the National Hospital of Sri Lanka (NHSL), the premier trauma care centre in Sri Lanka, to improve data quality by introducing an eHealth solution. The study consisted of three components. A research study was conducted to assess the quality of data in relation to five selected dimensions of data quality, namely accuracy, completeness, timeliness, legibility and reliability. The intervention was to develop and deploy an electronic emergency department information system (eEDIS). Post-intervention assessment confirmed that all five dimensions of data quality had improved, with the most significant improvements noticed in the accuracy and timeliness dimensions.

Keywords: electronic health records, electronic emergency department information system, emergency department, data quality

Procedia PDF Downloads 268
26857 An Emergentist Defense of Incompatibility between Morally Significant Freedom and Causal Determinism

Authors: Lubos Rojka

Abstract:

The common perception of morally responsible behavior is that it presupposes freedom of choice, and that free decisions and actions are not determined by natural events, but by a person. In other words, the moral agent has the ability and the possibility of doing otherwise when making morally responsible decisions, and natural causal determinism cannot fully account for morally significant freedom. The incompatibility between a person's morally significant freedom and causal determinism appears to be a natural position. Nevertheless, some of the most influential philosophical theories on moral responsibility are compatibilist or semi-compatibilist, and they exclude the requirement of alternative possibilities, which contradicts the claims of classical incompatibilism. The compatibilists often employ Frankfurt-style thought experiments to prove their theory. The goal of this paper is to examine the role of imaginary Frankfurt-style examples in compatibilist accounts. More specifically, the compatibilist accounts defended by John Martin Fischer and Michael McKenna will be inserted into the broader understanding of a person elaborated by Harry Frankfurt, Robert Kane and Walter Glannon. Deeper analysis reveals that the exclusion of alternative possibilities based on Frankfurt-style examples is problematic and misleading. A more comprehensive account of moral responsibility and morally significant (source) freedom requires higher-order complex theories of human will and consciousness, in which rational and self-creative abilities and a real possibility to choose otherwise, at least on some occasions during a lifetime, are necessary. Theoretical moral reasons and their logical relations seem to require a sort of higher-order agent-causal incompatibilism. The ability to engage in theoretical or abstract moral reasoning requires complex (strongly emergent) mental and conscious properties, among which are an effective free will, together with first- and second-order desires. Such a hierarchical theoretical model unifies reasons-responsiveness, mesh theory and emergentism. It is incompatible with physical causal determinism, because such determinism only allows non-systematic processes that may be hard to predict, but not complex (strongly) emergent systems. An agent's effective will and conscious reflectivity are the starting point of a morally responsible action, which explains why a decision is 'up to the subject'. A free decision does not always have a complete causal history. This kind of emergentist source hyper-incompatibilism seems to be the best direction for the search for an adequate explanation of moral responsibility in the traditional (merit-based) sense. Physical causal determinism as a universal theory would exclude morally significant freedom and responsibility in the traditional sense, because it would exclude the emergence of, and supervenience by, the essential complex properties of human consciousness.

Keywords: consciousness, free will, determinism, emergence, moral responsibility

Procedia PDF Downloads 162
26856 Data Presentation of Lane-Changing Events Trajectories Using HighD Dataset

Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba

Abstract:

We present a descriptive analysis of lane-changing events on multi-lane roads. The data come from the Highway Drone Dataset (HighD), which contains microscopic vehicle trajectories recorded on highways. This paper describes and analyses the role of the different parameters and their significance. Using the HighD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We used the programming language R for processing these data. We analyze the involvement and relationships of the different variables for the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and according to the maneuver it undertook (overtaking or falling back).
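
Although the analysis described here was carried out in R, the flavour of the preprocessing can be shown with a short Python/pandas sketch; the file name and the column names (id, frame, laneId, xVelocity, dhw, thw) follow the published HighD track-file convention but should be treated as assumptions to verify against the actual CSVs.

```python
import pandas as pd

# Hedged sketch: assumed HighD-style track file and column names.
tracks = pd.read_csv("01_tracks.csv")

# A lane change is a frame where a vehicle's laneId differs from the previous frame.
tracks = tracks.sort_values(["id", "frame"])
tracks["lane_change"] = tracks.groupby("id")["laneId"].diff().fillna(0).ne(0)

# Summarise speed and gap variables at the lane-change frames.
events = tracks[tracks["lane_change"]]
print(events[["id", "frame", "xVelocity", "dhw", "thw"]].describe())
```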

Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process

Procedia PDF Downloads 250
26855 Sense of the Place and Human Multisensory Perceptions: The Case of Kerman Old Bazaar Scents

Authors: Sabra Saeidi

Abstract:

When we talk about tangible heritage, the first thing that comes to mind is historic places: what they look like, who made them, and what materials they are made of. But a monument is not limited to its physical constituents; it is a complex, interrelated set of human perceptions, memories, narratives, and the structure that shapes its character. In this article, drawing on the ideas of two great architects, Juhani Pallasmaa and Christian Norberg-Schulz, we discuss the sense of the place and how the presence of a human being in a place with all of their senses (visual, auditory, tactile, olfactory, taste) gives life and value to it. This value is all about feeling and definition, and it is recorded in the form of our memories. An attempt has been made to show that our perception of the environment, through our sensory tools, is an intangible, thematic heritage in itself, whose existence depends on our existence and has no less value than a monument's physical form and structure. The sense of smell is one of the most powerful, personal, inexplicable, unrecorded, and unexpressed senses, and it has a solid connection with our memories. By reviewing the case of the Kerman Bazaar and its change of use in recent years, we suggest that one of the ways to protect the olfactory heritage of this valuable complex is to draw a smellscape: a way to record the scents of the present moment and of past memories. Smellscapes are tools for transferring the sense of smell to a visual form, in order to record scents and understand them in a more comprehensive, common, and artistic form.

Keywords: sense of the place, spirit of the place, smellscape, multisensory perception

Procedia PDF Downloads 108
26854 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photons Beams in Varian Linear Accelerator

Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain

Abstract:

Objective: The main purpose of this study is to compare the percent depth doses (PDDs) and the in-plane and cross-plane profiles of the Varian golden beam data to the measured data of 6 and 18 MV photons, for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires an extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles and dose-rate tables for open and wedged fields into the treatment planning system, enabling it to calculate MUs and dose distributions. Varian offers a generic set of beam data as reference data; however, it is not recommended for clinical use. In this study, we compared the generic beam data with the measured beam data to evaluate whether the generic beam data are reliable enough to be used for clinical purposes. Methods and material: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a Semiflex ion chamber and MEPHYSTO software. The Varian golden beam data, available online, were compared with the measured data to evaluate whether the golden beam data are accurate enough to be used for the commissioning of the Eclipse treatment planning system. Results: The deviation between the measured and golden beam data was at most about 2%. In the PDDs, the deviation increases more at deeper depths than at shallower depths. Similarly, the profiles show the same trend of increasing deviation at large field sizes and increasing depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance, and the golden beam data can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.

Keywords: percent depth dose, flatness, symmetry, golden beam data

Procedia PDF Downloads 483
26853 Variable-Fidelity Surrogate Modelling with Kriging

Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans

Abstract:

Variable-fidelity surrogate modelling offers an efficient way to approximate function data available at multiple degrees of accuracy, each with a different computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, each enhanced with gradient data of a different degree of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile to incorporate gradient data of varying degrees of accuracy.
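
The recursive coupling idea can be illustrated with a plain (gradient-free) two-level sketch in Python using scikit-learn Gaussian processes; the test functions, sample locations and scalar scaling factor rho are illustrative assumptions, and the paper's gradient-enhanced formulation is considerably richer.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy pair of fidelities (standard multi-fidelity test functions, used here
# only for illustration; the paper additionally exploits gradient data).
f_hi = lambda x: (6 * x - 2) ** 2 * np.sin(12 * x - 4)
f_lo = lambda x: 0.5 * f_hi(x) + 10 * (x - 0.5) - 5

X_lo = np.linspace(0, 1, 11).reshape(-1, 1)     # cheap, plentiful samples
X_hi = np.array([[0.0], [0.4], [0.6], [1.0]])   # expensive, scarce samples

gp_lo = GaussianProcessRegressor(kernel=RBF(0.2)).fit(X_lo, f_lo(X_lo).ravel())

# Recursive (two-level) CoKriging: high fidelity = rho * low-fidelity prediction
# + a discrepancy GP fitted on the high-fidelity residuals.
y_lo_at_hi = gp_lo.predict(X_hi)
rho = np.polyfit(y_lo_at_hi, f_hi(X_hi).ravel(), 1)[0]
gp_delta = GaussianProcessRegressor(kernel=RBF(0.2)).fit(
    X_hi, f_hi(X_hi).ravel() - rho * y_lo_at_hi)

X_test = np.linspace(0, 1, 5).reshape(-1, 1)
y_pred = rho * gp_lo.predict(X_test) + gp_delta.predict(X_test)
print(np.round(y_pred, 2))
```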

Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients

Procedia PDF Downloads 550
26852 Robust Barcode Detection with Synthetic-to-Real Data Augmentation

Authors: Xiaoyan Dai, Hsieh Yisan

Abstract:

Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes a deep learning-based barcode detection using synthetic-to-real data augmentation. We first augment barcodes themselves; we then augment images containing the barcodes to generate a large variety of data that is close to the actual shooting environments. Comparisons with previous works and evaluations with our original data show that this approach achieves state-of-the-art performance in various real images. In addition, the system uses hybrid resolution for barcode “scan” and is applicable to real-time applications.
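
One way to picture the synthetic-to-real augmentation stage is the following Python sketch, in which a crude striped stand-in for a rendered barcode is randomly rotated, blurred, brightness-jittered and noised before being used as training data; the specific transforms and parameter ranges are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np
from PIL import Image, ImageEnhance, ImageFilter

def augment(barcode_img, rng):
    """Hedged sketch of synthetic-to-real augmentation: random rotation, blur,
    brightness jitter, and sensor-like noise emulate varied shooting
    conditions before the image is pasted into a background scene."""
    img = barcode_img.rotate(rng.uniform(-15, 15), expand=True, fillcolor=255)
    img = img.filter(ImageFilter.GaussianBlur(radius=rng.uniform(0.0, 2.0)))
    img = ImageEnhance.Brightness(img).enhance(rng.uniform(0.6, 1.4))
    arr = np.asarray(img, dtype=np.float32)
    arr += rng.normal(0.0, 8.0, arr.shape)              # additive sensor noise
    return Image.fromarray(np.clip(arr, 0, 255).astype(np.uint8))

rng = np.random.default_rng(0)
barcode = Image.fromarray((np.indices((80, 200)).sum(0) % 8 < 4).astype(np.uint8) * 255)
samples = [augment(barcode, rng) for _ in range(10)]    # augmented training samples
```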

Keywords: barcode detection, data augmentation, deep learning, image-based processing

Procedia PDF Downloads 159