Search results for: complex interactions
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6980

2360 Effect of Joule Heating on Chemically Reacting Micropolar Fluid Flow over Truncated Cone with Convective Boundary Condition Using Spectral Quasilinearization Method

Authors: Pradeepa Teegala, Ramreddy Chetteti

Abstract:

This work emphasizes the effects of heat generation/absorption and Joule heating on chemically reacting micropolar fluid flow over a truncated cone with a convective boundary condition. For this complex fluid flow problem, a similarity solution does not exist; hence, using non-similarity transformations, the governing fluid flow equations and their boundary conditions are transformed into a set of non-dimensional partial differential equations. Several authors have applied the spectral quasilinearization method (SQLM) to ordinary differential equations; here, the resulting nonlinear partial differential equations are solved for the non-similarity solution using this recently developed method. Comparison with previously published work on special cases of the problem shows excellent agreement. The influence of the pertinent parameters, namely the Biot number, Joule heating, heat generation/absorption, chemical reaction, micropolar parameter, and magnetic field, on the physical quantities of the flow is displayed through graphs, and the salient features are explored in detail. Further, the results are analyzed by comparison with two special cases, namely the vertical plate and the full cone, wherever possible.
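The core idea of quasilinearization is to replace the nonlinear problem with a sequence of linear ones (a Newton-Kantorovich iteration). A minimal sketch of that iteration follows, applied to a standard nonlinear test BVP, u'' = 1.5u², u(0) = 4, u(1) = 1, with exact solution u(x) = 4/(1+x)². This is an illustration only: the paper treats partial differential equations with Chebyshev spectral collocation, whereas here an ODE and plain finite differences stand in, so the linear solver differs while the quasilinearization step is the same idea.

```python
import numpy as np

# Quasilinearization for u'' = 1.5*u**2, u(0)=4, u(1)=1.
# Linearized problem per iteration: v'' - 3*u_k*v = -1.5*u_k**2.
N = 100                          # grid intervals (finite differences, not spectral)
x = np.linspace(0.0, 1.0, N + 1)
h = x[1] - x[0]
u = 4.0 - 3.0 * x                # initial guess matching the boundary values

for _ in range(20):
    main = -2.0 / h**2 - 3.0 * u[1:-1]            # diagonal of the linear operator
    off = np.full(N - 2, 1.0 / h**2)              # sub/super diagonals from u''
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    rhs = -1.5 * u[1:-1] ** 2
    rhs[0] -= u[0] / h**2        # known boundary values move to the right-hand side
    rhs[-1] -= u[-1] / h**2
    v = u.copy()
    v[1:-1] = np.linalg.solve(A, rhs)             # solve the linearized BVP
    if np.max(np.abs(v - u)) < 1e-10:             # quadratic convergence in practice
        u = v
        break
    u = v

exact = 4.0 / (1.0 + x) ** 2
max_error = np.max(np.abs(u - exact))
```

The iteration typically converges in a handful of sweeps, and the remaining error is the O(h²) discretization error of the finite-difference stencil.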

Keywords: chemical reaction, convective boundary condition, joule heating, micropolar fluid, spectral quasilinearization method

Procedia PDF Downloads 346
2359 Deployment of Beyond 4G Wireless Communication Networks with Carrier Aggregation

Authors: Bahram Khan, Anderson Rocha Ramos, Rui R. Paulo, Fernando J. Velez

Abstract:

With the growing demand for a new blend of applications, users' dependency on the internet is increasing day by day. Mobile internet users pay close attention to their own experience, especially communication reliability, high data rates, and service stability on the move. This increase in demand is saturating the existing radio frequency bands. To address these challenges, researchers are investigating the best approaches; Carrier Aggregation (CA) is one of the newest innovations and seems able to fulfill the demands of the future spectrum. CA is also one of the most important features of Long Term Evolution-Advanced (LTE-Advanced). To meet the International Mobile Telecommunication-Advanced (IMT-Advanced) requirement of a 1 Gb/s peak data rate, the CA scheme was introduced by 3GPP; it sustains a high data rate by using widespread frequency bandwidth of up to 100 MHz. Technical issues such as the aggregation structure, its implementations, deployment scenarios, control signal techniques, and challenges for the CA technique in LTE-Advanced, with consideration of backward compatibility, are highlighted in this paper. A performance evaluation in macro-cellular scenarios through a simulation approach is also presented, which shows the benefits of applying CA and low-complexity multi-band schedulers to service quality and system capacity. It is concluded that the enhanced multi-band scheduler is less complex than the general multi-band scheduler and performs better for cell radii longer than 1800 m (at a PLR threshold of 2%).
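The bandwidth and peak-rate figures above can be connected with back-of-envelope arithmetic. The sketch below assumes 20 MHz component carriers (the widest single LTE carrier) and an illustrative round-number spectral efficiency of 10 bit/s/Hz; the latter is an assumption for the arithmetic, not a 3GPP-specified value.

```python
# Why carrier aggregation matters for the IMT-Advanced 1 Gb/s peak-rate target.
per_carrier_mhz = 20           # one LTE component carrier
n_carriers = 5                 # CA aggregates up to 5 carriers -> 100 MHz total
aggregated_mhz = per_carrier_mhz * n_carriers

# Assumed spectral efficiency (high-order MIMO + modulation), bit/s/Hz.
spectral_efficiency = 10

peak_rate_gbps = aggregated_mhz * 1e6 * spectral_efficiency / 1e9
```

With these assumptions, 100 MHz of aggregated spectrum is exactly what is needed to reach the 1 Gb/s target, which is why CA is central to LTE-Advanced.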

Keywords: component carrier, carrier aggregation, LTE-advanced, scheduling

Procedia PDF Downloads 198
2358 Event Related Brain Potentials Evoked by Carmen in Musicians and Dancers

Authors: Hanna Poikonen, Petri Toiviainen, Mari Tervaniemi

Abstract:

Event-related potentials (ERPs) evoked by simple tones in the brain have been extensively studied. However, in reality the music surrounding us is spectrally and temporally complex and dynamic. Thus, research using natural sounds is crucial to understanding the operation of the brain in its natural environment. Music is an excellent example of natural stimulation, which, in various forms, has always been an essential part of different cultures. In addition to sensory responses, music elicits vast cognitive and emotional processes in the brain. Compared to laymen, professional musicians have stronger ERP responses when processing individual musical features in simple tone sequences, such as changes in pitch, timbre, and harmony. Here we show that the ERP responses evoked by rapid changes in individual musical features are more intense in musicians than in laymen also while listening to long excerpts of the composition Carmen. Interestingly, for professional dancers, the amplitudes of the cognitive P300 response are weaker than for musicians but still stronger than for laymen. The cognitive P300 latencies of musicians are also significantly shorter, whereas those of laymen are significantly longer. In contrast, the sensory N100 does not differ in amplitude or latency between musicians and laymen. These results, acquired with a novel ERP methodology for natural music, suggest that the brain can be studied with long pieces of natural music using the ERP method of electroencephalography (EEG), as has already been done with functional magnetic resonance imaging (fMRI), as these two brain imaging methods complement each other.

Keywords: electroencephalography, expertise, musical features, real-life music

Procedia PDF Downloads 481
2357 A Communication Signal Recognition Algorithm Based on Holder Coefficient Characteristics

Authors: Hui Zhang, Ye Tian, Fang Ye, Ziming Guo

Abstract:

Communication signal modulation recognition is one of the key technologies in the field of modern information warfare. At present, automatic modulation recognition methods fall into two major categories: maximum likelihood hypothesis testing based on decision theory, and statistical pattern recognition based on feature extraction. The statistical pattern recognition approach, comprising feature extraction and classifier design, is now the most commonly used. With the increasingly complex electromagnetic environment of communications, how to effectively extract the features of various signals at low signal-to-noise ratio (SNR) is a hot topic for scholars in various countries. To solve this problem, this paper proposes a feature extraction algorithm for communication signals based on an improved Holder cloud feature, and an extreme learning machine (ELM) is used to classify the extracted features, addressing the real-time requirements of modern warfare. The algorithm extracts the digital features of the improved cloud model without deterministic information in a low-SNR environment and uses the improved cloud model to obtain more stable Holder cloud features, improving the performance of the algorithm. The approach addresses the difficulty that a simple feature extraction algorithm based on the Holder coefficient has in recognizing signals at low SNR, and it achieves better recognition accuracy. Simulation results show that the approach retains a good classification result at low SNR; even at an SNR of -15 dB, the recognition accuracy still reaches 76%.
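The classifier stage described above, an extreme learning machine, is what gives the method its real-time suitability: the hidden layer is random and only the output layer is trained, with a single least-squares solve. A minimal sketch follows; the two synthetic Gaussian clusters merely stand in for the Holder cloud features of two modulation classes, which the abstract does not specify in detail.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, n_hidden=50):
    """Train an ELM: random hidden layer, least-squares output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random, never trained
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                  # single Moore-Penrose solve
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.sign(np.tanh(X @ W + b) @ beta)

# Toy stand-in features for two signal classes, labelled +1 / -1.
X = np.vstack([rng.normal(2.0, 0.5, (100, 2)),
               rng.normal(-2.0, 0.5, (100, 2))])
y = np.hstack([np.ones(100), -np.ones(100)])

model = elm_train(X, y)
accuracy = np.mean(elm_predict(model, X) == y)
```

Because training reduces to one pseudo-inverse, retraining on new feature sets is cheap compared with iterative backpropagation, which is the property the paper relies on.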

Keywords: communication signal, feature extraction, Holder coefficient, improved cloud model

Procedia PDF Downloads 153
2356 An Investigation on MgAl₂O₄ Based Mould System in Investment Casting Titanium Alloy

Authors: Chen Yuan, Nick Green, Stuart Blackburn

Abstract:

The investment casting process offers great freedom of design combined with the economic advantage of near-net-shape manufacturing. It is widely used for the production of high-value precision cast parts, particularly in the aerospace sector. Various combinations of materials have been used to produce ceramic moulds, but most investment foundries use a silica-based binder system in conjunction with fused silica, zircon, and alumino-silicate refractories as both filler and coarse stucco materials. However, in the context of advancing alloy technologies, silica-based systems are struggling to keep pace, especially when net-shape casting titanium alloys. Studies have shown that the casting of titanium-based alloys presents considerable problems, including extensive interactions between the metal and the refractory; the majority of metal-mould interaction is due to the reduction of silica, present as binder and filler phases, by titanium in the molten state. Cleaner, more refractory systems are being devised to accommodate these changes. Although yttria has excellent chemical inertness towards titanium alloys, it is not very practical in a production environment, combining high material cost, short slurry life, and poor sintering properties. A cost-effective solution to these issues is needed. With limited options for using pure oxides, in this work a silica-free magnesia spinel, MgAl₂O₄, was used as the primary-coat filler and alumina as the binder material to produce the facecoat of the investment casting mould. A comparison system was also studied, with a fraction of the rare-earth oxide Y₂O₃ added to the filler to increase its inertness. The stability of the MgAl₂O₄/Al₂O₃ and MgAl₂O₄/Y₂O₃/Al₂O₃ slurries was assessed by tests including pH, viscosity, zeta-potential, and plate weight measurements, and mould properties such as friability were also measured.
The interaction between the facecoat and the titanium alloy was studied by both a flash re-melting technique and a centrifugal investment casting method. The interaction products between metal and mould were characterized using X-ray diffraction (XRD), scanning electron microscopy (SEM), and energy-dispersive X-ray spectroscopy (EDS). The depth of the oxygen-hardened layer was evaluated by microhardness measurement. Results reveal that introducing a fraction of Y₂O₃ into the magnesia spinel can significantly increase the slurry life and reduce the thickness of the hardened layer formed during centrifugal casting.

Keywords: titanium alloy, mould, MgAl₂O₄, Y₂O₃, interaction, investment casting

Procedia PDF Downloads 111
2355 Using a Card Game as a Tool for Developing a Design

Authors: Matthias Haenisch, Katharina Hermann, Marc Godau, Verena Weidner

Abstract:

Over the past two decades, international music education has been characterized by a growing interest in informal learning for formal contexts and a "compositional turn" that has moved from closed to open forms of composing. This change occurs under the social and technological conditions that permeate 21st-century musical practices. This forms the background of Musical Communities in the (Post)Digital Age (MusCoDA), a four-year joint research project of the University of Erfurt (UE) and the University of Education Karlsruhe (PHK), funded by the German Federal Ministry of Education and Research (BMBF). Both explore songwriting processes as an example of collective creativity in (post)digital communities, one in formal and the other in informal learning contexts. Collective songwriting is studied from a network perspective that allows us to view the boundaries between online and offline, as well as between formal, informal, and hybrid contexts, as permeable, and to reconstruct musical learning practices. By comparing these songwriting processes, possibilities for a pedagogical-didactic interweaving of different educational worlds are highlighted. The subproject of the University of Erfurt therefore investigates school music lessons with the help of interviews, videography, and network maps, analyzing new digital pedagogical and didactic possibilities. As a first step, the international literature on songwriting in the music classroom was examined for design development, focusing on the question of which methods and practices are circulating in the current literature. Results from this stage of the project form the basis for the first instructional design, which will help teachers plan regular music classes and subsequently allow the reconstruction of musical learning practices under these conditions.
In analyzing the literature, we noticed certain structural methods and concepts that recur, such as the Building Blocks method and the pre-structuring of the songwriting process. From these findings, we developed a deck of cards that both captures the current state of research and serves as a method for design development. With this deck of cards, both teachers and students themselves can plan their individual songwriting lessons by independently selecting and arranging topic, structure, and action cards. In terms of science communication, music educators' interactions with the card game provide us with essential insights for developing the first design. The overall goal of MusCoDA is to develop an empirical model of collective musical creativity and learning and an instructional design for teaching music in the postdigital age.

Keywords: card game, collective songwriting, community of practice, network, postdigital

Procedia PDF Downloads 63
2354 Predictive Analytics of Bike Sharing Rider Parameters

Authors: Bongs Lainjo

Abstract:

The evolution and escalation of bike-sharing programs (BSPs) continue unabated. Since the sixties, many countries have introduced different BSP models and strategies, with variations ranging from dockless models to electronic real-time monitoring systems. Reasons for using a BSP include recreation, errands, work, etc., and there is every indication that complex and more innovative rider-friendly systems are yet to be introduced. A bike-sharing system is a computer-controlled system in which individuals can borrow bikes, for a fee or free of charge, for a limited period. Although the popularity of such systems in recent years all around the world has produced many studies of public cycling systems, there have been few studies of the factors influencing public bicycle travel behavior. The objective of this paper is to analyze the current variables established by different operators and streamline them, identifying the most compelling ones using analytics. Given the contents of available databases, there is a lack of uniformity and of a common standard on what is required and what is not; only two factors appear to be common: user type (registered or unregistered) and the duration of each trip. This article uses historical data provided by one operator based in the greater Washington, District of Columbia, USA area. Several variables, including categorical and continuous data types, were screened; eight out of 18 were considered acceptable and contribute significantly to a useful and reliable predictive model. The study thus identifies useful and pragmatic parameters for improving BSP ridership dynamics.
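The screening step (retaining 8 of 18 candidate variables) can be sketched with a simple filter: rank each candidate by the strength of its association with trip duration and keep those above a threshold. The column names, the synthetic data, and the 0.3 cutoff below are all invented for illustration; the abstract does not list the operator's variables or the screening criterion actually used.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Hypothetical rider variables (names are assumptions, not from the study).
data = {
    "temperature":   rng.normal(20, 5, n),
    "hour_of_day":   rng.integers(0, 24, n).astype(float),
    "is_registered": rng.integers(0, 2, n).astype(float),
    "wind_speed":    rng.normal(10, 3, n),
}

# Synthetic target: duration driven mainly by user type and temperature.
duration = (10 + 5 * data["is_registered"] + 0.8 * data["temperature"]
            + rng.normal(0, 2, n))

# Screen by absolute Pearson correlation with the target.
threshold = 0.3
scores = {k: abs(np.corrcoef(v, duration)[0, 1]) for k, v in data.items()}
selected = [k for k, s in scores.items() if s >= threshold]
```

A correlation filter is only the simplest possible screen; in practice one would also check categorical variables with appropriate tests and guard against collinearity among the retained predictors.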

Keywords: sharing program, historical data, parameters, ridership dynamics, trip duration

Procedia PDF Downloads 138
2353 Stakeholders' Engagement Process in the OBSERVE Project

Authors: Elisa Silva, Rui Lança, Fátima Farinha, Miguel José Oliveira, Manuel Duarte Pinheiro, Cátia Miguel

Abstract:

Tourism is one of the global engines of development. With good planning and management, it can be a positive force, bringing benefits to tourist destinations around the world. However, without constraints, well-established boundaries, and constant monitoring, tourism can be very harmful and degrade the destination. In the interest of both the tourism sector and the community, it is important to develop the destination while maintaining its sustainability. The OBSERVE project is an instrument for monitoring and evaluating the sustainability of the Algarve region. Its main priority is to provide environmental, economic, social-cultural, and institutional indicators to support decision-making towards sustainable growth. In pursuit of these objectives, a digital platform is being developed on which the significant indicators will be continuously updated. It is known that the successful development of a tourist region depends on careful planning with the commitment of central and regional government, industry, services, and community stakeholders. Understanding the different perspectives of stakeholders is essential to engaging them in development planning. However, the actual stakeholders' engagement process is complex and not easy to accomplish. Creating a consistent system of indicators designed to monitor and evaluate the sustainability performance of a tourist region requires access to local data and consideration of the full range of values and uncertainties. This paper presents the OBSERVE project and describes the stakeholders' engagement process, highlighting the contributions, ambitions, and constraints.

Keywords: sustainable tourism, stakeholders' engagement, OBSERVE project, Algarve region

Procedia PDF Downloads 168
2352 The Use of Political Savviness in Dealing with Workplace Ostracism: A Social Information Processing Perspective

Authors: Amy Y. Wang, Eko L. Yi

Abstract:

Can vicarious experiences of workplace ostracism affect employees' willingness to voice? Given the increasingly interdependent nature of the modern workplace, in which employees rely on social interactions to fulfill organizational goals, workplace ostracism (the extent to which an individual perceives that he or she is ignored or excluded by others in the workplace) has garnered significant interest from scholars and practitioners alike. Extending beyond conventional studies that largely focus on the perspectives and outcomes of ostracized targets, we address the indirect effects of workplace ostracism on third-party employees embedded in the same social context. Using a social information processing approach, we propose that the ostracism of coworkers acts as political information that influences third-party employees in their decisions to engage in risky and discretionary behaviors such as employee voice. To make sense of and navigate through experiences of workplace ostracism, we posit that both political understanding and political skill allow third-party employees to minimize the risks and uncertainty of voicing. This conceptual model was tested in a study involving 154 supervisor-subordinate dyads at a publicly listed biotechnology firm located in Mainland China. Each supervisor and their direct subordinates formed a work team; each team had between two and four subordinates. Human resources used the master list to distribute the ID-coded questionnaires to the matching names. All studied constructs were measured using existing scales proven effective in previous literature. Hypotheses were tested using confirmatory factor analysis and hierarchical multiple regression. All three hypotheses were supported, showing that employees were less likely to engage in voice behaviors when their coworkers reported having experienced ostracism in the workplace.
Results also showed a significant three-way interaction among coworkers' ostracism, political understanding, and political skill in predicting employee voice, indicating that political savviness is a valuable resource in mitigating ostracism's negative and indirect effects. Our results illustrate that having one's coworkers ostracized indeed adversely impacted an employee's own voice behavior. However, not all individuals reacted passively to the social context; rather, we found that the voice behaviors of politically savvy individuals, those possessing both political understanding and political skill, were less impacted by ostracism in their work environment. At the same time, we found that having only political understanding or only political skill was significantly less effective in mitigating ostracism's negative effects, suggesting a necessary duality of political knowledge and political skill in combatting ostracism. Organizational implications, recommendations, and future research ideas are also discussed.
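A three-way moderation of this kind is tested by regressing voice on the three predictors, their two-way products, and the three-way product. The sketch below simulates data showing the reported pattern (ostracism lowers voice, less so when understanding and skill are both high) and recovers the coefficients with ordinary least squares; the sample size matches the study's 154 dyads, but the data and coefficient values are simulated for illustration, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 154                                 # matches the study's dyad count

# Simulated standardized predictors.
ostracism = rng.normal(size=n)
understanding = rng.normal(size=n)
skill = rng.normal(size=n)

# Simulated outcome: negative main effect of ostracism, positive
# three-way term (savviness buffers the effect). Illustrative values.
voice = (-0.5 * ostracism
         + 0.3 * ostracism * understanding * skill
         + rng.normal(0, 0.5, n))

# Design matrix: intercept, main effects, all two-way and the three-way term.
X = np.column_stack([
    np.ones(n), ostracism, understanding, skill,
    ostracism * understanding, ostracism * skill, understanding * skill,
    ostracism * understanding * skill,
])
beta, *_ = np.linalg.lstsq(X, voice, rcond=None)
main_effect, three_way = beta[1], beta[7]
```

In a real analysis the predictors would be mean-centered before forming products and the interaction would be probed with simple-slopes plots; this sketch only shows the structure of the test.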

Keywords: employee voice, organizational politics, social information processing, workplace ostracism

Procedia PDF Downloads 139
2351 Gas Phase Extraction: An Environmentally Sustainable and Effective Method for the Extraction and Recovery of Metals from Ores

Authors: Kolela J Nyembwe, Darlington C. Ashiegbu, Herman J. Potgieter

Abstract:

Over the past few decades, the demand for metals has increased significantly. This has led to a decline in high-grade ores over time and an increase in mineral complexity and matrix heterogeneity. In addition, there are rising concerns about greener processes and a sustainable environment. These challenges have forced the mining and metals industry to develop new technologies that can economically process and recover metallic values from low-grade ores, from materials with metal content locked up in industrially processed residues (tailings and slag), and from complex-matrix mineral deposits. Several methods have been developed to address these issues, among them ionic liquids (ILs), heap leaching, and bioleaching. Recently, the gas-phase extraction technique has been gaining interest because it eliminates many of the problems encountered in conventional mineral processing methods. The technique relies on the formation of volatile metal complexes, which can be removed from the residual solids by a carrier gas. The complexes can then be reduced by an appropriate method to obtain the metal and to regenerate and recover the organic extractant. Laboratory work on gas-phase extraction has been conducted for the extraction and recovery of aluminium (Al), iron (Fe), copper (Cu), chromium (Cr), nickel (Ni), lead (Pb), and vanadium (V). In all cases, the extraction was found to depend on temperature and mineral surface area. The process technology appears very promising, offers the feasibility of recirculation and organic reagent regeneration, and has the potential to deliver on all the promises of a "greener" process.

Keywords: gas-phase extraction, hydrometallurgy, low-grade ore, sustainable environment

Procedia PDF Downloads 130
2350 Optimising Light Conditions for Recombinant Protein Production in the Microalgal Chlamydomonas reinhardtii Chloroplast

Authors: Saskya E. Carrera P., Ben Hankamer, Melanie Oey

Abstract:

The green alga C. reinhardtii provides a platform for the cheap, scalable, and safe production of complex proteins. Although gene expression in photosynthetic organisms is tightly regulated by light, most expression studies have analysed chloroplast recombinant protein production under constant light. Here, the influence of illumination time and intensity on the expression of GFP and a GFP-PlyGBS (bacterial-lysin) fusion protein was investigated. The expression of both proteins was strongly influenced by the light regime (6-24 hr illumination per day), the light intensity (0-450 µE m⁻²s⁻¹), and the growth condition (photoautotrophic, mixotrophic, and heterotrophic). Heterotrophic conditions resulted in relatively low recombinant protein yields per unit volume, despite high protein yields per cell, due to low growth rates. Mixotrophic conditions exhibited the highest yields at 6 hrs illumination at 200 µE m⁻²s⁻¹ and under continuous low-light illumination (13-16 mg L⁻¹ GFP and 1.2-1.6 mg L⁻¹ GFP-PlyGBS), as these conditions supported good cell growth and cellular protein yields. For GFP-PlyGBS, a ~23-fold increase in protein accumulation per cell and a ~9-fold increase per litre of culture were observed compared to the standard constant 24 hr illumination. The highest yields under photoautotrophic conditions were obtained under 9 hrs illumination (6 mg L⁻¹ GFP and 2.1 mg L⁻¹ GFP-PlyGBS), representing a ~4-fold increase in cellular protein accumulation for GFP-PlyGBS. On a volumetric basis, the highest yield was at 15 hrs illumination (a ~2-fold increase per litre over constant light for GFP-PlyGBS). Optimising illumination conditions to balance growth and protein expression can thus significantly enhance overall recombinant protein production in C. reinhardtii cultures.

Keywords: chlamydomonas reinhardtii, light, mixotrophic, recombinant protein

Procedia PDF Downloads 252
2349 Effect of Cellular Water Transport on Deformation of Food Material during Drying

Authors: M. Imran Hossen Khan, M. Mahiuddin, M. A. Karim

Abstract:

Drying is a food processing technique in which simultaneous heat and mass transfer take place between the surface and the center of the sample. Deformation of food materials during drying is a common physical phenomenon that affects the textural quality and taste of the dried product. Most plant-based food materials are porous and hygroscopic in nature and contain about 80-90% water in different cellular environments: the intercellular environment and the intracellular environment. Transport of this cellular water has a significant effect on material deformation during drying. However, understanding the scale of deformation is very complex due to the diverse nature and structural heterogeneity of food materials. Knowledge of the effect of cellular water transport on the deformation of material during drying is crucial for increasing energy efficiency and obtaining better-quality dried foods. Therefore, the primary aim of this work is to investigate the effect of intracellular water transport on material deformation during drying. In this study, apple tissue was chosen for the investigation. The experiment was carried out using ¹H-NMR T2 relaxometry together with a conventional dryer. The experimental results are consistent with the understanding that transport of intracellular water causes cellular shrinkage associated with the anisotropic deformation of the whole apple tissue. Interestingly, it is found that the deformation of apple tissue takes place at different stages of drying rather than all at once. Moreover, it is found that the penetration rate of heat energy, together with the pressure gradient between the intracellular and intercellular environments, is the force responsible for rupturing the cell membrane.

Keywords: heat and mass transfer, food material, intracellular water, cell rupture, deformation

Procedia PDF Downloads 218
2348 Linux Security Management: Research and Discussion on Problems Caused by Different Aspects

Authors: Ma Yuzhe, Burra Venkata Durga Kumar

Abstract:

The computer is a great invention. As people use computers more and more frequently, the demand for PCs is growing, and the performance of computer hardware is rising to support more complex processing and operation. The operating system, however, which provides the soul of the computer, stalled in its development for a time. Faced with the high price of UNIX (Uniplexed Information and Computing System), batch after batch of personal computer owners could only give up. The Disk Operating System was too simple and left little room for innovation, so it was not a good choice, and MacOS is a special operating system for Apple computers that cannot be widely used on personal computers. In this environment Linux, based on the UNIX system, was born. Linux combines the advantages of its predecessors and is composed of many kernel modules, making it relatively powerful in its core architecture. The Linux system supports all Internet protocols, so it has very good network functionality. Linux supports multiple users, and no user can interfere with another user's files. Linux can also multitask, running different programs independently at the same time. Linux is a completely open-source operating system: users can obtain and modify the source code for free. Because of these advantages, Linux has attracted a large number of users and programmers, and the system is constantly upgraded and improved. Many different versions have been issued, suitable for both community and commercial use. The Linux system has good baseline security because it relies on its file permission system. However, as vulnerabilities and hazards are constantly discovered, the security of operating the system also needs ongoing attention. This article focuses on the analysis and discussion of Linux security issues.
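The per-user file protection mentioned above rests on the owner/group/other permission bits every file carries. A minimal sketch of that model, using Python's standard library on a throwaway temporary file (the 0o640 mode is just an example choice):

```python
import os
import stat
import tempfile

# Create a throwaway file, then give it mode 640:
# owner may read/write, group may read, others get nothing.
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name
os.chmod(path, 0o640)

mode = stat.S_IMODE(os.stat(path).st_mode)
owner_can_write = bool(mode & stat.S_IWUSR)   # owner write bit
other_can_read = bool(mode & stat.S_IROTH)    # "other" read bit

os.unlink(path)                               # clean up
```

This is the mechanism behind "no user can interfere with another user's files": the kernel checks these bits (plus ownership) on every access, before any application code runs.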

Keywords: Linux, operating system, system management, security

Procedia PDF Downloads 107
2347 A Dynamic Mechanical Thermal T-Peel Test Approach to Characterize Interfacial Behavior of Polymeric Textile Composites

Authors: J. R. Büttler, T. Pham

Abstract:

A basic understanding of interfacial mechanisms is important for the development of polymer composites. For this purpose, we need techniques to analyze the quality of interphases, their chemical and physical interactions, and their strength and fracture resistance. To investigate interfacial phenomena in detail, advanced characterization techniques are favorable. Dynamic mechanical thermal analysis (DMTA) using a rheological system is a sensitive tool, and T-peel tests were performed with this system to investigate the temperature-dependent peel behavior of woven textile composites. A model system was made of polyamide (PA) woven fabric laminated with films of polypropylene (PP) or of PP modified by grafting with maleic anhydride (PP-g-MAH). First, control measurements were performed with the PP matrices alone. Polymer melt investigations, as well as the extensional stress, extensional viscosity, and extensional relaxation modulus at -10 °C, 100 °C, and 170 °C, demonstrate similar viscoelastic behavior for films made of PP-g-MAH and the non-modified PP control. Frequency sweeps have shown that PP-g-MAH has a zero-shear viscosity of around 1600 Pa·s and the PP control a similar zero-shear viscosity of 1345 Pa·s. The gelation points are also similar, at 2.42×10⁴ Pa (118 rad/s) for the PP control and 2.81×10⁴ Pa (161 rad/s) for PP-g-MAH. Second, the textile composite was analyzed: the extensional stress of PA66 fabric laminated with either the PP control or PP-g-MAH was investigated at -10 °C, 25 °C, and 170 °C for strain rates of 0.001-1 s⁻¹. The laminates containing the modified PP require more stress for T-peeling. However, the strengthening effect of the modification decreases with increasing temperature, and at 170 °C, just above the melting temperature of the matrix, the difference disappears. Independent of the matrix used in the textile composite, the extensional stress decreases with increasing temperature.
It appears that the more viscous the matrix, the weaker the laminar adhesion. Possibly, the measurement is influenced by the fact that the laminate becomes stiffer at lower temperatures. Adhesive lap-shear testing at room temperature supports the findings obtained with the T-peel test. Additional analysis of the textile composite at the microscopic level confirms that the fibers are well embedded in the matrix: atomic force microscopy (AFM) imaging of a cross-section of the composite shows no gaps between the fibers and the matrix. Measurements of the water contact angle show that the MAH-grafted PP is more polar than the virgin PP, which suggests a more favorable chemical interaction of PP-g-MAH with PA compared to the non-modified PP. Overall, this study indicates that T-peel testing by DMTA is a technique that can provide deeper insight into polymeric textile composites.

Keywords: dynamic mechanical thermal analysis, interphase, polyamide, polypropylene, textile composite

Procedia PDF Downloads 128
2346 Project Time and Quality Management during Construction

Authors: Nahed Al-Hajeri

Abstract:

Time and cost are integral parts of every construction plan and can affect each party's contractual obligations. The performance of both time and cost is usually important to the client and the contractor during the project. Almost all construction projects experience time overruns, and these overruns are always expensive for both client and contractor. Construction of any project inside the gathering centers involves complex management of workforce, materials, plant, machinery, new technologies, etc. It also involves many interdependent parties, such as vendors and structural and functional designers, including various types of specialized engineers, and it requires the support of contractors and specialized contractors. This paper highlights the types of construction delays that cause projects to suffer time and cost overruns, and it discusses the delay causes and factors that contribute to construction sequence delays in oil and gas projects. Construction delay is one of the recurring problems in construction projects, and it has an adverse effect on project success in terms of time, cost, and quality. Some effective methods for minimizing delays in construction projects are identified, such as: 1. site management and supervision, 2. effective strategic planning, 3. clear information and communication channels. Our research studies the types of delay with real examples and statistical results, and suggests solutions to overcome this problem.

Keywords: non-compensable delay, delays caused by force majeure, compensable delay, delays caused by the owner or the owner’s representative, non-excusable delay, delay caused by the contractor or the contractor’s representative, concurrent delay, delays resulting from two separate causes at the same time

Procedia PDF Downloads 239
2345 Culturable Diversity of Halophilic Bacteria in Chott Tinsilt, Algeria

Authors: Nesrine Lenchi, Salima Kebbouche-Gana, Laddada Belaid, Mohamed Lamine Khelfaoui, Mohamed Lamine Gana

Abstract:

Saline lakes are extreme hypersaline environments that are five to ten times saltier than seawater (150 – 300 g L-1 salt concentration). Hypersaline regions differ from each other in salt concentration, chemical composition and geographical location, which determine the nature of the inhabitant microorganisms. In order to explore the diversity of moderately and extremely halophilic Bacteria in Chott Tinsilt (East of Algeria), an isolation program was performed. First, water samples were collected from the saltern during the pre-salt-harvesting phase. Salinity, pH and temperature of the sampling site were determined in situ. Chemical analysis of the water sample indicated that Na+ and Cl- were the most abundant ions. Isolates were obtained by plating out the samples on complex and synthetic media. In this study, seven halophilic bacterial cultures were isolated. Isolates were characterized by Gram’s reaction, cell morphology and pigmentation. Enzymatic assays (oxidase, catalase, nitrate reductase and urease) and optimization of growth conditions were performed. The results indicated that the salinity optima varied from 50 to 250 g L-1, whereas the optimum temperature ranged from 25°C to 35°C. Molecular identification of the isolates was performed by sequencing the 16S rRNA gene. The results showed that these cultured isolates included members belonging to the genera Halomonas, Staphylococcus, Salinivibrio, Idiomarina, Halobacillus, Thalassobacillus and Planococcus, as well as what may represent a new bacterial genus.

Keywords: bacteria, Chott, halophilic, 16S rRNA

Procedia PDF Downloads 278
2344 A Cost Effective Approach to Develop Mid-Size Enterprise Software Adopted the Waterfall Model

Authors: Mohammad Nehal Hasnine, Md Kamrul Hasan Chayon, Md Mobasswer Rahman

Abstract:

Organizational tendencies towards computer-based information processing have been observed noticeably in third-world countries. Many enterprises are taking major initiatives towards a computerized working environment because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises under tight budgets and strict deadlines is always challenging for software engineers. Therefore, we introduce a cost-effective approach to designing mid-size enterprise software using the Waterfall model, one of the SDLC (Software Development Life Cycle) models. To fulfill the research objectives, we developed mid-size enterprise software named “BSK Management System” that assists enterprise software clients with information resource management and performs complex organizational tasks. The Waterfall model phases have been applied to ensure that all functions, user requirements, strategic goals, and objectives are met. In addition, Rich Picture, Structured English, and a Data Dictionary have been implemented and investigated in a proper engineering manner. Furthermore, an assessment survey with 20 participants was conducted to investigate the usability and performance of the proposed software. The survey results indicated that our system features simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.

Keywords: end-user application development, enterprise software design, information resource management, usability

Procedia PDF Downloads 436
2343 Cryptocurrency Realities: Insights from Social and Economic Psychology

Authors: Sarah Marie

Abstract:

In today's dynamic financial landscape, cryptocurrencies represent a paradigm shift characterized by innovation and intense debate. This study probes into their transformative potential and the challenges they present, offering a balanced perspective that recognizes both their promise and pitfalls. Emulating the engaging style of a TED Talk, this research goes beyond academic analysis, serving as a critical bridge to reconcile the perspectives of cryptocurrency skeptics and enthusiasts, fostering a well-informed dialogue. The study employs a mixed-method approach, analyzing current trends, regulatory landscapes, and public perceptions in the cryptocurrency domain. It distinguishes genuine innovators in this field from ostentatious opportunists, echoing the sentiment that real innovation should be separated from mere showmanship. If one is unfamiliar with who is being referenced, they can likely spot them leaning against their Lamborghinis outside "Crypto" conventions, looking greasy. Major findings reveal a complex scenario dominated by regulatory uncertainties, market volatility, and security issues, emphasizing the need for a coherent regulatory framework that balances innovation with risk management and sustainable practices. The study underscores the importance of transparency and consumer protection in fostering responsible growth within the cryptocurrency ecosystem. In conclusion, the research advocates for education, innovation, and ethical governance in the realm of cryptocurrencies. It calls for collaborative efforts to navigate the intricacies of this evolving landscape and to realize its full potential in a responsible, inclusive, and forward-thinking manner.

Keywords: financial landscape, innovation, public perception, transparency

Procedia PDF Downloads 50
2342 Systematic Discovery of Bacterial Toxins Against Plants Pathogens Fungi

Authors: Yaara Oppenheimer-Shaanan, Nimrod Nachmias, Marina Campos Rocha, Neta Schlezinger, Noam Dotan, Asaf Levy

Abstract:

Fusarium oxysporum, a fungus that attacks a broad range of plants and can cause infections in humans, operates across different kingdoms. This pathogen encounters varied conditions, such as temperature, pH, and nutrient availability, in plant and human hosts. The Fusarium oxysporum species complex, pervasive in soils globally, can affect numerous plants, including key crops like tomatoes and bananas. Controlling Fusarium infections can involve biocontrol agents that hinder the growth of harmful strains. Our research developed a computational method to identify toxin domains within a vast number of microbial genomes, leading to the discovery of nine distinct toxins capable of killing bacteria and fungi, including Fusarium. These toxins appear to function as enzymes, causing significant damage to cellular structures, membranes and DNA. We explored biological control using bacteria that produce polymorphic toxins, finding that certain bacteria, non-pathogenic to plants, offer a safe biological alternative for Fusarium management, as they did not harm macrophage cells or C. elegans. Additionally, we elucidated the 3D structures of two toxins with their protective immunity proteins, revealing their function as unique DNases. These potent toxins are likely instrumental in microbial competition within plant ecosystems and could serve as biocontrol agents to mitigate Fusarium wilt and related diseases.

Keywords: microbial toxins, antifungal, Fusarium oxysporum, bacterial-fungal interactions

Procedia PDF Downloads 53
2341 Isolation, Purification and Characterisation of Non-Digestible Oligosaccharides Derived from Extracellular Polysaccharide of Antarctic Fungus Thelebolus Sp. IITKGP-BT12

Authors: Abinaya Balasubramanian, Satyabrata Ghosh, Satyahari Dey

Abstract:

Non-Digestible Oligosaccharides (NDOs) are low-molecular-weight carbohydrates with a degree of polymerization (DP) of 3-20 that are delivered intact to the large intestine. NDOs are gaining attention as effective prebiotic molecules that facilitate the prevention and treatment of several chronic diseases. Recently, NDOs are being obtained by cleaving complex polysaccharides, as this results in high yield and as the former tend to display greater bioactivity. Thelebolus sp. IITKGP BT-12, a recently identified psychrophilic Ascomycetes fungus, has been reported to produce a bioactive extracellular polysaccharide (EPS). The EPS has been proved to possess strong prebiotic activity and anti-proliferative effects. The current study is an attempt to identify and optimise the most suitable method for hydrolysis of the above-mentioned novel EPS into NDOs, and further to purify and characterise the same. Among physical, chemical and enzymatic methods, enzymatic hydrolysis was identified as the best method, and the optimum hydrolysis conditions obtained using response surface methodology were: a reaction time of 24 h, β-(1,3) endo-glucanase concentration of 0.53 U and substrate concentration of 10 mg/ml. The NDOs were purified using gel filtration chromatography and their molecular weights were determined using MALDI-TOF. The major fraction was found to have a DP of 7 and 8. The monomeric units of the NDOs were confirmed to be glucose using TLC and GC-MS/MS analysis. The obtained oligosaccharides proved to be non-digestible when subjected to gastric acidity and salivary and pancreatic amylases, and hence could serve as efficient prebiotics.

Keywords: characterisation, enzymatic hydrolysis, non-digestible oligosaccharides, response surface methodology

Procedia PDF Downloads 127
2340 Influence of Specimen Geometry (10*10*40), (12*12*60) and (5*20*120), on Determination of Toughness of Concrete Measurement of Critical Stress Intensity Factor: A Comparative Study

Authors: M. Benzerara, B. Redjel, B. Kebaili

Abstract:

Concrete cracking becomes an ever more crucial problem as complex structures develop with technological progress. Advances in knowledge of the fracture process today allow better prevention of fracture risk. The resistance to brittle fracture of a quasi-brittle material such as concrete, called toughness, is measured by the critical value of the stress intensity factor K1C at which a crack propagates; it is an intrinsic property of the material. Many studies of concrete reported in the literature were carried out on specimens that are in fact inadequate with respect to this intrinsic characteristic. Starting from this observation, we set up a comparison of the toughness K1C measured on ordinary-concrete specimens of three different prismatic geometries, (10*10*40) cm³, (12*12*60) cm³ and (5*20*120) cm³, containing side notches of various depths simulating cracks. The notches were made using triangular pyramidal sheet-metal plates placed at the center of the specimens at the time of casting and then withdrawn to leave the trace of a crack. The tests were carried out in three-point bending in mode I fracture, using fracture mechanics techniques. The toughness K1C measured with the three specimen geometries gives almost the same results. These values are acceptable and fall within the range determined by various researchers (the toughness of ordinary concrete is around 1 MPa√m). They also reveal an economy offered by the (5*20*120) cm³ geometry; plate specimens can therefore be used in future work to master the toughness of this surprising but always essential complex material that is concrete.
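For a side-notched prismatic beam broken in three-point bending, K1C is conventionally computed from the failure load with the standard single-edge-notched-bend (SENB) expression. The sketch below uses the classical Srawley geometry function for a span-to-depth ratio S/W = 4; the load, span and notch depth are illustrative values, not data from this study.

```python
# Toughness K1C of a single-edge-notched beam (SENB) in three-point bending,
# using the standard Srawley geometry function for span-to-depth ratio S/W = 4.
# All input values below are illustrative, not taken from the study.
import math

def k1c_senb(P, S, B, W, a):
    """P: failure load [N]; S: span [m]; B: width [m]; W: depth [m];
    a: notch depth [m]. Returns K1C in Pa*sqrt(m)."""
    alpha = a / W
    f = (3 * math.sqrt(alpha)
         * (1.99 - alpha * (1 - alpha) * (2.15 - 3.93 * alpha + 2.7 * alpha**2))
         / (2 * (1 + 2 * alpha) * (1 - alpha)**1.5))
    return P * S / (B * W**1.5) * f

# Hypothetical 10 x 10 x 40 cm beam with a 3 cm notch failing at 3 kN:
K = k1c_senb(P=3000.0, S=0.4, B=0.1, W=0.1, a=0.03)   # about 0.58 MPa*sqrt(m)
```

With these illustrative numbers the result falls in the sub-1 MPa√m range the abstract cites for ordinary concrete.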

Keywords: concrete, fissure, specimen, toughness

Procedia PDF Downloads 297
2339 Prediction of Product Size Distribution of a Vertical Stirred Mill Based on Breakage Kinetics

Authors: C. R. Danielle, S. Erik, T. Patrick, M. Hugh

Abstract:

In the last decade there has been an increase in demand for fine grinding due to the depletion of coarse-grained orebodies and the increased processing of finely disseminated minerals and complex orebodies. These ores have provided new challenges in concentrator design because fine and ultra-fine grinding is required to achieve acceptable recovery rates. Therefore, the correct design of a grinding circuit is important for minimizing unit costs and increasing product quality. The use of ball mills for grinding in fine size ranges is inefficient, and therefore vertical stirred grinding mills are becoming increasingly popular in the mineral processing industry due to their well-known high energy efficiency. This work presents a proposed methodology to predict the product size distribution of a vertical stirred mill using a Bond ball mill. The Population Balance Model (PBM) was used to empirically analyze the performance of a vertical mill and a Bond ball mill. The breakage parameters obtained for both grinding mills are compared to determine the possibility of predicting the product size distribution of a vertical mill based on the results obtained from the Bond ball mill. The biggest advantage of this methodology is that most mineral processing laboratories already have a Bond ball mill to perform the tests suggested in this study. Preliminary results show the possibility of predicting the performance of a laboratory vertical stirred mill using a Bond ball mill.
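The batch-grinding form of the Population Balance Model balances first-order breakage out of each size class against material arriving from coarser classes, dm_i/dt = -S_i m_i + sum over j &lt; i of b_ij S_j m_j. A minimal sketch of that balance follows; the selection function S and breakage fractions b are hypothetical round numbers, not the fitted parameters of the study.

```python
# Minimal batch-grinding Population Balance Model (PBM) sketch.
# Size classes run coarse -> fine; S[i] is the selection (breakage-rate)
# function in 1/min and b[i][j] is the fraction of material broken out of
# class j that reports to class i. Values are illustrative, not fitted data.

def grind(m0, S, b, t_end, dt=0.01):
    m = list(m0)
    n = len(m)
    t = 0.0
    while t < t_end:                           # explicit Euler time stepping
        rate = [0.0] * n
        for i in range(n):
            rate[i] -= S[i] * m[i]             # mass broken out of class i
            for j in range(i):                 # mass arriving from coarser classes
                rate[i] += b[i][j] * S[j] * m[j]
        m = [m[i] + dt * rate[i] for i in range(n)]
        t += dt
    return m

S = [0.50, 0.20, 0.0]          # finest class does not break further
b = [[0.0, 0.0, 0.0],
     [0.6, 0.0, 0.0],          # 60 % of broken coarse mass -> middle class
     [0.4, 1.0, 0.0]]          # remainder -> finest class
m = grind([1.0, 0.0, 0.0], S, b, t_end=10.0)
```

Because each column of b sums to one, total mass is conserved at every step, which is the basic sanity check when comparing breakage parameters between the two mills.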

Keywords: bond ball mill, population balance model, product size distribution, vertical stirred mill

Procedia PDF Downloads 291
2338 Observation of the Orthodontic Tooth's Long-Term Movement Using Stereovision System

Authors: Hao-Yuan Tseng, Chuan-Yang Chang, Ying-Hui Chen, Sheng-Che Chen, Chih-Han Chang

Abstract:

Orthodontic tooth treatment has demonstrated a high success rate in clinical studies. It is agreed that orthodontic tooth movement relies on the ability of the surrounding bone and periodontal ligament (PDL) to react to a mechanical stimulus with remodeling processes. However, the mechanism of tooth movement is still unclear. Recent studies focus on the simple compression-tension theory, while studies that directly measure tooth movement are rare. Therefore, tracking tooth movement during orthodontic treatment is very important in clinical practice. The aim of this study is to investigate the mechanical responses of tooth movement during orthodontic treatment. A stereovision system was applied to track the tooth movement of a patient fitted with stamp brackets. The system was established with two cameras whose relative positions were calibrated, and the orthodontic force was measured on a 3D-printed model with a six-axis load cell to determine the initial force application. The results show that the stereovision system's measurement has a maximum error of less than 2%. In the patient-tracking study, the incisor moved about 0.9 mm during 60 days of tracking, and half of the movement occurred in the first few hours. After the orthodontic force had been removed for 100 hours, the incisor had moved back by 0.5 mm, consistent with a relaxation phenomenon. The stereovision system can accurately locate the three-dimensional position of the teeth, and superimposing all data in a common 3D coordinate system allows the complex tooth movement to be integrated.
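For a calibrated two-camera setup of the kind described, the 3D position of a landmark follows from its horizontal disparity between the two rectified images via Z = f·B/d. The sketch below illustrates this relation; the focal length, baseline and pixel coordinates are invented for illustration, since the paper's actual calibration parameters are not given.

```python
# Rectified-stereo triangulation sketch: for two parallel, calibrated cameras,
# a landmark's depth follows from its horizontal pixel disparity.
# All numeric values below are illustrative, not the study's calibration.

def triangulate(xl, yl, xr, f_px, baseline_mm):
    """Return (X, Y, Z) in mm for pixel coordinates (xl, yl) in the left image
    and xr in the right image, measured from each image's principal point."""
    d = xl - xr                      # disparity in pixels
    if d <= 0:
        raise ValueError("point must have positive disparity")
    Z = f_px * baseline_mm / d       # depth: Z = f * B / d
    X = xl * Z / f_px                # back-project into camera coordinates
    Y = yl * Z / f_px
    return X, Y, Z

# A landmark seen at xl=400 px, xr=280 px with f=800 px and a 60 mm baseline:
X, Y, Z = triangulate(400.0, 40.0, 280.0, f_px=800.0, baseline_mm=60.0)
# Z = 800 * 60 / 120 = 400 mm
```

Tracking the same landmark over time and expressing every triangulated point in one fixed coordinate frame is what allows long-term displacements of a fraction of a millimetre to be resolved.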

Keywords: orthodontic treatment, tooth movement, stereovision system, long-term tracking

Procedia PDF Downloads 421
2337 Gender Stereotypes at the Court of Georgia: Perceptions of Attorneys on Gender Bias

Authors: Tatia Kekelia

Abstract:

This paper is part of ongoing research addressing gender discrimination in the Court of Georgia. The research suggests that gender stereotypes influence the processes at the Court in contemporary Georgia, which makes the fight uneven for women and men, not to mention other gender identities. The sub-hypothesis proposes that the gender stereotypes derive from feudal representations, which persisted during Soviet rule. It is precisely those stereotypes that feed gender-based discrimination today. However, this paper's main focus is on the main hypothesis: describing the revealed stereotypes and identifying the Court as a place where their presence most hinders societal development. First of all, this happens by demotivating people, causing loss of trust in the Court, and therefore potentially encouraging crime. Secondly, it becomes harder to adequately mobilize human resources, since more than half of the population is female, and under the influence of rigid or more subtle forms of discrimination, they lose not only equal rights but also the motivation to work or fight for them. Consequently, this paper falls under democracy studies as well, considering that an unbiased Court is one of the most important criteria for assessing the democratic character of a state. As the research crosses the disciplines of sociology, law, and history, a complex of qualitative research methods is applied, among which this paper relies mainly on expert interviews, interviews with attorneys, and desk research. By showcasing and undermining the gender stereotypes at work in the Court of Georgia, this research might assist in raising trust towards it in the long term. As for broader relevance, the study of the Georgian case opens the possibility of conducting comparative analyses in the region and the continent and, presumably, of carving the lines of cultural influences.

Keywords: gender, stereotypes, bias, democratization, judiciary

Procedia PDF Downloads 78
2336 Development of Nondestructive Imaging Analysis Method Using Muonic X-Ray with a Double-Sided Silicon Strip Detector

Authors: I-Huan Chiu, Kazuhiko Ninomiya, Shin’ichiro Takeda, Meito Kajino, Miho Katsuragawa, Shunsaku Nagasawa, Atsushi Shinohara, Tadayuki Takahashi, Ryota Tomaru, Shin Watanabe, Goro Yabu

Abstract:

In recent years, a nondestructive elemental analysis method based on muonic X-ray measurements has been developed and applied for various samples. Muonic X-rays are emitted after the formation of a muonic atom, which occurs when a negatively charged muon is captured in a muon atomic orbit around the nucleus. Because muonic X-rays have higher energy than electronic X-rays due to the muon mass, they can be measured without being absorbed by a material. Thus, estimating the two-dimensional (2D) elemental distribution of a sample became possible using an X-ray imaging detector. In this work, we report a non-destructive imaging experiment using muonic X-rays at Japan Proton Accelerator Research Complex. The irradiated target consisted of polypropylene material, and a double-sided silicon strip detector, which was developed as an imaging detector for astronomical observation, was employed. A peak corresponding to muonic X-rays from the carbon atoms in the target was clearly observed in the energy spectrum at an energy of 14 keV, and 2D visualizations were successfully reconstructed to reveal the projection image from the target. This result demonstrates the potential of the non-destructive elemental imaging method that is based on muonic X-ray measurement. To obtain a higher position resolution for imaging a smaller target, a new detector system will be developed to improve the statistical analysis in further research.

Keywords: DSSD, muon, muonic X-ray, imaging, non-destructive analysis

Procedia PDF Downloads 204
2335 An Information-Based Approach for Preference Method in Multi-Attribute Decision Making

Authors: Serhat Tuzun, Tufan Demirel

Abstract:

Multi-Criteria Decision Making (MCDM) is the modelling of real-life situations to solve the problems we encounter. It is a discipline that aids decision makers faced with conflicting alternatives in making an optimal decision. MCDM problems can be classified into two main categories, Multi-Attribute Decision Making (MADM) and Multi-Objective Decision Making (MODM), based on their different purposes and data types. Although various MADM techniques have been developed for the problems encountered, their methodology is limited in modelling real life. Moreover, objective results are hard to obtain, and the findings are generally derived from subjective data. New and modified techniques built on approaches such as fuzzy logic have been developed, but these comprehensive techniques, even though better at modelling real life, have not found a place in real-world applications because their complex structure makes them hard to apply. These constraints restrict the development of MADM. This study aims to conduct a comprehensive analysis of preference methods in MADM and to propose an information-based approach. For this purpose, a detailed literature review has been conducted, and current approaches have been analyzed with their advantages and disadvantages. Then the approach is introduced. In this approach, performance values of the criteria are calculated in two steps: first by determining the distribution of each attribute and standardizing it, then by calculating the information of each attribute as informational energy.
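The two-step calculation described above, standardizing an attribute's values into a distribution and then computing its Onicescu informational energy E = sum of p_i squared, can be sketched as follows. The scores are hypothetical, and the share-based normalization shown is one plausible reading of the authors' standardization step, not their stated procedure.

```python
# Sketch of the two steps described in the abstract for one attribute:
# (1) standardize the attribute's values into a probability distribution,
# (2) compute its Onicescu informational energy E = sum(p_i ** 2).
# The score values are hypothetical.

def informational_energy(values):
    total = sum(values)
    probs = [v / total for v in values]   # step 1: shares as a distribution
    return sum(p * p for p in probs)      # step 2: informational energy

scores = [7.0, 5.0, 8.0, 4.0]             # one attribute over four alternatives
E = informational_energy(scores)
```

E ranges from 1/n for a uniform distribution (the attribute carries no discriminating information) up to 1 when one alternative holds all the mass, which is what makes it usable as an information-based weight.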

Keywords: literature review, multi-attribute decision making, operations research, preference method, informational energy

Procedia PDF Downloads 224
2334 Performance Analysis of Vision-Based Transparent Obstacle Avoidance for Construction Robots

Authors: Siwei Chang, Heng Li, Haitao Wu, Xin Fang

Abstract:

Construction robots are receiving more and more attention as a promising solution to the manpower shortage in the construction industry. The development of intelligent control techniques that help robots avoid transparent and reflective building obstacles is crucial for guaranteeing the adaptability and flexibility of mobile construction robots in complex construction environments. With the boom in computer vision techniques, a number of studies have proposed vision-based methods for transparent obstacle avoidance to improve operation accuracy. However, vision-based methods also have disadvantages such as high computational costs. To provide better perception and value evaluation, this study analyzes the performance of vision-based techniques for avoiding transparent building obstacles. To achieve this, commonly used sensors, including a lidar, an ultrasonic sensor, and a USB camera, are mounted on the robotic platform to detect obstacles. A Raspberry Pi 3 board runs the data-collection and control algorithms, and a TurtleBot3 Burger is used to test the programs. On-site experiments are carried out to observe performance in terms of success rate and detection distance, with obstacle shapes and environmental conditions as control variables. The findings demonstrate how effective vision-based strategies are for avoiding transparent building obstacles and provide insights and informed knowledge for introducing computer vision techniques in this domain.

Keywords: construction robot, obstacle avoidance, computer vision, transparent obstacle

Procedia PDF Downloads 78
2333 Chemical Fingerprinting of Complex Samples With the Aid of Parallel Outlet Flow Chromatography

Authors: Xavier A. Conlan

Abstract:

Speed of analysis is a significant limitation of current high-performance liquid chromatography/mass spectrometry (HPLC/MS) and ultra-high-pressure liquid chromatography (UHPLC)/MS systems, both of which are used in many forensic investigations. The flow rate limitations of MS detection require a compromise in the chromatographic flow rate, which in turn reduces throughput and, when using modern columns, separation efficiency. Commonly, this restriction is combated by splitting the flow post-column prior to entry into the mass spectrometer. However, this results in a loss of sensitivity and a loss of efficiency due to the post-column extra dead volume. A new chromatographic column format known as 'parallel segmented flow' involves splitting the eluent flow within the column outlet end fitting, and in this study we present its application to interrogating the provenance of methamphetamine samples with mass spectrometry detection. Using parallel segmented flow, column flow rates as high as 3 mL/min were employed in the analysis of amino acids without post-column splitting to the mass spectrometer. Furthermore, when parallel segmented flow chromatography columns were employed, the sensitivity was more than twice that of conventional systems with post-column splitting when the same volume of mobile phase was passed through the detector. These findings suggest that this type of column technology will particularly enhance the capabilities of modern LC/MS, enabling both high throughput and sensitive mass spectral detection.

Keywords: chromatography, mass spectrometry, methamphetamine, parallel segmented outlet flow column, forensic sciences

Procedia PDF Downloads 486
2332 Genetics of Atopic Dermatitis: Role of Cytokine Genes Polymorphisms

Authors: Ghaleb Bin Huraib

Abstract:

Atopic dermatitis (AD), also known as atopic eczema, is a chronic inflammatory skin disease characterized by severe itching and recurrent, relapsing eczema-like skin lesions, affecting up to 20% of children and 10% of adults in industrialized countries. AD is a complex multifactorial disease, and its exact etiology and pathogenesis have not been fully elucidated. The aim of this study was to investigate the impact of gene polymorphisms of the T helper cell subtype Th1 and Th2 cytokines interferon-gamma (IFN-γ), interleukin-6 (IL-6) and transforming growth factor (TGF)-β1 on AD susceptibility in a Saudi cohort. One hundred four unrelated patients with AD and 195 healthy controls were genotyped for the IFN-γ (874A/T), IL-6 (174G/C) and TGF-β1 (509C/T) polymorphisms using the ARMS-PCR and PCR-RFLP techniques. The frequencies of genotypes AA and AT of IFN-γ (874A/T) differed significantly between patients and controls (P < 0.001): genotype AT was increased while genotype AA was decreased in AD patients compared to controls. AD patients also had a higher frequency of T-containing genotypes (AT+TT) than controls (P = 0.001). The frequencies of alleles T and A were statistically different between patients and controls (P = 0.04). The frequencies of genotype GG and allele G of IL-6 (174G/C) were significantly higher, while those of genotype GC and allele C were lower, in AD patients than in controls. There was no significant difference in the frequencies of alleles and genotypes of the TGF-β1 (509C/T) polymorphism between the patient and control groups. These results show that susceptibility to AD is influenced by genotypes of the IFN-γ (874A/T) and IL-6 (174G/C) polymorphisms. It is concluded that the T allele and T-containing genotypes (AT+TT) of IFN-γ (874A/T) and the G allele and GG genotype of IL-6 (174G/C) confer susceptibility to AD in Saudis. On the other hand, the TGF-β1 (509C/T) polymorphism may not be associated with AD risk in our population; however, further studies with large sample sizes are required to confirm these results.
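Allele-level comparisons like those above are conventionally derived by collapsing genotype counts into allele counts and computing an odds ratio with its 95% confidence interval. A sketch with hypothetical counts follows; the abstract reports significance levels, not the raw genotype tables, so the splits below are invented for illustration only.

```python
# Sketch of a standard allele-level association calculation for a
# case-control design. Genotype splits are hypothetical; only the group
# sizes (104 patients, 195 controls) come from the abstract.
from math import exp, log, sqrt

def allele_counts(n_AA, n_AT, n_TT):
    """Each individual carries two alleles; return (A alleles, T alleles)."""
    return 2 * n_AA + n_AT, 2 * n_TT + n_AT

def odds_ratio(case, control):
    """case/control are (risk_allele_count, other_allele_count) pairs."""
    a, b = case
    c, d = control
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)          # standard error of log(OR)
    ci = (exp(log(or_) - 1.96 * se), exp(log(or_) + 1.96 * se))
    return or_, ci

cases    = allele_counts(30, 54, 20)   # 104 AD patients (hypothetical split)
controls = allele_counts(90, 80, 25)   # 195 controls (hypothetical split)
# T is the putative risk allele, so pass counts in (T, A) order:
or_T, ci = odds_ratio((cases[1], cases[0]), (controls[1], controls[0]))
```

An odds ratio above 1 with a confidence interval excluding 1 is what statements like "the T allele confers susceptibility" summarize.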

Keywords: atopic dermatitis, polymorphism, interferon, IL-6

Procedia PDF Downloads 65
2331 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking

Authors: Trevor Toy, Josef Langerman

Abstract:

Around a quarter of the world’s data is generated by the financial sector, with global non-cash transactions estimated to have reached 708.5 billion since 2018. And with Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately and consensually share the data required to enable it. Integration and sharing of anonymised transactional data are still operated in silos, centralised among the large corporate entities in the ecosystem that have the resources to do so. Smaller fintechs generating data, and businesses looking to consume data, are largely excluded from the process. There is therefore a growing demand for accessible transactional data, both for analytical purposes and to support the rapid global adoption of Open Banking. The following research provides a solution framework that aims to offer a secure decentralised marketplace for 1.) data providers to list their transactional data, 2.) data consumers to find and access that data, and 3.) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to themselves. The platform also provides an integrated system for downstream transaction-related data from merchants, enriching the data product available to build a comprehensive view of a data subject’s spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers.
This core component of the platform is built on a decentralised blockchain contract with a market layer that manages the transaction, user, pricing, payment, tagging, contract, control, and lineage features pertaining to user interactions on the platform. One of the platform’s key features is enabling individuals to participate in and manage the personal data they generate. The framework was developed as a proof-of-concept on the Ethereum blockchain, where an individual can securely manage access to their own personal data and to their identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour correlated with key demographic information. This platform solution can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and open banking.

Keywords: big data markets, open banking, blockchain, personal data management

Procedia PDF Downloads 73