Search results for: heterogeneous massive data

24556 Exploring SSD-Suitable Allocation Schemes in Compliance with Workload Patterns

Authors: Jae Young Park, Hwansu Jung, Jong Tae Kim

Abstract:

How well data is parallelized is an important factor in Solid-State Drive (SSD) performance. SSD parallelism is determined by the allocation scheme, which is therefore directly connected to SSD performance. The representative allocation schemes are dynamic allocation and static allocation. Dynamic allocation is more adaptive in exploiting write-operation parallelism, while static allocation is better at read-operation parallelism. It is therefore hard to select the appropriate allocation scheme when the workload mixes read and write operations. We simulated several mixed data patterns and analyzed the results to guide the choice of scheme for better performance. The results show that static allocation is more suitable when the data arrival interval is long enough for prior operations to finish and the environment is continuously read-intensive, whereas dynamic allocation performs best for write performance and random data patterns.
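
The contrast can be sketched in a few lines of Python; the channel count and queueing model below are illustrative assumptions, not details from the paper.

```python
# Minimal sketch (illustrative assumptions): static allocation fixes a
# page's channel by address; dynamic allocation picks the least-busy channel.

NUM_CHANNELS = 4

def static_allocate(logical_page: int) -> int:
    # Channel is a fixed function of the address, so later reads of the
    # same page always know where to look (good read parallelism).
    return logical_page % NUM_CHANNELS

def dynamic_allocate(queue_depths: list[int]) -> int:
    # A write goes to whichever channel is least busy right now (good
    # write parallelism, but read locations become data-dependent).
    return min(range(len(queue_depths)), key=lambda ch: queue_depths[ch])

queues = [3, 0, 2, 1]
print(static_allocate(10))        # always channel 2 for page 10
print(dynamic_allocate(queues))   # channel 1, the idle one
```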

Keywords: dynamic allocation, NAND flash based SSD, SSD parallelism, static allocation

Procedia PDF Downloads 340
24555 Social Data Aggregator and Locator of Knowledge (STALK)

Authors: Rashmi Raghunandan, Sanjana Shankar, Rakshitha K. Bhat

Abstract:

Social media contributes a vast amount of data and information about individuals to the internet. This project greatly reduces the need for manual analysis of large and diverse social media profiles by filtering out and combining the useful information from various profiles and eliminating irrelevant data. It differs from existing social media aggregators in that it does not simply provide a consolidated view of various profiles; instead, it provides consolidated information derived from the subject's posts and other activities. It also allows analysis and analytics across multiple profiles. We strive to offer a query system that gives a natural-language answer to questions when a user does not wish to go through an entire profile. The information provided can be filtered according to the use case.

Keywords: social network, analysis, Facebook, LinkedIn, git, big data

Procedia PDF Downloads 444
24554 Data Integrity between Ministry of Education and Private Schools in the United Arab Emirates

Authors: Rima Shishakly, Mervyn Misajon

Abstract:

Education is similar to other businesses and industries: achieving data integrity is essential in order to provide meaningful support to all stakeholders in the educational sector. Efficient data collection, flow, processing, storage, and retrieval are vital to delivering successful solutions to the different stakeholders. The Ministry of Education (MOE) in the United Arab Emirates (UAE) has adopted 'Education 2020', a series of five-year plans designed to introduce advanced education management information systems. As part of this program, in 2010 the MOE implemented Student Information Systems (SIS) to manage and monitor students' data and the information flow between the MOE and international private schools in the UAE. This paper discusses data integrity concerns between the MOE and private schools, clarifies the data integrity issues, and indicates the challenges that private schools in the UAE face.

Keywords: education management information systems (EMIS), student information system (SIS), United Arab Emirates (UAE), Ministry of Education (MOE), Knowledge and Human Development Authority (KHDA), Abu Dhabi Education Council (ADEC)

Procedia PDF Downloads 222
24553 Investigating the Determinants and Growth of Financial Technology Depth of Penetration among Heterogeneous African Economies

Authors: Tochukwu Timothy Okoli, Devi Datt Tewari

Abstract:

The high rate of Fintech adoption has not translated into greater financial inclusion and development in Africa. This problem is attributed to poor Fintech diversification and usefulness in the continent, a concept referred to in this study as the Fintech depth of penetration. The study therefore assessed its determinants and growth process in a panel of three emerging, twenty-four frontier, and five fragile African economies, disaggregated with dummies over the period 2004-2018 to allow for heterogeneity between groups. The System Generalized Method of Moments (GMM) technique reveals that the average depth of mobile banking and automated teller machine (ATM) usage follows a dynamic, heterogeneous process. Moreover, users' previous experiences/compatibility, trialability/income, and financial development were the major factors that raise its usefulness, whereas perceived risk, financial openness, and the inflation rate significantly limit it. The growth rates of mobile banking, ATM, and internet banking in 2018 were, on average, 41.82, 0.4, and 20.8 per cent higher, respectively, than their 2004 averages. These greater averages after the 2009 financial crisis suggest that countries resort to Fintech as a risk-mitigating tool. This study therefore recommends greater Fintech diversification through improved literacy, institutional development, financial liberalization, and continuous innovation.
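
For readers unfamiliar with the estimator, System GMM analyses of this kind typically start from a dynamic panel specification like the following; the notation is a generic illustration, not the authors' exact model:

```latex
% Generic dynamic panel model underlying System GMM estimation
% (illustrative notation, not the paper's exact specification):
\[
  y_{it} = \alpha\, y_{i,t-1} + \boldsymbol{\beta}^{\top}\mathbf{x}_{it} + \mu_i + \varepsilon_{it},
\]
% where y_{it} is Fintech depth for country i in year t, x_{it} collects
% the determinants (compatibility, trialability, financial development,
% perceived risk, openness, inflation), \mu_i is a country fixed effect,
% and lagged levels and differences of y_{it} serve as internal instruments.
```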

Keywords: depth of fintech, emerging Africa, financial technology, internet banking, mobile banking

Procedia PDF Downloads 130
24552 Towards Balancing a Medical Database by Using the Least Mean Square Algorithm

Authors: Kamel Belammi, Houria Fatrim

Abstract:

An imbalanced data set, a problem often found in real-world applications, can seriously degrade the classification performance of machine learning algorithms. There have been many attempts at dealing with the classification of imbalanced data sets. In medical diagnosis classification, we often face an imbalanced number of data samples between classes, with too few samples in the rare classes. In this paper, we propose a learning method based on a cost-sensitive extension of the Least Mean Square (LMS) algorithm that penalizes errors on different samples with different weights, together with some rules of thumb to determine those weights. After the balancing phase, we apply different classifiers (support vector machine (SVM), k-nearest neighbor (KNN), and multilayer neural networks (MNN)) to the balanced data set. We also compare the results obtained before and after the balancing method.
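
The core idea can be sketched in a few lines; the inverse-frequency cost rule below is an illustrative assumption standing in for the paper's rules of thumb.

```python
import numpy as np

# Cost-sensitive LMS sketch: errors on the rare class are penalized more.
# The cost rule (inverse class frequency) is an illustrative assumption.

def cost_sensitive_lms(X, y, costs, lr=0.01, epochs=50):
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi, ci in zip(X, y, costs):
            err = yi - w @ xi          # prediction error for this sample
            w += lr * ci * err * xi    # update scaled by the sample's cost
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] > 1.2).astype(float)      # rare positive class
freq = y.mean()
costs = np.where(y == 1, 1.0 / freq, 1.0 / (1.0 - freq))
print(cost_sensitive_lms(X, y, costs))
```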

Keywords: multilayer neural networks, k-nearest neighbor, support vector machine, imbalanced medical data, least mean square algorithm, diabetes

Procedia PDF Downloads 532
24551 Quantification and Identification of the Main Components of the Biomass of the Microalgae Scenedesmus sp. – Prospection of Molecules of Commercial Interest

Authors: Carolina V. Viegas, Monique Gonçalves, Gisel Chenard Diaz, Yordanka Reyes Cruz, Donato Alexandre Gomes Aranda

Abstract:

To develop the massive cultivation of microalgae, it is necessary to isolate and characterize the species, improving genetic tools in search of specific characteristics. The detection, identification, and quantification of the compounds composing Scenedesmus sp. were therefore prerequisites to verify the potential of this microalga. The main objective of this work was to characterize Scenedesmus sp. in terms of ash, carbohydrate, protein, and lipid content, as well as to determine the composition of its lipid classes and main fatty acids. The biomass of Scenedesmus sp. showed 15.29 ± 0.23% ash, with CaO (36.17%) the main component of this fraction. The total protein and carbohydrate contents of the biomass were 40.74 ± 1.01% and 23.37 ± 0.95%, respectively, proving it to be a potential source of proteins as well as of carbohydrates for ethanol production via fermentation. The lipid contents extracted via Bligh & Dyer and via in situ saponification were 8.18 ± 0.13% and 4.11 ± 0.11%, respectively. In the lipid extracts obtained via Bligh & Dyer, approximately 50% of the fraction consists of fatty compounds, while the other half is an unsaponifiable fraction composed mainly of chlorophylls, phytosterols, and carotenes. From the lower-yield extraction, it was possible to obtain a selectivity of 92.14% for fatty components (fatty acids and fatty esters), confirmed by infrared spectroscopy. The presence of polyunsaturated acids (~45%) in the lipid extracts indicated the potential of this fraction as a source of nutraceuticals. The results indicate that the biomass of Scenedesmus sp. can become a promising source of polyunsaturated fatty acids, carotenoids, and proteins, allowing the simultaneous recovery of different compounds of high commercial value.

Keywords: microalgae, Desmodesmus, lipid classes, fatty acid profile, proteins, carbohydrates

Procedia PDF Downloads 97
24550 Uruguayan vs. British Press Coverage of a Political Kidnapping

Authors: Luisa Peirano

Abstract:

What began as a middle-class insurgent political movement whose slogan was 'Words divide us. Action unites us!' ultimately mutated into an underground terrorist group that staged a series of armed robberies, kidnappings, and even executions in the 1960s and early 1970s. One of the most memorable was the kidnapping of the British ambassador, Sir Geoffrey Jackson, in January 1971; he was held captive for eight months. The episode, which triggered a massive government response and resulted in the capture of the Tupamaros leaders, continued to have political repercussions decades later when those leaders emerged from prison to re-enter mainstream Uruguayan politics. The kidnapping and its aftermath attracted intense media coverage in Uruguay and Britain, coverage that affected public opinion profoundly. The treatment by the Uruguayan and British media diverged, however. Uruguayan newspapers focused on political issues, mirrored the positions of various political parties, and showed the larger context of social, cultural, and political forces that rocked Latin America in the 1960s and early 1970s. By contrast, the British press limited its attention mainly to the human drama. On the 30th anniversary of Sir Geoffrey Jackson's death, this study compares over one hundred major newspaper articles and suggests some reasons for the differences between Uruguayan and British media treatment in terms of volume, content, and perspective, as well as in the effect on readers. The differences have persisted and continue to matter in present-day coverage of terrorism and its victims.

Keywords: British Ambassador, Churchill Archives Centre, Sir Geoffrey Jackson, political kidnapping, Latin America in the 1960s, Tupamaro guerrillas, Uruguay

Procedia PDF Downloads 202
24549 Anomalous Gel-Fluid Transition of Solid-Supported Lipids

Authors: Asma Poursoroush

Abstract:

Solid-supported lipid bilayers are often used as a simple model for studies of biological membranes. The presence of a solid substrate that interacts attractively with lipid head-groups is expected to affect the phase behavior of the supported bilayer. Molecular dynamics simulations of a coarse-grained model are thus performed to investigate the phase behavior of supported one-component lipid bilayer membranes. Our results show that the attraction of the lipid head groups to the substrate leads to a phase behavior different from that of a free-standing lipid bilayer. In particular, we found that the phase behaviors of the two leaflets are decoupled in the presence of a substrate. The proximal leaflet undergoes a clear gel-to-fluid phase transition at a temperature lower than that of a free-standing bilayer, a temperature that decreases with increasing strength of the substrate-lipid attraction. The distal leaflet, however, changes from a homogeneous liquid phase at high temperatures to a heterogeneous state of small liquid and gel domains, with the average size of the gel domains increasing with decreasing temperature. While the chain order parameter of the proximal leaflet clearly shows a gel-fluid phase transition, that of the distal leaflet does not exhibit a clear transition. The decoupling in the phase behavior of the two leaflets is due to a non-symmetric lipid distribution between the leaflets resulting from the presence of the substrate.
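
For context, the chain order parameter referred to above is commonly computed from the angle between each chain bond and the bilayer normal; the sketch below uses the standard formula on placeholder vectors, not the authors' analysis code.

```python
import numpy as np

# Chain (bond) order parameter sketch: S = 0.5 * (3<cos^2 theta> - 1),
# with theta the angle between each chain bond and the bilayer normal.
# Random vectors stand in for bond vectors from a real trajectory.

def order_parameter(bond_vectors, normal=np.array([0.0, 0.0, 1.0])):
    unit = bond_vectors / np.linalg.norm(bond_vectors, axis=1, keepdims=True)
    cos2 = (unit @ normal) ** 2
    return 0.5 * (3.0 * cos2.mean() - 1.0)

rng = np.random.default_rng(1)
bonds = rng.normal(size=(1000, 3))     # placeholder for simulation output
print(order_parameter(bonds))          # ~0 if isotropic, approaches 1 if ordered
```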

Keywords: membrane, substrate, molecular dynamics, simulation

Procedia PDF Downloads 195
24548 Data Protection, Data Privacy, and Research Ethics in the Policy Process towards Effective Urban Planning Practice for Smart Cities

Authors: Eugenio Ferrer Santiago

Abstract:

The growing complexities of the modern world, with high-end gadgets, software applications, scams, identity theft, and Artificial Intelligence (AI), leave the 'uninformed' weak and vulnerable to cybercrimes. Artificial Intelligence is not new in our daily lives; the principles of database management, logical programming, and garbage-in-garbage-out are all connected to AI. The Philippines has legal safeguards in place against the abuse of cyberspace, but self-regulation by key industry players and self-protection by individuals are primordial to the success of these initiatives. Data protection, data privacy, and research ethics must work hand in hand during the policy process in the course of urban planning practice in different environments. This paper focuses on the interconnection of data protection, data privacy, and research ethics in arriving at clear-cut policies against perpetrators in urban planning professional practice, relevant to sustainable communities and smart cities. The paper uses an expository methodology under qualitative research, drawing on secondary data from related literature, interviews/blogs, and World Wide Web resources. Its claims and recommendations will help policymakers and implementers in the policy cycle, and it contributes to the body of knowledge as a simple treatise and communication channel for the reading community and for future researchers to validate the claims and start an intellectual discourse for better knowledge generation for the good of all in the near future.

Keywords: data privacy, data protection, urban planning, research ethics

Procedia PDF Downloads 60
24547 Review of Road Crash Data Availability in Iraq

Authors: Abeer K. Jameel, Harry Evdorides

Abstract:

Iraq is a middle-income country where road crashes are considered one of the leading causes of death. To control road risk, the Iraqi Ministry of Planning's General Statistical Organization started to organize a system for collecting traffic accident data, with details of accident causes and severity. These data are published as an annual report. This paper presents a review of the crash data available in Iraq. The available data represent accident rates at an aggregated level, classified by accident type, road user details, crash severity, vehicle type, causes, and number of casualties. The review is organized according to the types of models used in road safety studies and research, and according to the road safety data required for road construction tasks. The available data are also compared with the road safety data set published in the United Kingdom, as an example of a developed country. It is concluded that the data in Iraq are suitable for descriptive and exploratory models, for aggregated-level comparison analysis, and for evaluating and monitoring the progress of overall traffic safety performance. However, important traffic safety studies require disaggregated data and details of the factors influencing the likelihood of traffic crashes. Some studies require spatial details such as accident locations, which are essential for ranking roads by their level of safety and for naming the most dangerous roads in Iraq, an issue requiring a tactical control plan. Global road safety agencies interested in solving this problem in low- and middle-income countries have designed road safety assessment methodologies based on road attribute data only; this research therefore recommends using one of these methodologies.

Keywords: road safety, Iraq, crash data, road risk assessment, the International Road Assessment Programme (iRAP)

Procedia PDF Downloads 256
24546 Significant Growth in Expected Muslim Inbound Tourists in Japan towards the 2020 Tokyo Olympics and the Still Incipient Stage of Current Halal Implementation in Hiroshima

Authors: Kyoko Monden

Abstract:

Tourism has moved to the forefront of national attention in Japan since September 2013, when Tokyo won its bid to host the 2020 Summer Olympics. The number of foreign tourists has continued to break records, reaching 13.4 million in 2014, and is now expected to hit 20 million sooner than the initially targeted 2020, owing to government stimulus promotions, an increase in low-cost carriers, the weakening of the Japanese yen, and strong economic growth in Asia. The tourism industry can be an effective trigger for Japan's economic recovery: foreign tourists spent two trillion yen (about $16.6 billion) in Japan in 2014. In addition, 81% of them were from Asian countries, and it is essential to note that 68.9% of the world's Muslims, about a billion people, live in South and Southeast Asia. An important question is: 'Do Muslim tourists feel comfortable traveling in Japan?' This research was initiated by an encounter with Muslim visitors to Hiroshima, a popular international tourist destination, who said they had found very few suitable restaurants there. The purpose of this research is to examine halal implementation in Hiroshima and suggest next steps to improve current efforts, the goal being to provide everyone, Muslims included, with first-class hospitality in preparation for the massive influx of foreign tourists in 2020. The methods of this research were questionnaires, face-to-face interviews, phone interviews, and internet research. First, the research addresses the significance of growing inbound tourism in Japan, especially the expected growth in Muslim tourists, noting the strong popularity of Japanese food in Asian Muslim countries and that eating Japanese food is ranked the number one thing foreign tourists want to do in Japan. Secondly, the currently incipient stage of Hiroshima's halal implementation at hotels, restaurants, and major public places is exposed, and the existing action plans of the Hiroshima Prefecture Government are presented. Furthermore, two surveys were conducted to clarify the basic halal awareness of local residents and to gauge the inconveniences faced by Muslims living in Hiroshima. Thirdly, the reasons for this lapse are examined and, by comparison with benchmarking data from other major tourist sites, halal implementation plans for Hiroshima are proposed. The conclusion is that, despite increasing demand for and interest in halal-friendly businesses, halal measures have barely been applied in Hiroshima: 76% of Hiroshima residents had no idea what halal or halaal meant. It is essential to increase halal awareness and its importance to the economy, and to launch further actions to make Muslim tourists feel welcome in Hiroshima and the entire country.

Keywords: halaal, halal implementation, Hiroshima, inbound tourists in Japan

Procedia PDF Downloads 223
24545 The Image of Victim and Criminal in Love Crimes on Social Media in Egypt: Facebook Discourse Analysis

Authors: Sherehan Hamdalla

Abstract:

Egypt has experienced a series of terrifying love crimes in the last few months. This 'trend' started with a young man caught on video slaughtering his ex-girlfriend in the street in the city of El Mansoura. The crime shocked Egyptian citizens at all levels; unfortunately, no fewer than three similar crimes took place in other Egyptian cities with the same killing trigger. The easy access and broad reach of social media make it one of the most crucial online communication channels: users utilize social media platforms to share and exchange ideas, news, and other activities, and they can freely share posts that reflect their mindset or personal views on any issue. Such posts go viral across social media accounts through reposts and shares, whether to support the content or to attack it. The repetition of certain posts can mobilize supporters holding the same point of view, especially when that crowd's online participation confronts the consequences of a public-opinion case. The death of the young woman was followed by similar crimes in other cities, such as El Sharkia and Port Said. These love crimes provoked a massive wave of contention among all social classes in Egypt. Strangely, some supported the criminal and defended his side for several reasons, which the study uncovers. Facebook, the most popular social media platform among Egyptians, reflects the debate between supporters of the victim and supporters of the criminal. Facebook pages were created specifically to disseminate certain viewpoints online, for example, demanding the maximum penalty for the criminals. These pages aimed to mobilize the maximum number of supporters and to affect the outcome of the trials.

Keywords: love crimes, victim, criminal, social media

Procedia PDF Downloads 76
24544 Eliciting and Confirming Data, Information, Knowledge and Wisdom in a Specialist Health Care Setting - The WICKED Method

Authors: Sinead Impey, Damon Berry, Selma Furtado, Miriam Galvin, Loretto Grogan, Orla Hardiman, Lucy Hederman, Mark Heverin, Vincent Wade, Linda Douris, Declan O'Sullivan, Gaye Stephens

Abstract:

Healthcare is a knowledge-rich environment. This knowledge, while valuable, is not always accessible outside the borders of individual clinics. This research aims to address part of this problem at a study site by constructing a maximal data set (knowledge artefact) for motor neurone disease (MND). This data set is proposed as an initial knowledge base for a concurrent project developing an MND patient data platform, and it represents the domain knowledge at the study site for the duration of the research (12 months). A knowledge elicitation method was also developed from the lessons learned during this process: the WICKED method. WICKED is an anagram of the initial letters of 'eliciting and confirming data, information, knowledge, wisdom', and also a reference to the concept of wicked problems, which are complex and challenging, as eliciting expert knowledge is. The method was evaluated at a second site, and benefits and limitations were noted. Benefits include that the method provided a systematic way to manage data, information, knowledge, and wisdom (DIKW) from various sources, including healthcare specialists and existing data sets. Limitations concern the time required and the fact that the data set produced only represents the DIKW known during the research period. Future work is underway to address these limitations.

Keywords: healthcare, knowledge acquisition, maximal data sets, action design science

Procedia PDF Downloads 361
24543 Tool for Metadata Extraction and Content Packaging as Endorsed in the OAIS Framework

Authors: Payal Abichandani, Rishi Prakash, Paras Nath Barwal, B. K. Murthy

Abstract:

Information generated from various computerization processes is a potentially rich source of knowledge for its designated community. Passing this information from generation to generation without modifying its meaning is a challenging activity. To preserve and archive data for future generations, it is essential to prove the authenticity of the data. This can be achieved by extracting metadata from the data, which can prove authenticity and create trust in the archived data. A subsequent challenge is technology obsolescence; metadata extraction and standardization can be used effectively to tackle this problem. Metadata can broadly be categorized at two levels: technical and domain. Technical metadata provide the information needed to understand and interpret the data record, but this level of metadata alone is not sufficient to create trustworthiness. We have developed a tool that extracts and standardizes both technical and domain-level metadata. This paper describes the different features of the tool and how we developed it.
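
As an illustration of technical-level metadata extraction, the following sketch computes a few standard fields (name, size, MIME type, and a SHA-256 fixity digest) and serializes them as XML; the element names are illustrative and do not reproduce the paper's schema.

```python
import hashlib
import mimetypes
import os
import xml.etree.ElementTree as ET

# Technical-metadata sketch (illustrative fields only; the paper's tool
# and its OAIS/PDI schema are not reproduced here).

def technical_metadata(path: str) -> ET.Element:
    root = ET.Element("technicalMetadata")
    ET.SubElement(root, "fileName").text = os.path.basename(path)
    ET.SubElement(root, "sizeBytes").text = str(os.path.getsize(path))
    ET.SubElement(root, "mimeType").text = mimetypes.guess_type(path)[0] or "unknown"
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    ET.SubElement(root, "sha256").text = digest   # fixity check for authenticity
    return root

elem = technical_metadata(__file__)
print(ET.tostring(elem, encoding="unicode"))
```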

Keywords: digital preservation, metadata, OAIS, PDI, XML

Procedia PDF Downloads 393
24542 The Trigger-DAQ System in the Mu2e Experiment

Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella

Abstract:

The Mu2e experiment at Fermilab aims to measure the charged-lepton flavour-violating neutrinoless conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve on the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides the hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e's trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. Otsdaq is an online DAQ software suite with a focus on flexibility and scalability, providing a multi-user, web-based interface accessible through the Chrome or Firefox web browsers. The detector readout controllers (ROCs) of the tracker and calorimeter continuously stream zero-suppressed data to the data transfer controller (DTC). Data are then read over the PCIe bus by a software filter algorithm that selects events, which are finally combined with the data flux from the cosmic ray veto system (CRV).

Keywords: trigger, DAQ, Mu2e, Fermilab

Procedia PDF Downloads 155
24541 Study of Aqueous Solutions: A Dielectric Spectroscopy Approach

Authors: Kumbharkhane Ashok

Abstract:

Time domain dielectric relaxation spectroscopy (TDRS) probes the interaction of a macroscopic sample with a time-dependent electric field. The resulting complex permittivity spectrum characterizes the amplitude (voltage) and time scale of the charge-density fluctuations within the sample. These fluctuations may arise from the reorientation of the permanent dipole moments of individual molecules or from the rotation of dipolar moieties in flexible molecules, like polymers. The time scale of these fluctuations depends on the sample and its relaxation mechanism. Relaxation times range from a few picoseconds in low-viscosity liquids to hours in glasses; the DRS technique therefore covers an extensive range of dynamical processes, corresponding to frequencies from 10⁻⁴ Hz to 10¹² Hz. This inherent ability to monitor the cooperative motion of a molecular ensemble distinguishes dielectric relaxation from methods like NMR or Raman spectroscopy, which yield information on the motions of individual molecules. An experimental setup for the time domain reflectometry (TDR) technique, covering 10 MHz to 30 GHz, has been developed for aqueous solutions. The technique is simple and covers a wide band of frequencies in a single measurement. Dielectric relaxation spectroscopy is especially sensitive to intermolecular interactions. The complex permittivity spectra of aqueous solutions have been fitted using the Cole-Davidson (CD) model to determine static dielectric constants and relaxation times across the entire concentration range. The heterogeneous molecular interactions in aqueous solutions are discussed through the Kirkwood correlation factor and excess properties.
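
For reference, the Cole-Davidson model used for the fits has the standard form:

```latex
% Cole-Davidson model for the complex permittivity spectrum:
\[
  \varepsilon^{*}(\omega) = \varepsilon_{\infty}
    + \frac{\varepsilon_{s} - \varepsilon_{\infty}}{\left(1 + i\omega\tau\right)^{\beta}},
  \qquad 0 < \beta \le 1,
\]
% where \varepsilon_s is the static dielectric constant, \varepsilon_\infty
% the high-frequency limit, \tau the relaxation time, and \beta the
% asymmetric-broadening exponent (\beta = 1 recovers the Debye model).
```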

Keywords: liquid, aqueous solutions, time domain reflectometry

Procedia PDF Downloads 444
24540 A Review on the Hydrologic and Hydraulic Performances in Low Impact Development-Best Management Practices Treatment Train

Authors: Fatin Khalida Abdul Khadir, Husna Takaijudin

Abstract:

The bioretention system is one alternative to conventional stormwater management within the low impact development (LID) strategy for best management practices (BMPs). Incorporating both filtration and infiltration, initial research on bioretention systems has shown that this practice substantially decreases runoff volumes and peak flows. The LID-BMP treatment train is one of the latest LID-BMPs for stormwater treatment in urbanized watersheds. The treatment train is developed to overcome the drawbacks that arise from conventional LID-BMPs and aims to enhance the performance of existing practices. In addition, it is used to improve both water quality and water quantity control, as well as to maintain the natural hydrology of an area despite massive ongoing development. The objective of this paper is to review the effectiveness of conventional LID-BMPs in terms of hydrologic and hydraulic performance through column studies in different configurations. Previous studies on applications of the LID-BMP treatment train, developed to overcome the drawbacks of conventional LID-BMPs, are reviewed and used as guidelines for implementing the system at Universiti Teknologi Petronas (UTP) and elsewhere. Analyses of hydrologic and hydraulic performance using artificial neural network (ANN) models are also reviewed for use in this study. Here, the role of the LID-BMP treatment train is tested by arranging bioretention cells in series, to control floods occurring now and expected in the future once construction of the new buildings at UTP is completed. A summary of the research findings on system performance, including proposed design modifications, is provided.

Keywords: bioretention system, LID-BMP treatment train, hydrological and hydraulic performance, ANN analysis

Procedia PDF Downloads 118
24539 2D Numerical Modeling of Ultrasonic Measurements in Concrete: Wave Propagation in a Multiple-Scattering Medium

Authors: T. Yu, L. Audibert, J. F. Chaix, D. Komatitsch, V. Garnier, J. M. Henault

Abstract:

Linear ultrasonic techniques play a major role in non-destructive evaluation (NDE) of civil engineering structures in concrete since they can meet operational requirements. The interpretation of ultrasonic measurements could be improved by a better understanding of ultrasonic wave propagation in a multiple-scattering medium. This work aims to develop a 2D numerical model of ultrasonic wave propagation in a heterogeneous medium like concrete, integrating multiple-scattering phenomena, in the SPECFEM software. The coherent field of multiple scattering is obtained by averaging numerical wave fields and is used to determine the effective phase velocity and attenuation of an equivalent homogeneous medium. First, the model is applied to one scattering element (a cylinder) in a homogeneous linear-elastic medium, and it is validated by comparison with the analytical solution. Then, cases of multiple scattering by sets of randomly located cylinders or polygons are simulated to perform parametric studies of the influence of frequency and of scatterer size, concentration, and shape. The effective properties are also compared with the predictions of the Waterman-Truell model to verify its validity. Finally, the viscoelastic behavior of mortar is introduced in the simulation in order to consider the dispersion and attenuation due to the porosity of the cement paste. In the future, further steps will be developed: comparison with experimental results, interpretation of NDE measurements, and optimization of NDE parameters prior to auscultation.
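
For reference, the classical Waterman-Truell relation used for such comparisons expresses the effective wavenumber of the equivalent homogeneous medium in terms of the forward- and back-scattering amplitudes; it is quoted here in its standard form, and the exact 2D variant used in the paper may differ:

```latex
% Classical Waterman-Truell effective-wavenumber relation:
\[
  \left(\frac{K}{k}\right)^{2}
    = \left(1 + \frac{2\pi n_{0}\, f(0)}{k^{2}}\right)^{2}
    - \left(\frac{2\pi n_{0}\, f(\pi)}{k^{2}}\right)^{2},
\]
% where k is the wavenumber of the matrix, n_0 the number density of
% scatterers, and f(0), f(\pi) the forward- and back-scattering amplitudes.
% The effective phase velocity and attenuation follow from Re(K) and Im(K).
```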

Keywords: attenuation, multiple-scattering medium, numerical modeling, phase velocity, ultrasonic measurements

Procedia PDF Downloads 275
24538 Addressing Supply Chain Data Risk with Data Security Assurance

Authors: Anna Fowler

Abstract:

When considering assets that may need protection, the mind turns to homes, cars, and investment funds. In most cases, the protection of those assets can be covered through security systems and insurance. Data is not the first thought that comes to mind as needing protection, even though data is at the core of most supply chain operations. It includes trade secrets, the management of personally identifiable information (PII), and consumer data that can be used to enhance the overall experience. Data is a critical element of supply chain success and should be one of the most critical areas to protect. In the supply chain industry, there are two major misconceptions about protecting data: (i) 'We do not manage or store confidential/personally identifiable information (PII).' (ii) Reliance on third-party vendor security. These misconceptions can significantly derail organizational efforts to adequately protect data across environments. The first misconception is dangerous because it implies the organization lacks proper data literacy: enterprise employees zero in on PII while neglecting trade secret theft and the complete breakdown of information sharing. The second forges the ideology that reliance on third-party vendor security absolves the company of security risk; in fact, third-party risk has grown over the last two years and is one of the major causes of data security breaches. It is important to understand that a holistic approach to protecting data should be considered, one that does not reduce to purchasing a Data Loss Prevention (DLP) tool: a tool is not a solution. To protect supply chain data, start by providing data literacy training to all employees and by negotiating the security component of vendor contracts to include data literacy training for the individuals and teams that may access company data. It is also important to understand the origin of the data and its movement, including risk identification; to ensure processes effectively incorporate data security principles; and to evaluate and select DLP solutions that address specific concerns and use cases in conjunction with data visibility. These approaches are part of a broader solutions framework called Data Security Assurance (DSA). The DSA framework looks at all processes across the supply chain, including their corresponding architecture and workflows, employee data literacy, governance and controls, integration between third- and fourth-party vendors, DLP as a solution concept, and policies related to data residency. Within cloud environments, this framework is crucial for the supply chain industry to avoid regulatory implications and third/fourth-party risk.

Keywords: security by design, data security architecture, cybersecurity framework, data security assurance

Procedia PDF Downloads 89
24537 Data Security: An Enhancement of the E-mail Security Algorithm to Secure Data across State-Owned Agencies

Authors: Lindelwa Mngomezulu, Tonderai Muchenje

Abstract:

Over the decades, e-mail has provided easy, fast, and timely communication, enabling businesses and state-owned agencies to communicate with their stakeholders and their own employees in real time. Moreover, since the launch of Microsoft Office 365 and many other cloud-based e-mail services, many businesses have been migrating from on-premises e-mail services to the cloud, and, particularly since the beginning of the Covid-19 pandemic, there has been a significant increase in e-mail utilization, which in turn leads to an increase in cyber-attacks. E-mail security has therefore become very important in e-mail transport, to ensure that a message reaches its recipient without its data integrity being compromised. Features must be classified to enhance e-mail security against ever more sophisticated cyber-attacks, since attacks advance as the technology does. Therefore, in order to maximize data integrity, we also need to maximize e-mail security, for example through enhanced e-mail authentication. Successfully enhancing e-mail security may lessen the frequency of information theft via e-mail, so that the data of South African state-owned agencies is not compromised.
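
As one concrete building block of enhanced e-mail authentication, the sketch below parses a DMARC policy record of the standard tag=value form; the record itself is a made-up example, and the paper's proposed algorithm is not reproduced here.

```python
# Illustrative sketch: parsing a DMARC policy record, one standard piece of
# e-mail authentication alongside SPF and DKIM. The record is a made-up example.

def parse_dmarc(txt_record: str) -> dict:
    # DMARC records are semicolon-separated tag=value pairs.
    tags = {}
    for part in txt_record.split(";"):
        part = part.strip()
        if "=" in part:
            key, value = part.split("=", 1)
            tags[key.strip()] = value.strip()
    return tags

record = "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@agency.example; pct=100"
policy = parse_dmarc(record)
print(policy["p"])   # 'quarantine': suspicious mail goes to spam, not the inbox
```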

Keywords: e-mail security, cyber-attacks, data integrity, authentication

Procedia PDF Downloads 136
24536 Semi-Supervised Outlier Detection Using a Generative and Adversary Framework

Authors: Jindong Gu, Matthias Schubert, Volker Tresp

Abstract:

In many outlier detection tasks, only training data belonging to one class, i.e., the positive class, is available. The task is then to predict a new data point as belonging either to the positive class or to the negative class, in which case the data point is considered an outlier. For this task, we propose a novel corrupted Generative Adversarial Network (CorGAN). In the adversarial process of training CorGAN, the Generator generates outlier samples for the negative class, and the Discriminator is trained to distinguish the positive training data from the generated negative data. The proposed framework is evaluated using an image dataset and a real-world network intrusion dataset. Our outlier-detection method achieves state-of-the-art performance on both tasks.
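
The adversarial training loop described above can be sketched as follows; layer sizes, optimizer settings, and the synthetic one-class data are illustrative assumptions, and the sketch shows the generic GAN-based one-class idea rather than the exact CorGAN architecture.

```python
import torch
import torch.nn as nn

# One-class GAN sketch: a generator proposes negative-class samples, a
# discriminator separates real positives from them and later scores new
# points as outliers. All sizes and data here are illustrative.

dim, noise_dim = 8, 4
G = nn.Sequential(nn.Linear(noise_dim, 32), nn.ReLU(), nn.Linear(32, dim))
D = nn.Sequential(nn.Linear(dim, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

positives = torch.randn(256, dim) + 2.0   # stand-in for one-class training data

for step in range(200):
    fake = G(torch.randn(64, noise_dim))
    real = positives[torch.randint(0, 256, (64,))]
    # Discriminator: positives -> 1, generated negatives -> 0
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator: try to produce negatives the discriminator accepts
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# A low discriminator score marks a point as an outlier (negative class).
print(D(torch.zeros(1, dim)).item())      # far from positives: low score expected
```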

Keywords: one-class classification, outlier detection, generative adversarial networks, semi-supervised learning

Procedia PDF Downloads 151
24535 Testing the Change in Correlation Structure across Markets: High-Dimensional Data

Authors: Malay Bhattacharyya, Saparya Suresh

Abstract:

The correlation structure associated with a portfolio varies across time. Studying structural breaks in the time-dependent correlation matrix associated with a collection of assets has been a subject of interest for a better understanding of market movements, portfolio selection, etc. The current paper proposes a methodology for testing a change in the time-dependent correlation structure of a portfolio with high-dimensional data, using the techniques of the generalized inverse, singular value decomposition, and multivariate distribution theory, which has not been addressed so far. The asymptotic properties of the proposed test are derived, and the performance and validity of the method are tested on a real data set. The proposed test performs well in detecting changes in the dependence of global markets in the context of high-dimensional data.
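
The ingredients named above (window-wise correlation matrices, the singular value decomposition, and the Moore-Penrose generalized inverse) can be illustrated with a simple numpy sketch; the naive Frobenius-norm statistic below is only a stand-in for the paper's formal test.

```python
import numpy as np

# Illustrative sketch only: simulated returns in two windows, correlation
# matrices per window, SVD of the change, and a generalized inverse.

rng = np.random.default_rng(42)
p, n = 50, 120                              # many assets, modest window length
window1 = rng.normal(size=(n, p))
window2 = rng.normal(size=(n, p)) @ np.diag(1 + 0.5 * rng.random(p))

R1 = np.corrcoef(window1, rowvar=False)
R2 = np.corrcoef(window2, rowvar=False)

diff = R1 - R2
sing_vals = np.linalg.svd(diff, compute_uv=False)   # SVD of the change
R1_pinv = np.linalg.pinv(R1)    # Moore-Penrose inverse, needed in general
                                # since R can be singular when p approaches n
stat = np.linalg.norm(diff, "fro")                  # naive change statistic
print(stat, sing_vals[:3])
```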

Keywords: correlation structure, high dimensional data, multivariate distribution theory, singular value decomposition

Procedia PDF Downloads 125
24534 Development and Evaluation of a Portable Ammonia Gas Detector

Authors: Jaheon Gu, Wooyong Chung, Mijung Koo, Seonbok Lee, Gyoutae Park, Sangguk Ahn, Hiesik Kim, Jungil Park

Abstract:

In this paper, we present a portable ammonia gas detector for performing gas safety management efficiently. The display of the detector is separate from its body; the display module receives the measured data from the detector via ZigBee. The detector has a rechargeable Li-ion battery, which lasts 11-12 hours, and a Bluetooth module for sending the data to a PC or smart devices. The data are also sent to a server and can be accessed using a web browser or a mobile application. The detection range is 0-100 ppm.

Keywords: ammonia, detector, gas, portable

Procedia PDF Downloads 418
24533 Development of a Shape-Based Estimation Technology Using Terrestrial Laser Scanning

Authors: Gichun Cha, Byoungjoon Yu, Jihwan Park, Minsoo Park, Junghyun Im, Sehwan Park, Sujung Sin, Seunghee Park

Abstract:

The goal of this research is to estimate structural shape change using terrestrial laser scanning. The study develops a data reduction and shape-change estimation algorithm for large-capacity scan data. The point cloud of the scan data is converted to voxels and sampled. A shape estimation technique is studied to detect changes in structures such as skyscrapers, bridges, and tunnels based on large point cloud data. The point cloud analysis applies the octree data structure to speed up post-processing for change detection. The point cloud data provide representative values of the shape information and are used as the model for detecting changes in the data structure. The aim of the shape estimation model is a technology that can detect not only gradual but also immediate structural changes in the event of disasters such as earthquakes, typhoons, and fires, thereby preventing major accidents caused by aging and disasters. The study is expected to improve the efficiency of structural health monitoring and maintenance.
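
As an illustration of the data reduction step, a minimal voxel downsampling sketch is shown below; the voxel size and random cloud are illustrative, and the paper's octree-based pipeline is more elaborate.

```python
import numpy as np

# Voxel downsampling sketch: points are quantized to a grid and one
# representative (the centroid) is kept per occupied voxel.

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.bincount(inverse, minlength=n_voxels)
    np.add.at(sums, inverse, points)        # accumulate points per voxel
    return sums / counts[:, None]           # centroid of each occupied voxel

cloud = np.random.default_rng(0).random((100_000, 3)) * 10.0
reduced = voxel_downsample(cloud, voxel_size=0.5)
print(cloud.shape, "->", reduced.shape)     # large cloud -> compact model
```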

Keywords: terrestrial laser scanning, point cloud, shape information model, displacement measurement

Procedia PDF Downloads 235
24532 A Non-Invasive Blood Glucose Monitoring System Using Near-Infrared Spectroscopy with Remote Data Logging

Authors: Bodhayan Nandi, Shubhajit Roy Chowdhury

Abstract:

This paper presents the development of a portable blood glucose monitoring device based on near-infrared spectroscopy. The system supports internet connectivity through WiFi and uploads the time series of patients' glucose concentrations to a server. In addition, the server is given sufficient intelligence to predict the future pathophysiological state of a patient from the current and past pathophysiological data. This makes it possible to prognosticate an approaching critical condition well before it actually occurs. The server hosts web applications that allow authorized users to monitor the data remotely.

Keywords: non-invasive, blood glucose concentration, microcontroller, IoT, application server, database server

Procedia PDF Downloads 220
24531 Proposal to Increase the Efficiency, Reliability and Safety of the Centre of Data Collection Management and Their Evaluation Using Cluster Solutions

Authors: Martin Juhas, Bohuslava Juhasova, Igor Halenar, Andrej Elias

Abstract:

This article deals with the possibility of increasing the efficiency, reliability, and safety of the system for teledosimetric data collection management and evaluation, as part of the complex study for the activity 'Research of data collection, their measurement and evaluation with mobile and autonomous units' within the project 'Research of monitoring and evaluation of non-standard conditions in the area of nuclear power plants'. Possible weaknesses in the existing system are identified, and a study of available cluster solutions, with the possibility of deploying them in the analysed system, is presented.

Keywords: teledosimetric data, efficiency, reliability, safety, cluster solution

Procedia PDF Downloads 515
24530 Ultra-Reliable Low-Latency V2X Communication for Expressways Using a Multiuser Scheduling Algorithm

Authors: Vaishali D. Khairnar

Abstract:

The main aim is to provide low-latency and highly reliable communication for vehicles in the automobile industry; vehicle-to-everything (V2X) communication basically intends to increase expressway road safety and effectiveness. The Ultra-Reliable Low-Latency Communication (URLLC) algorithm and cellular networks are applied in combination with Mobile Broadband (MBB), particularly in expressway safety-based driving applications. Expressway vehicle drivers will communicate in V2X systems using sixth-generation (6G) communication systems, which have very-high-speed mobility features. As a result, we need to determine how to ensure reliable and consistent wireless communication links and improve their quality to increase channel gain, which is becoming a challenge that needs to be addressed. To overcome this challenge, we propose a unique multi-user scheduling algorithm for ultra-massive multiple-input multiple-output (MIMO) systems using 6G. In wideband wireless network access, offering quality of service (QoS) to distinct service groups with synchronized contemporaneous traffic, under both high- and medium-traffic conditions on a highway like the Mumbai-Pune expressway, becomes a critical problem. Opportunistic MAC (OMAC) is an approach to communication across a wireless link that can change in space and time, and it might overcome the above-mentioned challenge. Therefore, a multi-user scheduling algorithm is proposed for MIMO systems using a cross-layered MAC protocol to achieve URLLC and high reliability in V2X communication.
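
For context, a common multi-user scheduling baseline of the opportunistic kind is the proportional-fair rule sketched below; it is a generic illustration, not the paper's proposed algorithm.

```python
import numpy as np

# Proportional-fair opportunistic scheduler sketch (a generic baseline):
# each slot, serve the user whose instantaneous rate is largest relative
# to its long-run average, trading throughput against fairness.

rng = np.random.default_rng(7)
n_users, n_slots, beta = 8, 1000, 0.01
avg_rate = np.full(n_users, 1e-6)          # avoid division by zero

served = np.zeros(n_users, dtype=int)
for _ in range(n_slots):
    channel_gain = rng.exponential(size=n_users)      # fading realization
    inst_rate = np.log2(1.0 + 10.0 * channel_gain)    # Shannon rate at SNR 10
    user = np.argmax(inst_rate / avg_rate)            # proportional-fair metric
    served[user] += 1
    # exponential moving average of each user's achieved rate
    achieved = np.where(np.arange(n_users) == user, inst_rate, 0.0)
    avg_rate = (1 - beta) * avg_rate + beta * achieved

print(served)   # slots per user: opportunistic yet roughly fair
```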

Keywords: ultra-reliable low latency communications, vehicle-to-everything communication, multiple-input multiple-output systems, multi-user scheduling algorithm

Procedia PDF Downloads 88
24529 Efficient Storage in Cloud Computing by Using Index Replica

Authors: Bharat Singh Deora, Sushma Satpute

Abstract:

Cloud computing is based on resource sharing. Like other shareable resources, storage can be shared: we can combine storage resources from different locations and maintain a central index table of the storage details. Combining storage from different places can form a suitable data store that is operated from one location and is very economical. Proper storage of data should improve data reliability and availability as well as bandwidth utilization. Contents can also be moved from one storage location to another according to need.
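
The central index table can be pictured with a minimal sketch like the following; the object names, fields, and data-centre labels are illustrative assumptions, not details from the paper.

```python
# Central index table sketch for distributed cloud storage: the index maps
# each object to the locations holding its replicas, so reads can pick a
# nearby copy and contents can be migrated between storages.

index = {
    "report.pdf": {"size_mb": 4, "locations": ["dc-mumbai", "dc-frankfurt"]},
    "backup.tar": {"size_mb": 900, "locations": ["dc-mumbai"]},
}

def locate(name: str, preferred: str) -> str:
    locs = index[name]["locations"]
    # serve from the preferred data centre when a replica exists there
    return preferred if preferred in locs else locs[0]

def migrate(name: str, src: str, dst: str) -> None:
    locs = index[name]["locations"]
    locs.remove(src)
    locs.append(dst)        # content moved between storages as needed

print(locate("report.pdf", "dc-frankfurt"))   # dc-frankfurt
migrate("backup.tar", "dc-mumbai", "dc-frankfurt")
print(index["backup.tar"]["locations"])       # ['dc-frankfurt']
```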

Keywords: cloud computing, cloud storage, IaaS, PaaS, SaaS

Procedia PDF Downloads 340
24528 Atomic Decomposition Audio Data Compression and Denoising Using Sparse Dictionary Feature Learning

Authors: T. Bryan, V. Kepuska, I. Kostnaic

Abstract:

A method of data compression and denoising is introduced that is based on atomic decomposition of audio data using 'basis vectors' that are learned from the audio data itself. The basis vectors are shown to give higher data compression and better signal-to-noise enhancement than the Gabor and gammatone 'seed atoms' that were used to generate them. The basis vectors are the input weights of a Sparse AutoEncoder (SAE) that is trained using 'envelope samples' of windowed segments of the audio data. The envelope samples are extracted from the audio data by performing atomic decomposition with Gabor or gammatone seed atoms via matching pursuit, which identifies segments of audio data that are locally coherent with the seed atoms. The envelope samples are formed by taking the Kronecker products of the atomic envelopes with the locally coherent data segments. Oracle signal-to-noise ratio (SNR) versus data compression curves are generated for the seed atoms as well as for the basis vectors learned from Gabor and gammatone seed atoms. SNR data compression curves are generated for speech signals as well as for early American music recordings. The basis vectors are shown to have higher denoising capability for data compression rates ranging from 90% to 99.84% for speech as well as music. Envelope samples are displayed as images by folding the time series into column vectors; this display method is used to compare the output of the SAE with the envelope samples that produced it. The basis vectors are also displayed as images. Sparsity is shown to play an important role in producing the best denoising basis vectors.
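
The matching pursuit step at the core of this pipeline can be sketched as follows; the random unit-norm dictionary stands in for the Gabor or gammatone atoms actually used.

```python
import numpy as np

# Matching pursuit sketch (generic algorithm, not the paper's full
# Gabor/gammatone pipeline): greedily decompose a signal over a dictionary
# of unit-norm atoms, keeping the best-correlated atom at each step.

def matching_pursuit(signal, dictionary, n_atoms=5):
    residual = signal.copy()
    decomposition = []
    for _ in range(n_atoms):
        correlations = dictionary @ residual        # inner products with atoms
        k = np.argmax(np.abs(correlations))
        coeff = correlations[k]
        residual = residual - coeff * dictionary[k] # subtract the matched atom
        decomposition.append((k, coeff))
    return decomposition, residual

rng = np.random.default_rng(3)
atoms = rng.normal(size=(64, 256))
atoms /= np.linalg.norm(atoms, axis=1, keepdims=True)   # unit-norm dictionary
x = 2.0 * atoms[10] - 1.5 * atoms[42]                   # sparse ground truth
decomp, res = matching_pursuit(x, atoms, n_atoms=2)
print(decomp, np.linalg.norm(res))    # recovers atoms 10 and 42, small residual
```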

Keywords: sparse dictionary learning, autoencoder, sparse autoencoder, basis vectors, atomic decomposition, envelope sampling, envelope samples, Gabor, gammatone, matching pursuit

Procedia PDF Downloads 253
24527 Lead Removal from Ex-Mining Pond Water by Electrocoagulation: Kinetics, Isotherm, and Dynamic Studies

Authors: Kalu Uka Orji, Nasiman Sapari, Khamaruzaman W. Yusof

Abstract:

Exposure of galena (PbS), tealite (PbSnS2), and other associated minerals during mining activities releases lead (Pb) and other heavy metals into the mining water through oxidation and dissolution. Heavy metal pollution has become an environmental challenge; lead, for instance, can cause toxic effects on human health, including brain damage. Ex-mining pond water has been reported to contain lead as high as 69.46 mg/L, and lead is not easily removed from water by conventional treatment. A promising and emerging treatment technology for lead removal is the electrocoagulation (EC) process. However, problems associated with EC include systematic reactor design, selection of optimal EC operating parameters, and scale-up, among others. This study investigated an EC process for the removal of lead from synthetic ex-mining pond water using a batch reactor and Fe electrodes, examining the effects of various operating parameters on lead removal efficiency. The results indicated that the maximum removal efficiency of 98.6% was achieved at an initial pH of 9, a current density of 15 mA/cm², an electrode spacing of 0.3 cm, a treatment time of 60 minutes, liquid motion by magnetic stirring (LM-MS), and the electrode arrangement BP-S. These experimental data were further modeled and optimized using a 2-level, 4-factor full factorial design, a response surface methodology (RSM). The four factors optimized were current density, electrode spacing, electrode arrangement, and liquid motion driving mode (LM). Based on the regression model and the analysis of variance (ANOVA) at 0.01%, the results showed that increases in current density and LM-MS increased the removal efficiency, while the reverse was the case for electrode spacing. The model predicted an optimal lead removal efficiency of 99.962% with an electrode spacing of 0.38 cm, alongside the other settings; applying the predicted parameters, a lead removal efficiency of 100% was achieved. The electrode and energy consumptions were 0.192 kg/m³ and 2.56 kWh/m³, respectively. Meanwhile, the adsorption kinetic studies indicated that the overall lead adsorption system follows the pseudo-second-order kinetic model. The adsorption dynamics were random, spontaneous, and endothermic, so higher process temperatures enhance adsorption capacity. Furthermore, the adsorption isotherm fitted the Freundlich model better than the Langmuir model, describing adsorption on a heterogeneous surface and showing good adsorption efficiency by the Fe electrodes; adsorption of Pb²⁺ onto the Fe electrodes was a complex reaction involving more than one mechanism. The overall results proved that EC is an efficient technique for lead removal from synthetic mining pond water. The findings of this study will be applicable to the scale-up of EC reactors and to the design of water treatment plants for feed-water sources that contain lead, using the electrocoagulation method.
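
For reference, the two models named above have the following standard forms (well-known relations, not results specific to this paper):

```latex
% Pseudo-second-order kinetic model (linearized form):
\[
  \frac{t}{q_t} = \frac{1}{k_2\, q_e^{2}} + \frac{t}{q_e},
\]
% Freundlich isotherm for adsorption on a heterogeneous surface:
\[
  q_e = K_F\, C_e^{1/n},
\]
% where q_t and q_e (mg/g) are the amounts adsorbed at time t and at
% equilibrium, k_2 is the pseudo-second-order rate constant, C_e the
% equilibrium Pb concentration, and K_F, n the Freundlich constants
% (n > 1 indicating favourable adsorption). A linear plot of t/q_t
% versus t indicates pseudo-second-order behaviour.
```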

Keywords: ex-mining water, electrocoagulation, lead, adsorption kinetics

Procedia PDF Downloads 149