Search results for: customer information process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 23834

18074 Digital Forensic Exploration Framework for Email and Instant Messaging Applications

Authors: T. Manesh, Abdalla A. Alameen, M. Mohemmed Sha, A. Mohamed Mustaq Ahmed

Abstract:

Email and instant messaging applications are among the most widely used electronic communication methods in this era of information explosion. Users exchange information through numerous front-end applications from various service providers, and almost all such communication is now secured using SSL or TLS over HTTP. At the same time, cyber criminals and terrorists have begun exchanging information through these channels. Because the communication is encrypted end to end, tracing significant forensic details and recovering the actual content of messages remain severe, largely unaddressed challenges for available forensic tools, which seriously hampers the procurement of substantial evidence against such criminals from their working environments. This paper presents a forensic exploration and architectural framework that not only decrypts a communication or network session but also reconstructs the actual message contents of email and instant messaging applications. The framework can be deployed effectively on proxy servers and individual computers, and it performs forensic reconstruction followed by analysis of webmail and ICQ messaging applications. The framework is versatile, being equipped with high-speed packet-capturing hardware and a well-designed packet-manipulation algorithm. It regenerates message contents over regular as well as SSL-encrypted SMTP, POP3 and IMAP protocols, and it streamlines the forensic presentation procedure for prosecuting cyber criminals by producing solid evidence of their actual communication in accordance with the court of law of the specific country.
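The reconstruction step the abstract describes can be illustrated with a minimal sketch: given a decrypted SMTP session transcript (as the framework would obtain after stripping TLS), extract the DATA payload and parse it into a structured message using the standard library. The transcript below is a hand-made example, not real captured traffic, and the framework's actual parsing pipeline is not published.

```python
from email.parser import Parser

# Hypothetical decrypted SMTP dialogue; addresses and content are invented.
session = """EHLO client.example
MAIL FROM:<alice@example.org>
RCPT TO:<bob@example.net>
DATA
From: alice@example.org
To: bob@example.net
Subject: meeting

See you at 10.
.
QUIT"""

def reconstruct_message(transcript):
    """Pull the message out of the DATA phase of an SMTP transcript."""
    lines = transcript.splitlines()
    start = lines.index("DATA") + 1
    end = lines.index(".", start)     # a lone dot terminates the DATA phase
    return Parser().parsestr("\n".join(lines[start:end]))

msg = reconstruct_message(session)
print(msg["Subject"])  # meeting
```

A real implementation would first reassemble the TCP stream from captured packets before this parsing step.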

Keywords: forensics, network sessions, packet reconstruction, packet reordering

Procedia PDF Downloads 327
18073 A Business-to-Business Collaboration System That Promotes Data Utilization While Encrypting Information on the Blockchain

Authors: Hiroaki Nasu, Ryota Miyamoto, Yuta Kodera, Yasuyuki Nogami

Abstract:

To advance initiatives such as Industry 4.0 and Society 5.0, it is important to connect and share data in a way that every member can trust. Blockchain (BC) technology is currently attracting attention as a leading tool and has been applied in the financial field, among others. However, data collaboration using BC has not progressed sufficiently among companies in the manufacturing supply chain, which handle sensitive data such as product quality and manufacturing conditions. There are two main reasons why data utilization is not sufficiently advanced in the industrial supply chain. The first is that manufacturing information is top secret and a source of profit for companies; it is difficult to disclose such data even between companies that transact with each other in the supply chain. Moreover, in blockchain mechanisms such as Bitcoin that use PKI (Public Key Infrastructure), the plaintext must be shared between companies in order to confirm the identity of the company that sent the data. The second reason is that the merits (scenarios) of data collaboration between companies have not been concretely specified for the industrial supply chain. To address these problems, this paper proposes a business-to-business (B2B) collaboration system using homomorphic encryption and BC techniques. With the proposed system, each company in the supply chain can exchange confidential information as encrypted data and still utilize the data for its own business. In addition, this paper considers a scenario focusing on quality data, which has been difficult to share because it is top secret. For this scenario, we show an implementation scheme and demonstrate the benefit of concrete data collaboration by proposing a comparison protocol that can capture changes in quality while hiding the numerical values of the quality data.
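The paper's comparison protocol is not specified in the abstract, but the additive-homomorphic property such a protocol relies on can be sketched with a minimal (insecure, small-prime) Paillier cryptosystem: a supplier can reveal the change in a quality metric without revealing the metric itself. All key sizes and values below are illustrative only.

```python
import random
from math import gcd

def keygen(p=293, q=433):
    """Toy Paillier keys; primes are far too small for real use."""
    n = p * q
    lam = (p - 1) * (q - 1)   # Euler's totient works as lambda here
    g = n + 1                 # standard simplified generator
    mu = pow(lam, -1, n)      # modular inverse of lambda mod n
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    while True:
        r = random.randrange(1, n)
        if gcd(r, n) == 1:
            break
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    x = pow(c, lam, n * n)
    return (((x - 1) // n) * mu) % n

pub, priv = keygen()
q_before = encrypt(pub, 70)   # encrypted quality score of batch 1
q_delta = encrypt(pub, 5)     # encrypted quality improvement
# Homomorphic addition: multiplying ciphertexts adds the plaintexts
q_after = (q_before * q_delta) % (pub[0] ** 2)
print(decrypt(pub, priv, q_after))  # 75
```

Only the key holder learns the final value; intermediaries handling the ciphertexts learn nothing about the quality numbers.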

Keywords: business to business data collaboration, industrial supply chain, blockchain, homomorphic encryption

Procedia PDF Downloads 115
18072 Iron Recovery from Red Mud as Zero-Valent Iron Metal Powder Using Direct Electrochemical Reduction Method

Authors: Franky Michael Hamonangan Siagian, Affan Maulana, Himawan Tri Bayu Murti Petrus, Panut Mulyono, Widi Astuti

Abstract:

In this study, the feasibility of producing zero-valent iron from red mud by direct electrowinning was investigated. The bauxite residue sample came from the Tayan mine, Indonesia, and contains a high proportion of hematite (Fe₂O₃). Before electrolysis, the samples were characterized by various analytical techniques (ICP-AES, SEM, XRD) to determine their chemical composition and mineralogy. Direct electrowinning of red mud suspended in NaOH was carried out at low temperatures ranging from 30 to 110 °C. The current density, red mud:NaOH ratio and temperature were varied to determine the optimum operating conditions of the direct electrowinning process. Cathode deposits and residues in the electrochemical cells were analyzed using XRD, XRF, and SEM to determine the chemical composition and current recovery. The low-temperature electrolysis of red mud can reach 20% recovery at a current density of 920,945 A/m². The moderate performance of the process with red mud was attributed to the troublesome adsorption of red mud particles on the cathode, which makes the reduction far less efficient than that with hematite.

Keywords: alumina, red mud, electrochemical reduction, iron production

Procedia PDF Downloads 67
18071 Visualisation in Health Communication: Taking Weibo Interaction in COVID-19 as the Example

Authors: Zicheng Zhang, Linli Zhang

Abstract:

As China's biggest social media platform, Weibo has taken on essential health communication responsibilities during the pandemic. This research takes 105 posters from 15 health-related official Weibo accounts as the objects of analysis to explore COVID-19 health information communication and visualisation. First, the interaction between audiences and Weibo, including forwarding, comments, and likes, is statistically analysed. Comments about the information design are extracted manually, and sentiment analysis is then carried out to assess audiences' views of each poster's design. Forwarding and comments are quantified as an attention index, used as a reference alongside the number of likes. In addition, this study designed an evaluation scale based on the Health Literacy Resource standards of the Centers for Medicare & Medicaid Services (US), and designers then scored all selected posters one by one. Finally, combining the two parts of the data, the study concludes that: 1. to a certain extent, people think the posters do not deliver substantive and practical information; 2. non-knowledge posters (i.e., cartoon posters), such as the Go, Wuhan poster, gained more forwarding and likes; 3. the COVID-19 posters analysed are still mainly picture-oriented, mostly encouraging people to overcome difficulties; 4. posters for pandemic prevention usually contain more text and fewer illustrations and do not clearly reflect cultural differences. In conclusion, health communication usually involves a great deal of professional knowledge, so visualising that knowledge in a way that is accessible to the general public is challenging. The posters examined still suffer from a lack of effective communication, superficial design, and insufficient content accessibility.

Keywords: Weibo, visualisation, COVID-19 posters, poster design

Procedia PDF Downloads 111
18070 An Automatic Bayesian Classification System for File Format Selection

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach to classifying unstructured format descriptions for the identification of file formats. The main contribution of this work is the use of data mining techniques to support file format selection from just the unstructured text description that comprises the most important format features for a particular organisation. The file format identification method then employs a file format classifier and associated configurations to provide digital preservation experts with an estimate of the required file format. Our goal is to make use of a format specification knowledge base aggregated from different Web sources in order to select a file format for a particular institution. Using the naive Bayes method, the decision support system recommends a file format to the expert for their institution. The proposed methods facilitate the selection of file formats and improve the quality of the digital preservation process. The presented approach is meant to support decision making for the preservation of digital content in libraries and archives using domain expert knowledge and file format specifications. To facilitate decision making, the aggregated information about the file formats is presented as a file format vocabulary comprising the most common terms characteristic of all researched formats. The goal is to suggest a particular file format based on this vocabulary for analysis by an expert. A sample file format calculation and the calculation results, including probabilities, are presented in the evaluation section.
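The naive Bayes step can be sketched as follows: train on short format descriptions, then score a new description against each format class with Laplace-smoothed log-likelihoods. The training snippets and format labels are invented stand-ins for the paper's aggregated format vocabulary, not its actual knowledge base.

```python
import math
from collections import Counter, defaultdict

# Hypothetical (description, format) training pairs.
train = [
    ("lossless raster image widely supported transparency", "PNG"),
    ("raster image lossy compression photographs", "JPEG"),
    ("page layout fixed document print archival", "PDF"),
    ("document text archival standard preservation", "PDF"),
]

def fit(samples):
    class_docs = defaultdict(list)
    for text, label in samples:
        class_docs[label].extend(text.split())
    vocab = {w for words in class_docs.values() for w in words}
    priors = {c: math.log(sum(1 for _, l in samples if l == c) / len(samples))
              for c in class_docs}
    likelihoods = {}
    for c, words in class_docs.items():
        counts, total = Counter(words), len(words)
        # Laplace smoothing so unseen words don't zero out a class
        likelihoods[c] = {w: math.log((counts[w] + 1) / (total + len(vocab)))
                          for w in vocab}
    return priors, likelihoods, vocab

def predict(model, text):
    priors, likelihoods, vocab = model
    scores = {c: priors[c] + sum(likelihoods[c][w]
                                 for w in text.split() if w in vocab)
              for c in priors}
    return max(scores, key=scores.get)

model = fit(train)
print(predict(model, "archival document standard"))  # PDF
```

In the paper's setting the class scores would be reported as probabilities to the expert rather than only a single top recommendation.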

Keywords: data mining, digital libraries, digital preservation, file format

Procedia PDF Downloads 484
18069 Cost Effective Real-Time Image Processing Based Optical Mark Reader

Authors: Amit Kumar, Himanshu Singal, Arnav Bhavsar

Abstract:

In this modern era of automation, most academic and competitive exams use multiple choice questions (MCQs). The responses to these MCQ-based exams are recorded on Optical Mark Reader (OMR) sheets. Evaluating OMR sheets requires separate, specialized machines for scanning and marking, and the sheets used by these machines are special and cost more than normal sheets. The available process is uneconomical and depends on paper thickness, scanning quality, paper orientation, special hardware, and customized software. This study tackles the problem of evaluating OMR sheets without any special hardware, making the whole process economical. We propose an image processing based algorithm that can read and evaluate scanned OMR sheets with no special hardware required, eliminating the need for special OMR sheets: responses recorded on a normal sheet are sufficient for evaluation. The proposed system handles colour, brightness, rotation, and small imperfections in the OMR sheet images.
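The core evaluation step can be illustrated with a toy sketch: binarize each bubble region of the grayscale scan and decide which bubble in a question row is filled by counting dark pixels. The 4x4 "regions" below are invented stand-ins for real scanned patches, and bubble localisation (which the paper does with image processing such as the Hough circle transform) is assumed to have already happened.

```python
THRESHOLD = 128  # pixels darker than this count as marked (binary thresholding)

def dark_ratio(region):
    """Fraction of pixels in a bubble region below the threshold."""
    pixels = [p for row in region for p in row]
    return sum(1 for p in pixels if p < THRESHOLD) / len(pixels)

def read_answer(bubble_regions, fill_cutoff=0.5):
    """Return the index of the filled bubble, or None if none qualifies."""
    ratios = [dark_ratio(r) for r in bubble_regions]
    best = max(range(len(ratios)), key=lambda i: ratios[i])
    return best if ratios[best] >= fill_cutoff else None

empty  = [[250, 240, 245, 250]] * 4   # mostly white region
filled = [[30, 20, 25, 40]] * 4       # mostly dark region (pencil mark)

question = [empty, filled, empty, empty]   # options A-D; B is marked
print("ABCD"[read_answer(question)])       # B
```

The `fill_cutoff` parameter is what gives robustness to brightness variation and small imperfections: a stray dot raises the dark ratio only slightly, well below the cutoff.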

Keywords: OMR, image processing, hough circle trans-form, interpolation, detection, binary thresholding

Procedia PDF Downloads 153
18068 Repurposing of Crystalline Solar PV for Sodium Silicate Production

Authors: Lawal Alkasim, Clement M. Gonah, Zainab S. Aliyu

Abstract:

This work focuses on recovering silicon from photovoltaic cells and repurposing it for use in glass, ceramics or glass-ceramics, since the cells are made up of silicon material. Silicon is the main backbone of, and is responsible for the thermodynamic properties of, glass, ceramic and glass-ceramic materials, and the antireflection silicon layer is soluble in hot alkali. The material recovered from the crystalline/non-crystalline silicon solar cells was composed of the silicon and silicon nitride of the antireflection (AR) layer, together with small amounts of silver, aluminium, lead and copper. Aqua regia was used to remove the silver, aluminium, lead and copper. The recovered material was then treated with highly concentrated hot alkali to produce sodium silicate, an alkali silicate glass (water glass). This type of glass is produced through a chemical process, unlike other glasses that are produced through the physical process of melting and non-crystalline solidification. The product showed the characteristic property of an alkali silicate glass, being soluble in water and insoluble in alcohol, and XRF analysis confirmed the presence of sodium silicate.

Keywords: unrecyclable solar PV, crystalline silicon, hot concentrated alkali, sodium silicate

Procedia PDF Downloads 81
18067 Future Design and Innovative Economic Models for Futuristic Markets in Developing Countries

Authors: Nessreen Y. Ibrahim

Abstract:

Designing the future according to a realistic analytical study of futuristic market needs can be a milestone strategy for making huge improvements in the economies of developing countries. In developing countries, access to high technology and the latest scientific approaches is very limited, and financial problems in low- and medium-income countries negatively affect the kind and quality of new technologies and applications imported for their markets. There is therefore a strong need for a paradigm shift in the design process to improve and evolve their development strategy. This paper discusses future possibilities in developing countries, and how they can design their own future according to specific Future Design Models (FDM), established to solve particular economic problems as well as political and cultural conflicts. FDM is a strategic thinking framework that provides improvement in both content and process. The content includes beliefs, values, mission, purpose, conceptual frameworks, research, and practice, while the process includes design methodology, design systems, and design management tools. The main objective of this paper is to build an innovative economic model to design a chosen possible futuristic scenario: by understanding the market's future needs, analyzing the real-world setting, solving the model's questions through future-driven design, and finally interpreting the results to discuss to what extent they can be transferred to the real world. The paper discusses Egypt as a potential case study. Since Egypt has highly complex economic problems, extra-dynamic political factors, and very rich cultural aspects, we consider it a very challenging example for applying FDM. The paper's results recommend using FDM numerical modeling as a starting point for designing the future.

Keywords: developing countries, economic models, future design, possible futures

Procedia PDF Downloads 255
18066 Performance Measurement by Analytic Hierarchy Process in Performance Based Logistics

Authors: M. Hilmi Ozdemir, Gokhan Ozkan

Abstract:

Performance Based Logistics (PBL) is a strategic approach that enables long-term, win-win relations among stakeholders in acquisition. Contrary to traditional single transactions, in this approach the expected value is created by the performance of the service within strategic relationships. PBL motivates all relevant stakeholders to focus on their core competencies to produce the desired outcome collectively. The desired outcome can only be assured cost-effectively if it is periodically measured with the right performance parameters, so defining these parameters is a crucial step for PBL contracts. For performance parameter determination, the Analytic Hierarchy Process (AHP), a multi-criteria decision-making methodology for complex cases, was used in this study for a complex system. AHP has been extensively applied in various areas, including supply chains, inventory management, outsourcing, and logistics. The methodology made it possible to convert the end user's main operation and maintenance requirements into sub-criteria contained within a single performance parameter. Those requirements were categorized and assigned weights by the relevant stakeholders. A single performance parameter capable of measuring the overall performance of a complex system is the major outcome of this study. The parameter provides an integrated assessment of the different functions, spanning training, operation, maintenance, reporting, and documentation, that are implemented within the complex system. The aim of this study is to show the methodology and processes implemented to identify a single performance parameter for measuring the whole performance of a complex system within a PBL contract. The AHP methodology is recommended for researchers and practitioners who seek a lean and integrated approach to performance assessment within PBL contracts. The implementation of AHP in this study may help PBL practitioners from a methodological perspective and add value to AHP in becoming more prevalent.
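The weighting step at the heart of AHP can be sketched with the geometric-mean (approximate eigenvector) method: stakeholders fill a pairwise comparison matrix on Saaty's 1-9 scale, and normalized row geometric means yield the criterion weights. The criteria and judgments below are invented; the paper does not publish its pairwise matrix.

```python
import math

criteria = ["operation", "maintenance", "training"]
# pairwise[i][j]: how much more important criterion i is than j (Saaty scale);
# the lower triangle holds the reciprocals of the upper triangle.
pairwise = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]

def ahp_weights(matrix):
    """Normalized row geometric means approximate the principal eigenvector."""
    gms = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

w = ahp_weights(pairwise)
for name, weight in zip(criteria, w):
    print(f"{name}: {weight:.3f}")
```

In a full AHP application one would also compute the consistency ratio to check that the stakeholders' judgments are not contradictory before accepting the weights.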

Keywords: analytic hierarchy process, performance based logistics, performance measurement, performance parameters

Procedia PDF Downloads 270
18065 Case Study of Sexual Violence Victim Assessment in Semarang Regency

Authors: Sujana T, Kurniasari MD, Ayakeding AM

Abstract:

Background: Sexual violence is one of the forms of violence with a high incidence in Indonesia. Purpose: This research aims to describe the implementation of sexual violence victim assessment in Semarang Regency. Method: This research is a qualitative study with an embedded single case study design. Data were analyzed using two units of analysis. The first unit of analysis was victims' examiners with at least one year of work experience; semi-structured interviews were used to obtain the data. The second unit of analysis was related documents; these data were collected by observing the pathway and description of every document and how it supported each step of the assessment. Results: This study yielded three themes. The first theme is that assessments of sexual violence in Semarang Regency have been standardized. The laws of the Republic of Indonesia regulate the handling of victims of sexual violence in outline. Victims of sexual violence can be handled by the police, the Integrated Service Center for Women and Children Empowerment, and the Regional General Hospital, and each examination site has its own standard operating procedures for dealing with victims of sexual violence. Cooperation with family and witnesses is also required in the review process to obtain accurate results and evidence. The second theme is that there are inhibiting factors in the assessment process: victims sometimes feel embarrassed and reluctant to recount the chronology of events during reporting, so the examining officer must be able to approach the victim and build the trust needed for cooperation. The third theme is that other things must be considered in the process of assessing victims of sexual violence: ensuring implementation in accordance with applicable standard operating procedures, providing exclusive examination rooms, offering counseling, and safeguarding the privacy of victims are all important considerations in the assessment.

Keywords: assessment, case study, Semarang regency, sexual violence

Procedia PDF Downloads 128
18064 Structural Changes and Formation of Calcium Complexes in Corn Starch Processed by Nixtamalization

Authors: Arámbula-Villa Gerónimo, García-Lara Kenia Y., Figueroa-Cárdenas J. D., Pérez-Robles J. F., Jiménez-Sandoval S., Salazar-López R., Herrera-Corredor J. A.

Abstract:

The nixtamalization process (a thermal-alkaline method) improves the nutritional value of the corn grain. In this process the use of Ca(OH)₂ is fundamental, although the chemical mechanisms between this alkali and the carbohydrates (starch), proteins, lipids, and fiber have not been fully identified. In this study, native corn starch was taken as a model and subjected to cooking with different concentrations of lime (nixtamalization process); FTIR and XRD studies were carried out to identify the formation of chemical compounds, and the physical, physicochemical, rheological (paste) and structural properties of the material obtained were determined. The FTIR spectra showed the formation of calcium-starch complexes. The treatments with Ca(OH)₂ showed a band shift towards 1675 cm⁻¹ and a band at 1436 cm⁻¹ (COO⁻), indicating the oxidation of the starch. Three bands characteristic of carboxylic acid salts were identified (1575, 1550, and 1540 cm⁻¹), corresponding to three types of coordinated structures: monodentate, pseudo-bridged, and bidentate. The XRD spectra of starch treated with Ca(OH)₂ showed a peak corresponding to CaCO₃ (29.40°). The oxidation of starch was favored at low concentrations of Ca(OH)₂, producing carboxyl and carbonyl groups and increasing the residual CaCO₃. Higher concentrations of Ca(OH)₂ led to the formation of calcium carboxylates, with a decrease in relative crystallinity and residual CaCO₃. Samples with low concentrations of Ca(OH)₂ showed a delayed onset of gelatinization and increased granule swelling and peak viscosity, whereas higher concentrations of Ca(OH)₂ hindered water absorption and decreased the viscosity rate and peak viscosity. These results can be used to improve the quality characteristics of the dough and tortillas and to achieve better acceptance by consumers.

Keywords: maize starch, nixtamalization, gelatinization, calcium carboxylates

Procedia PDF Downloads 79
18063 A Geographical Information System Supported Method for Determining Urban Transformation Areas in the Scope of Disaster Risks in Kocaeli

Authors: Tayfun Salihoğlu

Abstract:

Following Law No. 6306 on the Transformation of Areas under Disaster Risk, urban transformation in Turkey found its legal basis. In best practices all over the world, urban transformation has been shaped as part of comprehensive social programs, through discourses of renewing the economically, socially and physically degraded parts of the city, producing spaces resistant to earthquakes and other possible disasters, and creating a livable environment. In Turkish practice, a contradictory process is observed. This study aims to develop a method for better understanding urban space in terms of disaster risks, in order to constitute a basis for decisions in the Kocaeli Urban Transformation Master Plan being prepared by Kocaeli Metropolitan Municipality. The spatial unit used in the study is a 50x50 metre grid. To reflect the multidimensionality of urban transformation, three basic components for which spatial data exist in Kocaeli were identified and named 'Problems in Built-up Areas', 'Disaster Risks Arising from Geological Conditions of the Ground and Problems of Buildings', and 'Inadequacy of Urban Services'. Each component was weighted and scored for each grid cell. To delimit urban transformation zones, Optimized Outlier Analysis (Local Moran's I) was conducted in ArcGIS 10.6.1 to test the type of distribution (clustered or scattered) and its significance across the grid cells, using the weighted total score of each cell as the input feature. This analysis found that the weighted total scores do not cluster significantly in all grid cells of the urban space; the cells in which the input feature clusters significantly were exported as a new database for further mapping.
The Total Score Map reflects the significant clusters in terms of the weighted total scores of 'Problems in Built-up Areas', 'Disaster Risks Arising from Geological Conditions of the Ground and Problems of Buildings' and 'Inadequacy of Urban Services'. The grid cells with the highest scores are the most likely candidates for urban transformation in this citywide study. To categorize urban space in terms of urban transformation, Grouping Analysis was conducted in ArcGIS 10.6.1 on the component scores of the significantly clustered cells. Based on the pseudo-F statistics and box plots, the six groups with the highest F statistics were extracted. Mapping these groups shows that they can be interpreted meaningfully in relation to the urban space. The method presented in this study can be extended as more spatial data become available; by integrating it with other data obtained during the planning process, it can contribute to the research and decision-making processes of urban transformation master plans on a more consistent basis.
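The clustering statistic behind the analysis above, Anselin's Local Moran's I, can be sketched on a toy grid of weighted transformation scores using rook-contiguity neighbours. This is a simplified stand-in for ArcGIS's Optimized Outlier Analysis (which additionally computes permutation-based significance); the grid values are invented.

```python
# Toy 4x4 grid of weighted total scores; the top-left block is a
# high-score cluster, the rest is uniformly low.
grid = [
    [9, 8, 1, 1],
    [8, 9, 1, 1],
    [1, 1, 1, 1],
    [1, 1, 1, 2],
]

def local_morans_i(g):
    """Local Moran's I per cell, row-standardized rook neighbours."""
    cells = [(r, c) for r in range(len(g)) for c in range(len(g[0]))]
    vals = [g[r][c] for r, c in cells]
    mean = sum(vals) / len(vals)
    m2 = sum((v - mean) ** 2 for v in vals) / len(vals)
    result = {}
    for r, c in cells:
        nbrs = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= r + dr < len(g) and 0 <= c + dc < len(g[0])]
        lag = sum(g[nr][nc] - mean for nr, nc in nbrs) / len(nbrs)
        result[(r, c)] = (g[r][c] - mean) / m2 * lag
    return result

I = local_morans_i(grid)
# Positive I marks cluster membership (high-high or low-low);
# negative I marks spatial outliers (e.g. a low cell next to high ones).
print(round(I[(0, 0)], 2), round(I[(0, 2)], 2))
```

Cells with significantly positive I and high scores would correspond to the candidate transformation zones exported for further mapping.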

Keywords: urban transformation, GIS, disaster risk assessment, Kocaeli

Procedia PDF Downloads 109
18062 Macroalgae as a Gaseous Fuel Option: Potential and Advanced Conversion Technologies

Authors: Muhammad Rizwan Tabassum, Ao Xia, Jerry D. Murphy

Abstract:

The aim of this work is to provide an overview of macroalgae as an alternative feedstock for gaseous fuel production, and of the key innovative technologies involved. Climate change and continuously depleting resources are the key driving forces behind the search for alternative sources of energy. Macroalgae can be favoured over land-based energy crops because they are not in direct competition with food crops. However, drawbacks such as high moisture content, seasonal variation in chemical composition and process inhibition limit their economic practicability. Macroalgae, such as brown seaweeds, can be converted into gaseous and liquid fuels by different conversion technologies. Biomethane via anaerobic digestion is the most appealing technology owing to its dual advantage of being commercially applicable and environmentally friendly, while biodiesel and bioethanol conversion technologies for seaweed are still in development. Screening of high-yielding macroalgae species, identification of the peak harvesting season and process optimization could make the technology economically feasible as an alternative feedstock source for biofuel production in the future.

Keywords: anaerobic digestion, biofuels, bio-methane, advanced conversion technologies, macroalgae

Procedia PDF Downloads 296
18061 Analysis of Citation Rate and Data Reuse for Openly Accessible Biodiversity Datasets on Global Biodiversity Information Facility

Authors: Nushrat Khan, Mike Thelwall, Kayvan Kousha

Abstract:

Making research data openly accessible has been mandated by most funders over the last 5 years, as it promotes reproducibility in science and reduces duplication of effort in collecting the same data. There is evidence that articles that publicly share research data have higher citation rates in the biological and social sciences. However, how and whether shared data are being reused is not always apparent, as such information is not easily accessible from the majority of research data repositories. This study aims to understand the practice of data citation and how data are reused over the years, focusing on biodiversity, since research data are frequently reused in this field. Metadata for 38,878 datasets, including citation counts, were collected through the Global Biodiversity Information Facility (GBIF) API for this purpose. GBIF was used as a data source because it provides citation counts for datasets, a feature not commonly available in most repositories. Analysis of dataset types, citation counts, and the creation and update times of datasets suggests that citation rates vary by dataset type: occurrence datasets, which contain more granular information, have higher citation rates than checklist and metadata-only datasets. Another finding is that biodiversity datasets on GBIF are frequently updated, which is unique to this field; the majority of the datasets from the earliest year, 2007, were updated over the subsequent 11 years, and no dataset was left unchanged since creation. For each year between 2007 and 2017, we compared the correlations between update time and citation rate for four different types of datasets. While recent datasets show no correlation, 3- to 4-year-old datasets show a weak correlation, with more recently updated datasets receiving more citations. The results suggest that it takes several years for research datasets to accumulate citations.
However, this investigation also found that when the same datasets are searched on Google Scholar or Scopus, the number of citations is often not the same as on GBIF. Hence, a future aim is to further explore the citation count system adopted by GBIF, to evaluate its reliability and whether it can be applied to other fields of study as well.
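A correlation analysis of the kind described above can be sketched with a plain-Python Spearman rank correlation between "years since last update" and citation count. The data values are invented for illustration; the study's actual per-year, per-type correlations are not reproduced here.

```python
# Hypothetical datasets: the longer since the last update, the fewer citations.
years_since_update = [1, 2, 3, 4, 5, 6, 7, 8]
citations =          [40, 35, 30, 22, 20, 12, 8, 5]

def rank(xs):
    """1-based ranks; no tie handling, since the toy data has no ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos + 1.0
    return r

def spearman(x, y):
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

rho = spearman(years_since_update, citations)
print(rho)  # -1.0: a perfectly monotone decreasing relationship
```

Real data would of course yield a weaker coefficient, and tied ranks would require the tie-corrected formula.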

Keywords: data citation, data reuse, research data sharing, webometrics

Procedia PDF Downloads 164
18060 Sourcing and Compiling a Maltese Traffic Dataset MalTra

Authors: Gabriele Borg, Alexei De Bono, Charlie Abela

Abstract:

There is a constant rise in the availability of high volumes of data gathered from multiple sources, resulting in an abundance of unprocessed information that can be used to monitor patterns and trends in user behaviour. Similarly, year after year, Malta is experiencing ongoing population growth and an increase in mobility demand. This research takes advantage of data that is continuously being sourced and converts it into useful information related to the traffic problem on Maltese roads. The scope of this paper is to provide a methodology for creating a custom dataset (MalTra - Malta Traffic), compiled from multiple participants at various locations across the island, to identify the most commonly taken routes and thereby expose the main areas of activity. Such uses of big data underpin Intelligent Transportation Systems (ITSs), and we conclude that there is significant potential in utilising such data sources on a nationwide scale.
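The route-aggregation step described above can be sketched as a frequency count over (origin, destination) pairs collected from participants. The localities and trip records below are example values, not actual MalTra data.

```python
from collections import Counter

# Hypothetical trips reported by participants as (origin, destination) pairs.
trips = [
    ("Mosta", "Valletta"), ("Sliema", "Valletta"),
    ("Mosta", "Valletta"), ("Valletta", "Mosta"),
    ("Mosta", "Valletta"), ("Sliema", "Valletta"),
]

route_counts = Counter(trips)                      # frequency of each route
most_common_route, n = route_counts.most_common(1)[0]
print(most_common_route, n)  # ('Mosta', 'Valletta') 3
```

Ranking routes this way directly exposes the main corridors of activity that the dataset is meant to surface.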

Keywords: big data, vehicular traffic, traffic management, mobile data patterns

Procedia PDF Downloads 95
18059 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data

Authors: Adarsh Shroff

Abstract:

Big data is a collection of datasets so large and complex that they become difficult to process using database management tools. Operations such as search, analysis and visualization are performed on big data using data mining, the process of extracting patterns or knowledge from large datasets. In recent years it has become clear that the results of data mining applications grow stale and obsolete over time. Incremental processing is a promising approach to refreshing mining results: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task-level re-computation, supports not only one-step computation but also the more sophisticated iterative computation widely used in data mining applications, and incorporates a set of novel techniques to reduce the I/O overhead of accessing preserved fine-grained computation states. To evaluate the mining results, we assess i2MapReduce using a one-step algorithm and three iterative algorithms with diverse computation characteristics for efficient mining.
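The idea of key-value pair level incremental processing can be illustrated with a minimal in-memory word-count sketch: when one input document changes, only its old map output is retracted and its new output applied, instead of recomputing the whole count from scratch. This is a simplification in the spirit of i2MapReduce, not its actual implementation.

```python
from collections import Counter

def map_doc(text):
    """Map phase for one input: emit (word, count) pairs."""
    return Counter(text.split())

def initial_run(docs):
    """Full run: map every document and preserve per-input state."""
    state = {doc_id: map_doc(t) for doc_id, t in docs.items()}
    totals = Counter()
    for partial in state.values():
        totals += partial
    return state, totals

def incremental_update(state, totals, doc_id, new_text):
    """Re-map only the changed input, using the saved state."""
    totals -= state.get(doc_id, Counter())   # retract stale key-value pairs
    state[doc_id] = map_doc(new_text)        # re-map just this input
    totals += state[doc_id]                  # apply the fresh pairs
    return totals

docs = {"d1": "big data mining", "d2": "data mining tools"}
state, totals = initial_run(docs)
totals = incremental_update(state, totals, "d2", "data stream mining")
print(totals["data"], totals["tools"], totals["stream"])  # 2 0 1
```

The preserved `state` plays the role of the fine-grained computation states whose I/O cost i2MapReduce works to reduce.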

Keywords: big data, map reduce, incremental processing, iterative computation

Procedia PDF Downloads 331
18058 Bottleneck Modeling in Information Technology Service Management

Authors: Abhinay Puvvala, Veerendra Kumar Rai

Abstract:

A bottleneck situation arises when the outflow is less than the inflow in a pipe-like setup. A more practical interpretation of bottlenecks emphasizes the realization of Service Level Objectives (SLOs) at given workloads. Our approach detects two key aspects of bottlenecks: when and where. To identify 'when', we continuously poll certain key metrics, such as resource utilization, processing time, request backlog and throughput, at the system level. A bottleneck situation arises when, as the workload is gradually increased in discrete steps, the slope of the expected sojourn time at a workload is greater than 'K' times the slope of the expected sojourn time at the previous workload step. 'K' defines the threshold condition and is computed from the system's service level objectives. The second aspect of our approach is to identify the location of the bottleneck. In multi-tier systems with a complex network of layers, locating the bottleneck that affects overall system performance is a challenging problem. We stage the system by varying the workload incrementally to draw a correlation between load increase and system performance, up to the point where the Service Level Objectives are violated. During the staging process, multiple metrics are monitored at the hardware and application levels, and correlations are drawn between these metrics and overall system performance. These correlations, together with the Service Level Objectives, are used to arrive at threshold conditions for each of the metrics. Subsequently, the same method used to identify when a bottleneck occurs is applied to the metrics data, with these threshold conditions, to locate bottlenecks.
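The 'when' detection can be sketched directly from the slope condition above: flag a bottleneck once the slope of expected sojourn time between successive workload steps exceeds K times the previous slope. The workloads, sojourn times and K value are invented; in the approach described, K would be derived from the system's SLOs.

```python
K = 2.0  # hypothetical threshold multiplier, normally derived from SLOs

def detect_bottleneck(workloads, sojourn_times, k=K):
    """Return the workload at which the slope condition first fires, else None."""
    for i in range(2, len(workloads)):
        prev_slope = ((sojourn_times[i - 1] - sojourn_times[i - 2])
                      / (workloads[i - 1] - workloads[i - 2]))
        slope = ((sojourn_times[i] - sojourn_times[i - 1])
                 / (workloads[i] - workloads[i - 1]))
        if prev_slope > 0 and slope > k * prev_slope:
            return workloads[i]
    return None

workloads = [100, 200, 300, 400, 500]     # requests/sec, in discrete steps
sojourn = [0.10, 0.12, 0.15, 0.40, 1.20]  # expected sojourn time (s)

print(detect_bottleneck(workloads, sojourn))  # 400
```

The sharp knee between 300 and 400 requests/sec is exactly the kind of non-linear growth in sojourn time that the staging process is designed to expose.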

Keywords: bottleneck, workload, service level objectives (SLOs), throughput, system performance

Procedia PDF Downloads 216
18057 Laser-Dicing Modeling: Implementation of a High Accuracy Tool for Laser-Grooving and Cutting Application

Authors: Jeff Moussodji, Dominique Drouin

Abstract:

The highly complex technology requirements of today’s integrated circuits (ICs) lead to the increased use of several material types, such as metal structures and brittle, porous low-k materials, which are used in both front end of line (FEOL) and back end of line (BEOL) wafer manufacturing processes. In order to singulate chips from the wafer, a critical laser-grooving process, prior to blade dicing, is used to remove these material layers from the dicing street. The combination of laser grooving and blade dicing reduces the potential risk of induced mechanical defects such as micro-cracks and chipping on the wafer top surface, where the circuitry is located. It therefore seems essential to have a fundamental understanding of the physics involved in laser dicing in order to maximize control of these critical processes and reduce their undesirable effects on process efficiency, quality, and reliability. In this paper, the study was based on the convergence of two approaches, numerical and experimental, which allowed us to investigate the interaction of a nanosecond pulsed laser and BEOL wafer materials. To evaluate this interaction, several laser-grooved samples were compared with finite element modeling, in which three different aspects (phase change, thermo-mechanical behavior, and optically sensitive parameters) were considered. The mathematical model makes it possible to predict the groove profile (depth, width, etc.) of a single pulse or multiple pulses on BEOL wafer material. Moreover, the heat-affected zone and thermo-mechanical stress can also be predicted as functions of the laser operating parameters (power, frequency, spot size, defocus, speed, etc.). After model validation and calibration, a satisfying correlation between experimental and modeling results has been observed in terms of groove depth, width, and heat-affected zone.
The study proposed in this work is a first step toward implementing a quick assessment tool for the design and debugging of multiple laser-grooving conditions with limited experiments on hardware in industrial applications. More correlation and validation tests are in progress and will be included in the full paper.

Keywords: laser-dicing, nano-second pulsed laser, wafer multi-stack, multiphysics modeling

Procedia PDF Downloads 192
18056 Optimization of Beneficiation Process for Upgrading Low Grade Egyptian Kaolin

Authors: Nagui A. Abdel-Khalek, Khaled A. Selim, Ahmed Hamdy

Abstract:

Kaolin is a naturally occurring ore predominantly containing the mineral kaolinite in addition to some gangue minerals. Typical impurities present in kaolin ore are quartz, iron oxides, titanoferrous minerals, mica, feldspar, organic matter, etc. The main coloring impurity, particularly in the ultrafine size range, is titanoferrous minerals. Kaolin is used in many industrial applications, such as sanitary ware, tableware, ceramics, paint, and paper, each of which demands certain specifications. For most industrial applications, kaolin should be processed to obtain refined clay that matches standard specifications. For example, kaolin used in the paper and paint industries needs to be of high brightness and low yellowness. Egyptian kaolin is not subjected to any beneficiation process; the Egyptian companies apply selective mining followed, in some localities, by crushing and size reduction only. Such low-quality kaolin can be used in refractory and pottery production but not in the whiteware and paper industries. This paper aims to study the amenability to beneficiation of an Egyptian kaolin ore from the El-Teih locality, Sinai, to make it suitable for different industrial applications. Attrition scrubbing and classification followed by magnetic separation are applied to remove the associated impurities: attrition scrubbing and classification separate the coarse silica and feldspars, while wet high-intensity magnetic separation removes colored contaminants such as iron oxide and titanium oxide. Different variables affecting the magnetic separation process, such as solids percentage, magnetic field, matrix loading capacity, and retention time, are studied. The results indicated that a substantial decrease in iron oxide (from 1.69% to 0.61%) and TiO2 (from 3.1% to 0.83%) contents, as well as improved ISO brightness (from 63.76% to 75.21%) and whiteness (from 79.85% to 86.72%) of the product, can be achieved.

Keywords: kaolin, titanoferrous minerals, beneficiation, magnetic separation, attrition scrubbing, classification

Procedia PDF Downloads 343
18055 Exploring the Relationship between Computerization and Marketing Performance Case Study: Snowa Company

Authors: Mojtaba Molaahmadi, Morteza Raei Dehaghi, Abdolrahim Arghavan

Abstract:

The present study aims to explore the effect of computerization on marketing performance in Snowa Company. In other words, it intends to answer the question of whether there is a relationship between the utilization of computerization in marketing activities and marketing performance. The statistical population included 60 marketing managers of Snowa Company. In order to test the research hypotheses, the Pearson correlation coefficient was employed. The reliability was equal to 96.8%. In this study, computerization was the independent variable and marketing performance was the dependent variable, characterized by market share, improvement of competitive position, and sales volume. The results of testing the hypotheses revealed that there is a significant relationship between the utilization of computerization and market share, sales volume, and improvement of competitive position.
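The hypothesis test the study relies on can be illustrated with a minimal Pearson correlation computation (a pure-Python sketch on made-up numbers, not the study's survey data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A coefficient near +1 or -1 indicates a strong linear relationship between, say, a computerization score and a marketing-performance measure; significance would additionally require a t-test on r given the sample size.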

Keywords: computerization, e-marketing information, information technology, marketing performance

Procedia PDF Downloads 317
18054 Dental Pathologies and Diet in Pre-hispanic Populations of the Equatorial Pacific Coast: Literature Review

Authors: Ricardo Andrés Márquez Ortiz

Abstract:

Objective. The objective of this literature review is to compile updated information from studies that have addressed the association between dental pathologies and diet in prehistoric populations of the equatorial Pacific coast. Materials and method. The research corresponds to a documentary study with an ex post facto retrospective, historiographic, and bibliometric design. A bibliographic search was carried out in the libraries of the Colombian Institute of Anthropology and History (ICANH) and the National University of Colombia for books and articles on the archeology of the region. In addition, databases and the Internet were searched for books and articles on dental anthropology, archeology, and dentistry addressing the relationship between dental pathologies and diet in prehistoric and current populations from different parts of the world. Conclusions. The complex societies (500 BC - 300 AD) of the equatorial Pacific coast used an agricultural system of intensive monoculture of corn (Zea mays). This form of subsistence was reflected in an intensification of dental pathologies such as dental caries, dental abscesses generated by cavities, and enamel hypoplasia, associated with a lower frequency of wear. The Upper Formative period (800 A.D. - 16th century A.D.) is characterized by the development of polyculture, slash-and-burn agriculture, as an adaptive agricultural strategy in response to the ecological damage generated by the intensive economic activity of the complex societies. This process led to a more varied diet, which generated better dental health.

Keywords: dental pathologies, nutritional diet, equatorial pacific coast, dental anthropology

Procedia PDF Downloads 35
18053 An Information-Based Approach for Preference Method in Multi-Attribute Decision Making

Authors: Serhat Tuzun, Tufan Demirel

Abstract:

Multi-Criteria Decision Making (MCDM) is the modelling of real-life problems in order to solve them; it is a discipline that aids decision makers faced with conflicting alternatives in reaching an optimal decision. MCDM problems can be classified into two main categories, Multi-Attribute Decision Making (MADM) and Multi-Objective Decision Making (MODM), based on their different purposes and data types. Although various MADM techniques have been developed for the problems encountered, their methodology is limited in modelling real life. Moreover, objective results are hard to obtain, and findings are generally derived from subjective data. New and modified techniques have been developed around approaches such as fuzzy logic, but these more comprehensive techniques, even though better at modelling real life, have not found a place in real-world applications because their complex structure makes them hard to apply. These constraints restrict the development of MADM. This study aims to conduct a comprehensive analysis of preference methods in MADM and to propose an approach based on information. For this purpose, a detailed literature review has been conducted, and current approaches have been analyzed along with their advantages and disadvantages. The approach is then introduced: performance values of the criteria are calculated in two steps, first by determining the distribution of each attribute and standardizing the values, and then by calculating the information of each attribute as informational energy.
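The two-step computation described above can be sketched as follows, using Onicescu-style informational energy (the sum of squared probabilities of a discrete distribution); the equal-width binning is an assumption for illustration, not the paper's procedure:

```python
def informational_energy(values, bins=4):
    """Informational energy of one attribute: min-max standardize the
    values, bin them into a discrete distribution, then return the sum
    of squared probabilities. Higher energy means a more concentrated
    (less uniform) attribute."""
    lo, hi = min(values), max(values)
    if hi == lo:                      # degenerate attribute: fully concentrated
        return 1.0
    scaled = [(v - lo) / (hi - lo) for v in values]   # standardization step
    counts = [0] * bins
    for s in scaled:
        counts[min(int(s * bins), bins - 1)] += 1
    n = len(values)
    return sum((c / n) ** 2 for c in counts)          # informational energy
```

A uniformly spread attribute yields the minimum energy of 1/bins, while an attribute concentrated in one bin approaches 1, so the measure ranks attributes by how much information their distribution carries.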

Keywords: literature review, multi-attribute decision making, operations research, preference method, informational energy

Procedia PDF Downloads 205
18052 Comparative Isotherms Studies on Adsorptive Removal of Methyl Orange from Wastewater by Watermelon Rinds and Neem-Tree Leaves

Authors: Sadiq Sani, Muhammad B. Ibrahim

Abstract:

Watermelon rinds powder (WRP) and neem-tree leaves powder (NLP) were used as adsorbents in equilibrium adsorption isotherm studies for the detoxification of methyl orange dye (MO) from simulated wastewater. The applicability of the process to various isotherm models was tested. All isotherms fitted to the experimental data showed excellent linear reliability (R2: 0.9487-0.9992), but adsorption onto WRP was more reliable (R2: 0.9724-0.9992) than onto NLP (R2: 0.9487-0.9989), except for the Temkin isotherm, where reliability was better onto NLP (R2: 0.9937) than onto WRP (R2: 0.9935). The Dubinin-Radushkevich monolayer adsorption capacities for both WRP and NLP (qD: 20.72 mg/g, 23.09 mg/g) were higher than the Langmuir capacities (qm: 18.62 mg/g, 21.23 mg/g), with both capacities higher for adsorption onto NLP (qD: 23.09 mg/g; qm: 21.23 mg/g) than onto WRP (qD: 20.72 mg/g; qm: 18.62 mg/g). While the Langmuir separation factor (RL) values for both adsorbents suggested unfavourable adsorption processes (RL: -0.0461, -0.0250), the Freundlich constant (nF) indicated a favourable process onto both WRP (nF: 3.78) and NLP (nF: 5.47). Adsorption onto NLP had a higher Dubinin-Radushkevich mean free energy of adsorption (E: 0.13 kJ/mol) than WRP (E: 0.08 kJ/mol), and the Temkin heat of adsorption (bT) was better onto NLP (bT: -0.54 kJ/mol) than onto WRP (bT: -0.95 kJ/mol), all of which suggested physical adsorption.
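The Langmuir quantities the abstract reports (the capacity qm and the separation factor RL) follow from the standard isotherm relations, sketched below; the parameter values used in the assertions are illustrative, not fitted to the paper's data:

```python
def langmuir_qe(ce, qm, kl):
    """Langmuir isotherm: equilibrium uptake qe (mg/g) at equilibrium
    concentration Ce, with monolayer capacity qm and constant KL."""
    return qm * kl * ce / (1 + kl * ce)

def separation_factor(kl, c0):
    """Dimensionless Langmuir separation factor RL = 1 / (1 + KL * C0).
    0 < RL < 1 indicates favourable adsorption; RL > 1 or RL < 0
    indicates an unfavourable process, as reported for both adsorbents."""
    return 1.0 / (1.0 + kl * c0)
```

At Ce = 1/KL the Langmuir form predicts exactly half the monolayer capacity, and a negative fitted KL (large enough that KL*C0 < -1) yields the negative RL values quoted in the abstract.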

Keywords: adsorption isotherms, methyl orange, neem leaves, watermelon rinds

Procedia PDF Downloads 253
18051 Modified Design of Flyer with Reduced Weight for Use in Textile Machinery

Authors: Payal Patel

Abstract:

Textile machinery is one of the fastest-evolving application areas of mechanical engineering. The modular approach to processing, right from the cotton stage through to the fabric, allows us to observe the result of each process on its input, with cost and space being the major constraints. The flyer is a component of the roving machine, which is used as part of the spinning process. In the present work, using HyperWorks, the flyer arm has been modified to save the material used in manufacturing the flyer. The size optimization of the flyer is carried out with the objective of weight reduction under the constraints of standard operating conditions. A new design of the flyer is proposed and validated using the HyperWorks module; it is equally strong but lighter than the existing design. Dynamic balancing of the optimized model is carried out to align a principal inertia axis with the geometric axis of rotation. For the balanced flyer geometry, air resistance is obtained both theoretically and with Gambit and Fluent. Static analysis of the balanced geometry has been done to verify the operating-condition constraint. Weight, deflection, and factor of safety have been compared for different aluminum alloys.

Keywords: flyer, size optimization, textile, weight

Procedia PDF Downloads 196
18050 Vehicle Timing Motion Detection Based on Multi-Dimensional Dynamic Detection Network

Authors: Jia Li, Xing Wei, Yuchen Hong, Yang Lu

Abstract:

Detecting vehicle behavior has always been a focus of intelligent transportation, but with the explosive growth in the number of vehicles and the complexity of the road environment, the vehicle behavior videos captured by traditional surveillance can no longer satisfy the study of vehicle behavior. The traditional method of manually labeling vehicle behavior is too time-consuming and labor-intensive, while existing object detection and tracking algorithms have poor practicability and low behavioral-location detection rates. This paper proposes a vehicle behavior detection algorithm based on a dual-stream convolution network and a multi-dimensional video dynamic detection network. In the videos, straight-line driving is treated as background behavior by default; changing lanes, turning, and turning around are set as target behaviors. The purpose of this model is to automatically mark the target behaviors of vehicles in untrimmed videos. First, target behavior proposals in the long video are extracted through the dual-stream convolution network: the network generates a one-dimensional action score waveform, and segments with scores above a given threshold M are extracted as preliminary vehicle behavior proposals. Second, the preliminary proposals are pruned and identified using the multi-dimensional video dynamic detection network. Drawing on hierarchical reinforcement learning, the multi-dimensional network includes a Timer module and a Spacer module, where the Timer module mines temporal information in the video stream and the Spacer module extracts spatial information in the video frames. Both modules are implemented with Long Short-Term Memory (LSTM) networks and start from an all-zero hidden state. The Timer module uses the Transformer mechanism to extract timing information from the video stream and extracts features through linear mapping and other methods.
Finally, the model fuses temporal and spatial information and obtains the location and category of each behavior through a softmax layer. This paper uses recall and precision to measure the performance of the model. Extensive experiments show that, on the dataset used in this paper, the proposed model has obvious advantages over existing state-of-the-art behavior detection algorithms. When the Time Intersection over Union (TIoU) threshold is 0.5, the Average Precision (MP) reaches 36.3% (the MP of the baselines is 21.5%). In summary, this paper proposes a vehicle behavior detection model based on a multi-dimensional dynamic detection network, introducing spatial and temporal information to extract vehicle behaviors in long videos. Experiments show that the proposed algorithm is advanced and accurate in vehicle timing behavior detection. Future work will focus on simultaneously detecting the timing behavior of multiple vehicles in complex traffic scenes (such as a busy street) while ensuring accuracy.
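The proposal-extraction step, segmenting the one-dimensional action score waveform at threshold M, can be sketched as follows (a minimal illustration; the scores would come from the dual-stream network, which is not reproduced here):

```python
def extract_proposals(scores, m):
    """Extract contiguous segments whose per-frame action score stays
    at or above threshold m, returned as (start, end) frame indices.
    These are the preliminary vehicle behavior proposals drawn from
    the 1-D action score waveform."""
    proposals, start = [], None
    for i, s in enumerate(scores):
        if s >= m and start is None:
            start = i                         # segment opens
        elif s < m and start is not None:
            proposals.append((start, i - 1))  # segment closes
            start = None
    if start is not None:                     # waveform ends inside a segment
        proposals.append((start, len(scores) - 1))
    return proposals
```

Each proposal would then be passed to the Timer and Spacer modules for pruning and classification.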

Keywords: vehicle behavior detection, convolutional neural network, long short-term memory, deep learning

Procedia PDF Downloads 114
18049 Thermo-Oxidative Degradation of Esterified Starch (with Lauric Acid) -Plastic Composite Assembled with Pro-Oxidants and Elastomers

Authors: R. M. S. Sachini Amararathne

Abstract:

This research strives to develop a thermo-degradable starch-plastic compound/masterbatch for industrial packaging applications. A native corn starch, modified by an esterification reaction with lauric acid, is melt-blended with an unsaturated elastomer (styrene-butadiene rubber/styrene-butadiene-styrene). A trace amount of a metal salt is added in the internal mixer to study the effect of pro-oxidants in a thermo-oxidative environment. The granulated polymer composite, which consists of 80-86% polyolefin (LLDPE/LDPE/PP) as the pivotal agent, is then extruded with processing aids, antioxidants, and other additives in a co-rotating twin-screw extruder. The pelletized composite is subjected to compression molding, injection molding, or blown-film extrusion to obtain the samples/specimens for testing. The degradation process is explicated by analyzing Fourier transform infrared spectroscopy (FTIR) measurements and thermo-oxidative aging studies (placing the dumbbell specimens in an air oven at 70 °C for four weeks of exposure), supported by tensile and impact strength test reports. Furthermore, samples were exposed outdoors in manifold environments to inspect the degradation process. This industrial process is implemented to reduce the volume of fossil-based garbage by achieving biodegradability and compostability in the natural cycle. Hence the research leads to the manufacture of a degradable plastic packaging compound which is now available in the Sri Lankan market.

Keywords: blown film extrusion, compression moulding, polyolefin, pro-oxidant, styrene-butadiene rubber, styrene-butadiene-styrene, thermo-oxidative aging, unsaturated elastomer

Procedia PDF Downloads 84
18048 Machine Learning for Disease Prediction Using Symptoms and X-Ray Images

Authors: Ravija Gunawardana, Banuka Athuraliya

Abstract:

Machine learning has emerged as a powerful tool for disease diagnosis and prediction. The use of machine learning algorithms has the potential to improve the accuracy of disease prediction, thereby enabling medical professionals to provide more effective and personalized treatments. This study focuses on developing a machine-learning model for disease prediction using symptoms and X-ray images. The importance of this study lies in its potential to assist medical professionals in accurately diagnosing diseases, thereby improving patient outcomes. Respiratory diseases are a significant cause of morbidity and mortality worldwide, and chest X-rays are commonly used in the diagnosis of these diseases. However, accurately interpreting X-ray images requires significant expertise and can be time-consuming, making it difficult to diagnose respiratory diseases in a timely manner. By incorporating machine learning algorithms, we can significantly enhance disease prediction accuracy, ultimately leading to better patient care. The study utilized the Mask R-CNN algorithm, which is a state-of-the-art method for object detection and segmentation in images, to process chest X-ray images. The model was trained and tested on a large dataset of patient information, which included both symptom data and X-ray images. The performance of the model was evaluated using a range of metrics, including accuracy, precision, recall, and F1-score. The results showed that the model achieved an accuracy rate of over 90%, indicating that it was able to accurately detect and segment regions of interest in the X-ray images. In addition to X-ray images, the study also incorporated symptoms as input data for disease prediction. The study used three different classifiers, namely Random Forest, K-Nearest Neighbor and Support Vector Machine, to predict diseases based on symptoms. These classifiers were trained and tested using the same dataset of patient information as the X-ray model. 
The results showed promising accuracy rates for predicting diseases using symptoms, with the ensemble learning techniques significantly improving the accuracy of disease prediction. The study's findings indicate that the use of machine learning algorithms can significantly enhance disease prediction accuracy, ultimately leading to better patient care. The model developed in this study has the potential to assist medical professionals in diagnosing respiratory diseases more accurately and efficiently. However, it is important to note that the accuracy of the model can be affected by several factors, including the quality of the X-ray images, the size of the dataset used for training, and the complexity of the disease being diagnosed. In conclusion, the study demonstrated the potential of machine learning algorithms for disease prediction using symptoms and X-ray images. The use of these algorithms can improve the accuracy of disease diagnosis, ultimately leading to better patient care. Further research is needed to validate the model's accuracy and effectiveness in a clinical setting and to expand its application to other diseases.
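The ensemble step over the three symptom classifiers can be illustrated with a simple hard-voting combiner (a sketch only; the study does not specify its fusion rule, and the disease labels here are hypothetical):

```python
from collections import Counter

def majority_vote(rf_pred, knn_pred, svm_pred):
    """Hard-voting ensemble over per-sample predictions from the three
    symptom classifiers the study uses (Random Forest, K-Nearest
    Neighbor, Support Vector Machine): each sample receives the label
    predicted by the majority of the three."""
    fused = []
    for votes in zip(rf_pred, knn_pred, svm_pred):
        fused.append(Counter(votes).most_common(1)[0][0])
    return fused
```

Combining classifiers this way tends to improve accuracy when the individual models make uncorrelated errors, which is consistent with the accuracy gains from ensemble learning reported above.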

Keywords: K-nearest neighbor, mask R-CNN, random forest, support vector machine

Procedia PDF Downloads 125
18047 Design and Implementation a Virtualization Platform for Providing Smart Tourism Services

Authors: Nam Don Kim, Jungho Moon, Tae Yun Chung

Abstract:

This paper proposes an Internet of Things (IoT) based virtualization platform for providing smart tourism services. The virtualization platform provides a consistent access interface to various types of data by naming IoT devices and legacy information systems as pathnames in a virtual file system. In other words, the IoT virtualization platform functions as middleware that uses metadata to describe the underlying collected data. The proposed platform makes it easy to provide customized tourism information by using tourist locations collected by IoT devices, and additionally enables the creation of new interactive smart tourism services focused on those locations. The platform is very efficient: the provided tourism services are isolated from changes in the raw data, and services can be modified or expanded without changing the underlying data structure.
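The pathname-based virtualization idea can be sketched as a minimal in-memory virtual file system (illustrative only; the class, method, and path names are assumptions, not the platform's actual API):

```python
class IoTVirtualFS:
    """Minimal sketch of the paper's idea: expose heterogeneous IoT
    devices and legacy systems behind a uniform pathname interface,
    with metadata describing each underlying source."""

    def __init__(self):
        self._nodes = {}

    def mount(self, path, reader, metadata=None):
        """Register a data source (a callable) under a virtual pathname."""
        self._nodes[path] = (reader, metadata or {})

    def read(self, path):
        """Fetch current data through the uniform interface."""
        reader, _ = self._nodes[path]
        return reader()

    def stat(self, path):
        """Return the metadata describing the underlying source."""
        return self._nodes[path][1]

vfs = IoTVirtualFS()
vfs.mount("/tourism/seoul/gps-001",
          lambda: {"lat": 37.57, "lon": 126.98},
          {"type": "gps", "owner": "city"})
```

Because services address sources only by pathname, a device can be replaced or its raw format changed by re-mounting the path, without modifying the services themselves, which is the isolation property the abstract claims.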

Keywords: internet of things (IoT), IoT platform, service platform, virtual file system (VFS)

Procedia PDF Downloads 488
18046 Meta-analysis of Technology Acceptance for Mobile and Digital Libraries in Academic Settings

Authors: Nosheen Fatima Warraich

Abstract:

One of the most often used models in information system (IS) research is the Technology Acceptance Model (TAM). This meta-analysis aims to measure the relationship between the TAM variables, Perceived Ease of Use (PEOU) and Perceived Usefulness (PU), and users’ attitudes and behavioral intention (BI) in the context of mobile and digital libraries. It also examines the relationship of external variables (information quality and system quality) with the TAM variables (PEOU and PU) in digital library settings. The meta-analysis was performed following the PRISMA-P guidelines. Four databases (Google Scholar, Web of Science, Scopus, and LISTA) were searched according to defined criteria. The findings revealed a large effect size for PU and PEOU with BI, and likewise a large effect size for PU and PEOU with attitude. A medium effect size was found for SysQ -> PU, InfoQ -> PU, and SysQ -> PEOU, while the effect size between InfoQ and PEOU was small. The study fills a gap in the literature and confirms that TAM is a valid model for the acceptance and use of technology in the mobile and digital library context. Its findings would thus be helpful for developers and designers of mobile library apps, and beneficial for library authorities and system librarians designing and developing digital libraries in academic settings.
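Effect-size summaries of this kind are conventionally pooled on Fisher-z transformed correlations; a minimal fixed-effect sketch follows (an illustration of the standard technique, not the paper's exact procedure, and the input pairs are made up):

```python
import math

def pooled_correlation(studies):
    """Fixed-effect meta-analytic pooling of correlation coefficients
    via Fisher's z transform. `studies` is a list of (r, n) pairs;
    each study is weighted by n - 3. Returns the back-transformed
    pooled correlation."""
    num = den = 0.0
    for r, n in studies:
        z = 0.5 * math.log((1 + r) / (1 - r))   # Fisher z transform
        w = n - 3                               # inverse-variance weight
        num += w * z
        den += w
    zbar = num / den
    return (math.exp(2 * zbar) - 1) / (math.exp(2 * zbar) + 1)
```

The pooled r always lies between the smallest and largest study-level correlations, and reproducing a common r exactly when all studies agree is a quick sanity check on the transform.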

Keywords: technology acceptance model (TAM), perceived ease of use, perceived usefulness, information quality, system quality, meta-analysis, systematic review, digital libraries, mobile library apps

Procedia PDF Downloads 55
18045 Localization of Geospatial Events and Hoax Prediction in the UFO Database

Authors: Harish Krishnamurthy, Anna Lafontant, Ren Yi

Abstract:

Unidentified Flying Objects (UFOs) have long been an interesting topic for enthusiasts, and people all over the United States report such sightings online at the National UFO Reporting Center (NUFORC). Some of these reports are hoaxes. Among those that seem legitimate, our task is not to establish that the events indeed involve flying objects from outer space; rather, we intend to identify whether a report was a hoax, as labeled by the UFO database team with their existing curation criteria. The database also provides a wealth of information that can be exploited for various analyses and insights, such as social reporting and identifying real-time spatial events. We perform analysis to localize these time-series geospatial events and correlate them with known real-time events. This paper does not confirm any legitimacy of alien activity, but rather attempts to gather information from likely legitimate reports of UFOs by studying the online reports. These events occur in geospatial clusters and are also time-based. We use cluster density and data visualization to search the space of possible cluster realizations and decide on the most probable clusters, which provide information about the proximity of such activity. A random forest classifier is also presented that identifies true events and hoax events using the best available features, such as region, week, time period, and duration. Lastly, we show the performance of the scheme on various days and correlate with real-time events, where one of the UFO reports strongly correlates with a missile test conducted in the United States.
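The feature set the classifier consumes (region, week, time period, duration) can be encoded along these lines (a sketch; the field names are illustrative, not the NUFORC schema, and the classifier itself is omitted):

```python
from datetime import datetime

def encode_report(report, region_index):
    """Turn one UFO report into a numeric feature vector for the
    random forest classifier: region (integer code), ISO week of
    year, coarse time-of-day period, and duration in seconds."""
    ts = datetime.fromisoformat(report["timestamp"])
    return [
        region_index.get(report["region"], -1),  # -1 for unseen regions
        ts.isocalendar()[1],                     # ISO week of year
        ts.hour // 6,                            # 4 coarse time-of-day periods
        report["duration_s"],
    ]
```

Vectors like these could be fed to any off-the-shelf random forest implementation, with the database team's hoax labels as the training target.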

Keywords: time-series clustering, feature extraction, hoax prediction, geospatial events

Procedia PDF Downloads 361