Search results for: inverse laplace transform techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8251

5131 Political Manipulation in Global Discourse

Authors: Gohar Madoyan, Kristine Harutyunyan, Gevorg Barseghyan

Abstract:

It is common knowledge that linguistic manipulation is, and has always been, a powerful instrument of political discourse. Politicians from different countries and through the centuries have successfully used linguistic means to persuade the public. Yet, this persuasion should be linguistically unobtrusive: small changes in wording may result in a huge difference in perception by the audience. Thus, manipulation is a strategy used by manipulators to convey a certain message; they must be aware of the vulnerabilities of their audience and exploit them to achieve control. Political manipulation, though commonly observed in the 21st century, can easily be traced back to ancient rhetoric, which warns us to choose words carefully while addressing the audience. Modern manipulative techniques, on the other hand, have become more sophisticated, making use of all scientific advances.

Keywords: manipulators, politics, persuasion, political discourse, linguo-stylistic analysis, rhetoric

Procedia PDF Downloads 62
5130 Synthesis of ZnFe₂O₄-AC/CeMOF for Improved Photodegradation of Textile Dyes under Visible Light: Optimization and Statistical Study

Authors: Esraa Mohamed El-Fawal

Abstract:

A facile solvothermal procedure was applied to fabricate zinc ferrite nanoparticles (ZnFe₂O₄ NPs). Activated carbon (AC) derived from peanut shells was synthesized in a microwave through the chemical activation method. The ZnFe₂O₄-AC composite was then combined with a cerium-based metal-organic framework (CeMOF) by solid-state mixing to formulate the ZnFe₂O₄-AC/CeMOF composite. The synthesized photomaterials were characterized by scanning/transmission electron microscopy (SEM/TEM), photoluminescence (PL), X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR), and ultraviolet-visible/diffuse reflectance spectroscopy (UV-Vis/DRS). The prepared ZnFe₂O₄-AC/CeMOF photomaterial shows significantly boosted efficiency for the photodegradation of methyl orange and methylene blue (MO/MB) compared with pristine ZnFe₂O₄ and the ZnFe₂O₄-AC composite under visible-light irradiation. The favorable ZnFe₂O₄-AC/CeMOF photocatalyst displays the highest photocatalytic degradation efficiency of MB/MO (R: 91.5 and 88.6%, respectively) compared with the other as-prepared materials after 30 min of visible-light irradiation. The apparent reaction rate constants (K: 1.94 and 1.31 min⁻¹) were also calculated. The boosted photocatalytic proficiency is ascribed to the heterojunction at the interface of the prepared photomaterial, which assists the separation of the charge carriers. To reach the optimum, statistical analysis using response surface methodology was applied: the effect of the independent parameters A (pH), B (irradiation time), and C (initial pollutant concentration) on the response function (% photodegradation of the MB/MO dyes, as examples of azo dyes) was investigated using a central composite design. At the optimum conditions, the photodegradation efficiencies of MB and MO are 99.8 and 97.8%, respectively. The ZnFe₂O₄-AC/CeMOF hybrid reveals good stability over four consecutive cycles.
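
To make the response-surface step concrete, here is a minimal Python sketch of fitting a second-order model and locating its optimum; it assumes two coded factors and invented placeholder data, not the study's actual three-factor design or measurements.

```python
import numpy as np

# Hypothetical illustration: fit a second-order response surface for
# %degradation as a function of pH (A) and irradiation time (B),
# as in a central composite design. Data below are invented placeholders.
A = np.array([3, 3, 9, 9, 2, 10, 6, 6, 6])         # pH levels
B = np.array([10, 50, 10, 50, 30, 30, 2, 58, 30])  # irradiation time (min)
y = np.array([55, 78, 60, 97, 52, 80, 40, 95, 90]) # %degradation (assumed)

# Design matrix of the quadratic model: b0 + b1*A + b2*B + b12*A*B + b11*A^2 + b22*B^2
X = np.column_stack([np.ones_like(A), A, B, A * B, A**2, B**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Locate the predicted optimum on a grid over the experimental region
aa, bb = np.meshgrid(np.linspace(2, 10, 81), np.linspace(2, 58, 113))
pred = (coef[0] + coef[1]*aa + coef[2]*bb + coef[3]*aa*bb
        + coef[4]*aa**2 + coef[5]*bb**2)
i = np.unravel_index(pred.argmax(), pred.shape)
print(f"Predicted optimum: pH={aa[i]:.1f}, time={bb[i]:.0f} min, "
      f"degradation={pred[i]:.1f}%")
```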

Keywords: azo-dyes, photo-catalysis, zinc ferrite, response surface methodology

Procedia PDF Downloads 148
5129 JavaScript Object Notation Data against eXtensible Markup Language Data in Software Applications: A Software Testing Approach

Authors: Theertha Chandroth

Abstract:

This paper presents a comparative study on how to check JSON (JavaScript Object Notation) data against XML (eXtensible Markup Language) data from a software testing point of view. JSON and XML are widely used data interchange formats, each with its unique syntax and structure. The objective is to explore various techniques and methodologies for validating, comparing, and integrating JSON data with XML data and vice versa. By understanding the process of checking JSON data against XML data, testers, developers, and data practitioners can ensure accurate data representation, seamless data interchange, and effective data validation.
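
As an illustration of one such validation technique, the following sketch (Python standard library only; the sample payloads are invented) normalizes an XML document into a plain dict and asserts equivalence with the corresponding JSON document.

```python
import json
import xml.etree.ElementTree as ET

def xml_to_dict(elem):
    """Recursively convert an XML element into a plain dict for comparison.
    Leaves become text values; repeated child tags become lists."""
    children = list(elem)
    if not children:
        return elem.text
    result = {}
    for child in children:
        value = xml_to_dict(child)
        if child.tag in result:                  # repeated tag -> list
            if not isinstance(result[child.tag], list):
                result[child.tag] = [result[child.tag]]
            result[child.tag].append(value)
        else:
            result[child.tag] = value
    return result

json_doc = '{"user": {"name": "Ada", "role": "tester"}}'
xml_doc = "<root><user><name>Ada</name><role>tester</role></user></root>"

json_data = json.loads(json_doc)
xml_data = xml_to_dict(ET.fromstring(xml_doc))

assert json_data == xml_data, f"Mismatch: {json_data} != {xml_data}"
print("JSON and XML payloads carry equivalent data")
```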

Keywords: XML, JSON, data comparison, integration testing, Python, SQL

Procedia PDF Downloads 120
5128 Cold Spray Fabrication of Coating for Highly Corrosive Environment

Authors: Harminder Singh

Abstract:

Cold spray is a novel and emerging technology for the fabrication of coatings. In this study, a coating is successfully developed by this process on a superalloy surface. The selected coating composition has already been proven corrosion resistant. The microstructure of the newly developed coating is examined by various characterization techniques to test its suitability for the high-temperature corrosive conditions of waste incinerators. Energy-producing waste incinerators still run at low efficiency, mainly because of their chlorine-based, highly corrosive conditions. The characterization results show that the structure of the developed cold-sprayed coating is suitable for further testing in highly aggressive conditions.

Keywords: coating, cold spray, corrosion, microstructure

Procedia PDF Downloads 380
5127 Long-Term Exposure, Health Risk, and Loss of Quality-Adjusted Life Expectancy Assessments for Vinyl Chloride Monomer Workers

Authors: Tzu-Ting Hu, Jung-Der Wang, Ming-Yeng Lin, Jin-Luh Chen, Perng-Jy Tsai

Abstract:

Vinyl chloride monomer (VCM) has been classified as a group 1 (human) carcinogen by the IARC. Exposure to VCM is known to be associated with the development of liver cancer in workers and hence can cause economic and health losses; workers in the petrochemical industry in particular have been a serious concern in the environmental and occupational health field. Because assessing workers' health risks and the resultant economic and health losses requires long-term VCM exposure data for any similar exposure group (SEG) of interest, the development of suitable technologies has become an urgent and important issue. In the present study, VCM exposures for petrochemical industry workers were first determined based on the database of the 'Workplace Environmental Monitoring Information Systems (WEMIS)' provided by Taiwan OSHA. Considering the existence of missing data, historical exposure reconstruction techniques were then used to complete the long-term exposure data for SEGs with routine operations. For SEGs with non-routine operations, exposure modeling techniques, together with their time/activity records, were adopted to determine their long-term exposure concentrations. Bayesian decision analysis (BDA) was adopted for conducting exposure and health risk assessments for any given SEG in the petrochemical industry. The resultant excess cancer risk was then used to determine the corresponding loss of quality-adjusted life expectancy (QALE). Results show low average concentrations for SEGs with routine operations (e.g., VCM rectification 0.0973 ppm, polymerization 0.306 ppm, reaction tank 0.33 ppm, VCM recovery 1.4 ppm, control room 0.14 ppm, VCM storage tanks 0.095 ppm, and wastewater treatment 0.390 ppm), values much lower than the permissible exposure limit (PEL; 3 ppm) of VCM promulgated in Taiwan. For non-routine workers, although their exposure concentrations were high, their short exposure times and low frequencies resulted in low health risks. By considering the exposure, health risk, and QALE assessment results simultaneously, it is concluded that the proposed method is useful for prioritizing SEGs for exposure abatement measures. In particular, the obtained QALE results further indicate the importance of reducing workers' VCM exposures, even though those exposures were low in comparison with the PEL and the acceptable health risk.

Keywords: exposure assessment, health risk assessment, petrochemical industry, quality-adjusted life years, vinyl chloride monomer

Procedia PDF Downloads 176
5126 Phytobeds with Fimbristylis dichotoma and Ammannia baccifera for Treatment of Real Textile Effluent: An in situ Treatment, Anatomical Studies and Toxicity Evaluation

Authors: Suhas Kadam, Vishal Chandanshive, Niraj Rane, Sanjay Govindwar

Abstract:

Fimbristylis dichotoma, Ammannia baccifera, and their co-plantation consortium FA were found to degrade methyl orange, a simulated dye mixture, and real textile effluent. Wild plants of Fimbristylis dichotoma and Ammannia baccifera with equal biomass showed 91 and 89% decolorization of methyl orange within 60 h at a concentration of 50 ppm, while 95% dye removal was achieved by consortium FA within 48 h. Floating phyto-beds with co-plantation (Fimbristylis dichotoma and Ammannia baccifera) for the treatment of real textile effluent in a constructed wetland were more efficient, achieving 79, 72, 77, 66, and 56% reductions in the ADMI color value, chemical oxygen demand, biological oxygen demand, total dissolved solids, and total suspended solids of the textile effluent, respectively. High-performance thin-layer chromatography, gas chromatography-mass spectrometry, Fourier transform infrared spectroscopy, ultraviolet-visible spectroscopy, and enzymatic assays confirmed the phytotransformation of the parent dye into new metabolites. T-RFLP analysis of the rhizospheric bacteria of Fimbristylis dichotoma, Ammannia baccifera, and consortium FA revealed the presence of 88, 98, and 223 genera, which could have been involved in dye removal. Toxicity evaluation of the products formed after phytotransformation of methyl orange by consortium FA on the bivalve Lamellidens marginalis revealed less damage to the gill architecture when analyzed histologically. Toxicity measurement by the Random Amplification of Polymorphic DNA (RAPD) technique revealed a normal banding pattern in the treated methyl orange sample, suggesting the less toxic nature of the phytotransformed dye products.

Keywords: constructed wetland, phyto-bed, textile effluent, phytoremediation

Procedia PDF Downloads 469
5125 Towards Creative Movie Title Generation Using Deep Neural Models

Authors: Simon Espigolé, Igor Shalyminov, Helen Hastie

Abstract:

Deep machine learning techniques, including deep neural networks (DNN), have been used to model language and dialogue for conversational agents to perform tasks such as giving technical support, and also for general chit-chat. They have been shown to be capable of generating long, diverse, and coherent sentences in end-to-end dialogue systems and natural language generation. However, these systems tend to imitate the training data and will only generate the concepts and language within the scope of what they have been trained on. This work explores how deep neural networks can be used in a task that would normally require human creativity, whereby a human would read the movie description and/or watch the movie and come up with a compelling, interesting movie title. This task differs from simple summarization in that the movie title may not necessarily be derivable from the content or semantics of the movie description. Here, we train a type of DNN called a sequence-to-sequence model (seq2seq) that takes as input a short textual movie description and some information on, e.g., the genre of the movie, and learns to output a movie title. The idea is that the DNN will learn certain techniques and approaches that the human movie titler may deploy but that may not be immediately obvious to the human eye. To give an example of a generated movie title, for the movie synopsis 'A hitman concludes his legacy with one more job, only to discover he may be the one getting hit.', the original, true title is 'The Driver' and the one generated by the model is 'The Masquerade'. A human evaluation was conducted in which the DNN output was compared to the true human-generated title, as well as a number of baselines, on three 5-point Likert scales: 'creativity', 'naturalness', and 'suitability'. Subjects were also asked which of the two systems they preferred. The scores of the DNN model were comparable to the scores of the human-generated movie title, with means m=3.11 and m=3.12, respectively. There is room for improvement in these models, as they were rated significantly less 'natural' and 'suitable' when compared to the human title. In addition, the human-generated title was preferred overall 58% of the time when pitted against the DNN model. These results, however, are encouraging given the comparison with a highly-considered, well-crafted human-generated movie title. Movie titles go through a rigorous process of assessment by experts and focus groups, who have watched the movie. This process is in place due to the large amount of money at stake and the importance of creating an effective title that captures the audience's attention. Our work shows progress towards automating this process, which in turn may lead to a better understanding of creativity itself.
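
For readers unfamiliar with the architecture, the following is a minimal PyTorch sketch of a GRU-based seq2seq model of the kind described; vocabulary size, dimensions, and tensors are toy placeholders, not the trained system.

```python
import torch
import torch.nn as nn

# Minimal seq2seq sketch: the encoder reads a tokenized movie synopsis,
# the decoder emits a title conditioned on the encoder's final state.
class Seq2Seq(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        _, h = self.encoder(self.embed(src_ids))           # summarize synopsis
        dec_out, _ = self.decoder(self.embed(tgt_ids), h)  # condition title on it
        return self.out(dec_out)                           # logits over vocabulary

vocab_size = 1000
model = Seq2Seq(vocab_size)
synopsis = torch.randint(0, vocab_size, (2, 30))  # batch of 2, 30 tokens each
title_in = torch.randint(0, vocab_size, (2, 5))   # teacher-forced title prefix
logits = model(synopsis, title_in)
print(logits.shape)  # torch.Size([2, 5, 1000])
```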

Keywords: creativity, deep machine learning, natural language generation, movies

Procedia PDF Downloads 311
5124 The Relationship Between Military Expenditure and International Trade: A Selection of African Countries

Authors: Andre C Jordaan

Abstract:

The end of the Cold War and of superpower rivalry has changed the nature of military build-up in many countries. Calls from international institutions like the United Nations, the International Monetary Fund, and the World Bank to reduce levels of military expenditure were the order of the day. However, this bid to cut military expenditure has not been straightforward. Active armed conflicts occurred in at least 46 states in 2021, with 8 in the Americas, 9 in Asia and Oceania, 3 in Europe, 8 in the Middle East and North Africa, and 18 in sub-Saharan Africa. Global military expenditure in 2022 was estimated at US$2.2 trillion, representing 2.2 per cent of global gross domestic product. Particularly sharp rises in military spending have followed in African countries and the Middle East. Global military expenditure currently follows two divergent trends: a declining trend in the West, caused mainly by austerity, efforts to control budget deficits, and the wrapping up of prolonged wars; and an increasing trend in other parts of the world, on the back of security concerns, geopolitical ambitions, and internal political factors. Conflict-related fatalities in sub-Saharan Africa alone increased by 19 per cent between 2020 and 2021. The interaction between military expenditure (read: conflict) and international trade is the cause of much debate. Some argue that countries' fear of losing trade opportunities causes political decision makers to refrain from engaging in conflict when important trading partners are involved. However, three main arguments are always present when discussing the relationship between military expenditure or conflict and international trade: free trade could promote peaceful cooperation, it could trigger tension between trading blocs and partners, or it could have no effect because conflict is based on issues that are more important. Military expenditure remains an important element of overall government expenditure in many African countries. On the other hand, numerous researchers perceive increased international trade to be one of the main factors promoting economic growth in these countries. The purpose of this paper is therefore to determine what effect, if any, exists between the level of military expenditure and international trade within a selection of 19 African countries. Applying an augmented gravity model to explore the relationship between military expenditure and international trade, evidence is found confirming an inverse relationship between these two variables. The results are in line with the Liberal school of thought, in which trade is seen as an instrument of conflict prevention: trade is perceived as a symptom of peace, not a cause thereof. In general, conflict or rumors of conflict tend to reduce trade. If conflict did not impede trade, economic agents would be indifferent to risk. Many claim that trade brings peace; however, it seems that it is rather peace that brings trade. From the results, it appears that trade reduces the risk of conflict and that conflict reduces trade.
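
To illustrate the estimation idea, here is a hedged sketch of an augmented gravity specification on synthetic data, with an inverse military-expenditure effect built into the data-generating process; it is not the paper's model or data.

```python
import numpy as np
import statsmodels.api as sm

# Sketch of an augmented gravity equation: log bilateral trade regressed
# on log GDPs, log distance, and military expenditure (% of GDP).
# All data below are synthetic placeholders.
rng = np.random.default_rng(0)
n = 200
log_gdp_i = rng.normal(10, 1, n)
log_gdp_j = rng.normal(10, 1, n)
log_dist = rng.normal(7, 0.5, n)
milex = rng.uniform(0.5, 6.0, n)  # military expenditure, % of GDP

# Data-generating process with a negative milex coefficient, mirroring
# the inverse relationship the paper reports.
log_trade = (1 + 0.8*log_gdp_i + 0.8*log_gdp_j - 1.1*log_dist
             - 0.15*milex + rng.normal(0, 0.3, n))

X = sm.add_constant(np.column_stack([log_gdp_i, log_gdp_j, log_dist, milex]))
result = sm.OLS(log_trade, X).fit()
print(result.params)  # the last coefficient estimates the milex effect
```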

Keywords: African countries, conflict, international trade, military expenditure

Procedia PDF Downloads 53
5123 Double Clustering as an Unsupervised Approach for Order Picking of Distributed Warehouses

Authors: Hsin-Yi Huang, Ming-Sheng Liu, Jiun-Yan Shiau

Abstract:

Planning warehouse order-picking lists so that the logistics costs weighing on operational performance are kept low is a significant challenge, and in the e-commerce era this task is especially important because the costs of unproductive processes are high. Nowadays, many order-planning techniques employ supervised machine learning algorithms. However, defining which features should be processed by such algorithms is not a simple task, and it is crucial to the proposed technique's success. Against this background, we consider whether unsupervised algorithms can enhance the planning of order-picking lists. A Zone2 picking approach, which is based on using clustering algorithms twice, is developed. A simplified example is given to demonstrate the merit of our approach.
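
A minimal sketch of the clustering-twice idea, using scikit-learn KMeans on invented data (the paper's Zone2 procedure may differ in detail): first group SKUs into storage zones by location, then group orders into picking batches by their zone-affinity profiles.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
sku_xy = rng.uniform(0, 100, size=(60, 2))          # SKU coordinates (toy)
zones = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(sku_xy)

orders = [rng.choice(60, size=rng.integers(2, 8), replace=False)
          for _ in range(30)]                        # SKU ids per order (toy)

# Each order becomes a vector of how many of its lines fall in each zone.
profiles = np.zeros((len(orders), 4))
for i, order in enumerate(orders):
    for sku in order:
        profiles[i, zones[sku]] += 1

# Second clustering: orders with similar zone profiles form one batch,
# so a picker's tour stays within few zones.
batches = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(profiles)
print("orders per batch:", np.bincount(batches))
```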

Keywords: order picking, warehouse, clustering, unsupervised learning

Procedia PDF Downloads 144
5122 Integration of Magnetoresistance Sensor in Microfluidic Chip for Magnetic Particles Detection

Authors: Chao-Ming Su, Pei-Sheng Wu, Yu-Chi Kuo, Yin-Chou Huang, Tan-Yueh Chen, Jefunnie Matahum, Tzong-Rong Ger

Abstract:

Magnetic particles (MPs) have been applied in the biomedical field for many years. This mediator offers many advantages, including high biocompatibility and diverse bio-applications. However, current techniques for evaluating the quantity of magnetically labeled samples in assays are rare. In this paper, a Wheatstone bridge giant magnetoresistance (GMR) sensor integrated with a homemade detection system was fabricated and used to quantify the concentration of MPs. The homemade detection system showed a high detection sensitivity of 10 μg/μl of MPs under the optimized parameters of a 100 G vertical magnetic field, a 2 G horizontal magnetic field, and a 0.4 ml/min flow rate.
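
For intuition, a toy calculation of how a Wheatstone bridge turns a GMR resistance change into a measurable differential voltage; the values are illustrative assumptions, not the sensor's actual parameters.

```python
# One active GMR arm (R + dR) against three fixed arms of value R.
V_in = 5.0          # bridge excitation voltage (V), assumed
R = 1000.0          # nominal resistance of each bridge arm (ohm), assumed
dR = 20.0           # resistance change of the GMR arm near MPs (ohm), assumed

V_out = V_in * ((R + dR) / (2 * R + dR) - R / (2 * R))
print(f"Bridge output: {V_out * 1e3:.2f} mV")  # grows with MP concentration
```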

Keywords: magnetic particles, magnetoresistive sensors, microfluidics, biosensor

Procedia PDF Downloads 386
5121 Urdu Text Extraction Method from Images

Authors: Samabia Tehsin, Sumaira Kausar

Abstract:

Due to the vast increase in multimedia data in recent years, efficient and robust retrieval techniques are needed to retrieve and index images and videos. Text embedded in images can serve as a strong retrieval tool for them; this is the reason text extraction is an area of research receiving increasing attention. English text extraction has been the focus of many researchers, but much less work has been done on other languages like Urdu. This paper focuses on Urdu text extraction from video frames. It presents a text detection feature set with the ability to deal with most of the problems connected with the text extraction process. To test the validity of the method, it is evaluated on an Urdu news dataset, which gives promising results.

Keywords: caption text, content-based image retrieval, document analysis, text extraction

Procedia PDF Downloads 492
5120 Phytochemical Composition and Characterization of Bioactive Compounds of the Green Seaweed Ulva lactuca: A Phytotherapeutic Approach

Authors: Mariame Taibi, Marouane Aouiji, Rachid Bengueddour

Abstract:

The Moroccan coastline is particularly rich in algae and constitutes a reserve of species with considerable economic, social, and ecological potential. This work focuses on the research and characterization of bioactive compounds from algae that can be used in pharmacology or phytopathology. The biochemical composition of the green alga Ulva lactuca (Ulvophyceae) was studied by determining the content of moisture, ash, phenols, flavonoids, total tannins, and chlorophyll. Seven solvents (distilled water, methanol, ethyl acetate, chloroform, benzene, petroleum ether, and hexane) were tested for their effectiveness in recovering chemical compounds. The functional groups and the bioactive chemical compounds were identified by FT-IR and GC-MS. The moisture content of the alga was 77%, while the ash content was 15%. Phenol content differed from one solvent to another, while chlorophyll a, chlorophyll b, and total chlorophyll were determined at 14%, 9.52%, and 25%, respectively. Carotenoid was present in a considerable amount (8.17%). The experimental results show that methanol is the most effective solvent for recovering bioactive compounds, followed by water. Moreover, the green alga Ulva lactuca is characterized by a high level of total polyphenols (45±3.24 mg GAE/gDM) and average levels of total tannins and flavonoids (22.52±8.23 mg CE/gDM and 15.49±0.064 mg QE/gDM, respectively). The Fourier transform infrared spectroscopy (FT-IR) results confirmed the presence of alcohol/phenol and amide functions in Ulva lactuca. The GC-MS analysis identified precisely the compounds contained in the various extracts, such as phenolic compounds, fatty acids, terpenoids, alcohols, alkanes, hydrocarbons, and steroids. All these results represent only a first step in the search for biologically active natural substances from seaweed; additional tests are envisaged to confirm the bioactivity of the seaweed.

Keywords: algae, Ulva lactuca, phenolic compounds, FTIR, GC-MS

Procedia PDF Downloads 93
5119 Practical Techniques of Improving State Estimator Solution

Authors: Kiamran Radjabli

Abstract:

The State Estimator has become an intrinsic part of Energy Management Systems (EMS). The SCADA measurements received from the field are processed by the State Estimator in order to accurately determine the actual operating state of the power system and provide that information to other real-time network applications. All EMS vendors offer State Estimator functionality in their baseline products. However, setting up a State Estimator and ensuring that it consistently produces a reliable solution often consumes a substantial engineering effort. This paper provides generic recommendations and describes a simple practical approach to efficient tuning of the State Estimator, based on working experience with major EMS software platforms and consulting projects in many electrical utilities in the USA.
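
As background, the computational core being tuned is typically a weighted least squares (WLS) estimator; a minimal linear sketch with invented numbers shows where the measurement weights, the main practical tuning knob, enter.

```python
import numpy as np

# Linear (DC-style) WLS state estimation sketch; values are invented.
H = np.array([[1.0, 0.0],     # measurement Jacobian (4 measurements, 2 states)
              [0.0, 1.0],
              [1.0, -1.0],
              [1.0, 1.0]])
z = np.array([1.02, 0.98, 0.05, 1.99])       # SCADA measurements
sigma = np.array([0.01, 0.01, 0.02, 0.05])   # assumed meter accuracies
W = np.diag(1.0 / sigma**2)                  # weights: the tuning knob

# x_hat = (H^T W H)^-1 H^T W z
G = H.T @ W @ H                              # gain matrix
x_hat = np.linalg.solve(G, H.T @ W @ z)

# Weighted residuals flag suspect measurements (bad data detection).
r = z - H @ x_hat
print("state estimate:", x_hat)
print("weighted residuals:", r / sigma)
```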

Keywords: convergence, monitoring, state estimator, performance, troubleshooting, tuning, power systems

Procedia PDF Downloads 146
5118 Customer Satisfaction on Reliability Dimension of Service Quality in Indian Higher Education

Authors: Rajasekhar Mamilla, G. Janardhana, G. Anjan Babu

Abstract:

The present research analyses students' satisfaction with university performance regarding the reliability dimension, i.e., the ability of professors and staff to perform the promised services with quality for students in the postgraduate courses offered by Sri Venkateswara University in India. The research is conducted with the notion that the student compares perceived performance with prior expectations, and customer satisfaction is seen as the outcome of this comparison. The schedule was administered to sample respondents selected by the stratified random technique. Statistical techniques such as factor analysis, the t-test, and correlation analysis were used to accomplish the respective objectives of the study.

Keywords: satisfaction, reliability, service quality, customer

Procedia PDF Downloads 536
5117 The Study of the Determinants of Impulse Buying in Algeria

Authors: Amina Merabet, Ali Iznasni, Abderrezzak Benhabib

Abstract:

Impulse buying is of strategic importance to distributors. Currently, distribution companies rely heavily on contextual variables (music, smells, colors, sound, design, etc.) in order to push customers towards purchase and consumption. As such, a crucial way for commercial brands to increase sales is to stimulate impulse buying. For this reason, this study aims at identifying the factors that initiate and encourage impulse buying, as well as the levers that help distributors highlight effective marketing techniques to encourage consumers to make impulse purchases. Based on a field survey of 590 buyers, we show the impact of situational elements of both the store and the product on impulse buying.

Keywords: Algerian shoppers, impulse buying, shopping environment, situational variables, product

Procedia PDF Downloads 338
5116 Image Segmentation with Deep Learning of Prostate Cancer Bone Metastases on Computed Tomography

Authors: Joseph M. Rich, Vinay A. Duddalwar, Assad A. Oberai

Abstract:

Prostate adenocarcinoma is the most common cancer in males, with osseous metastases as the commonest site of metastatic prostate carcinoma (mPC). Treatment monitoring is based on the evaluation and characterization of lesions on multiple imaging studies, including Computed Tomography (CT). Monitoring of the osseous disease burden, including follow-up of lesions and identification and characterization of new lesions, is a laborious task for radiologists. Deep learning algorithms are increasingly used to perform tasks such as identification and segmentation of osseous metastatic disease and provide accurate information regarding metastatic burden. Here, nnUNet was used to produce a model which can segment CT scan images of prostate adenocarcinoma vertebral bone metastatic lesions. nnUNet is an open-source Python package that adds optimizations to the deep learning-based UNet architecture but has not been extensively combined with transfer learning techniques due to the absence of a readily available functionality for this method. The IRB-approved study data set includes imaging studies from patients with mPC who were enrolled in clinical trials at the University of Southern California (USC) Health Science Campus and Los Angeles County (LAC)/USC medical center. Manual segmentation of metastatic lesions was completed by an expert radiologist, Dr. Vinay Duddalwar (20+ years in radiology and oncologic imaging), to serve as ground truth for the automated segmentation. Despite nnUNet's success on some medical segmentation tasks, it only produced an average Dice Similarity Coefficient (DSC) of 0.31 on the USC dataset. DSC results fell in a bimodal distribution, with most scores falling either over 0.66 (reasonably accurate) or at 0 (no lesion detected). Applying more aggressive data augmentation techniques dropped the DSC to 0.15, and reducing the number of epochs reduced the DSC to below 0.1. Datasets have been identified for transfer learning, which involves balancing the size and similarity of the dataset. Identified datasets include the Pancreas data from the Medical Segmentation Decathlon, Pelvic Reference Data, and CT volumes with multiple organ segmentations (CT-ORG). Challenges in producing an accurate model from the USC dataset include the small dataset size (115 images), 2D data (nnUNet generally performs better on 3D data), and the limited amount of public data capturing annotated CT images of bone lesions. Optimizations and improvements will be made by applying transfer learning and generative methods, including incorporating generative adversarial networks and diffusion models, in order to augment the dataset. Performance with different libraries, including MONAI and custom architectures in PyTorch, will be compared. In the future, molecular correlations will be tracked alongside radiologic features for the purpose of multimodal composite biomarker identification. Once validated, these models will be incorporated into evaluation workflows to optimize radiologist evaluation. Our work demonstrates the challenges of applying automated image segmentation to small medical datasets and lays a foundation for techniques to improve performance. As machine learning models become increasingly incorporated into the workflow of radiologists, these findings will help improve the speed and accuracy of vertebral metastatic lesion detection.
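
For reference, the DSC metric quoted above is straightforward to compute; a minimal sketch with toy masks:

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice Similarity Coefficient between two binary masks:
    DSC = 2|A intersect B| / (|A| + |B|)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:                # both masks empty: define as perfect match
        return 1.0
    return 2.0 * np.logical_and(pred, truth).sum() / denom

# Toy 2D masks standing in for a lesion segmentation and its ground truth
pred = np.zeros((10, 10)); pred[2:6, 2:6] = 1
truth = np.zeros((10, 10)); truth[3:7, 3:7] = 1
print(f"DSC = {dice_coefficient(pred, truth):.2f}")  # 0.56 for this overlap
```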

Keywords: deep learning, image segmentation, medicine, nnUNet, prostate carcinoma, radiomics

Procedia PDF Downloads 81
5115 Intelligent Driver Safety System Using Fatigue Detection

Authors: Samra Naz, Aneeqa Ahmed, Qurat-ul-ain Mubarak, Irum Nausheen

Abstract:

Driver safety systems protect the driver from accidents by sensing signs of drowsiness. This paper proposes a technique that can detect the signs of drowsiness and make corresponding decisions to alert the driver. In the proposed technique, the driver is continuously monitored by a camera, and eye, head, and mouth movements are observed. If drowsiness signs are detected on the basis of these three movements under the predefined criteria, the driver is declared sleepy and is alerted with the help of alarms. Three robust drowsiness detection techniques are combined to make a system that can prevent accidents.
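
The paper does not specify its exact eye features, but one common way to quantify eye closure from camera landmarks is the eye aspect ratio (EAR); a sketch, assuming six eye landmarks per frame:

```python
import numpy as np

def eye_aspect_ratio(eye):
    """EAR over six eye landmarks p1..p6; it drops toward zero as the
    eye closes. One common eye-closure cue, offered here as an
    illustrative assumption rather than the paper's actual feature."""
    p1, p2, p3, p4, p5, p6 = eye
    vertical = np.linalg.norm(p2 - p6) + np.linalg.norm(p3 - p5)
    horizontal = np.linalg.norm(p1 - p4)
    return vertical / (2.0 * horizontal)

open_eye = np.array([[0, 0], [1, 1], [2, 1], [3, 0], [2, -1], [1, -1]], float)
closed_eye = np.array([[0, 0], [1, .1], [2, .1], [3, 0], [2, -.1], [1, -.1]], float)
print(eye_aspect_ratio(open_eye), eye_aspect_ratio(closed_eye))
# A sustained EAR below a threshold (e.g. 0.2) over consecutive frames
# would count as an eye-closure drowsiness sign.
```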

Keywords: drowsiness, eye closure, fatigue detection, yawn detection

Procedia PDF Downloads 281
5114 The Possibility of Solving a 3x3 Rubik’s Cube under 3 Seconds

Authors: Chung To Kong, Siu Ming Yiu

Abstract:

The Rubik's cube was invented in 1974. Since then, speedcubers all over the world have tried their best to break the world record again and again. The newest record is 3.47 seconds. Many factors affect the timing, including turns per second (tps), the algorithm, finger tricks, and the hardware of the cube. In this paper, the lower bound of the cube-solving time is discussed using convex optimization. An extended analysis of the world records is used to understand how to improve the timing. With an understanding of each part of the solving process, the paper suggests a list of speed improvement techniques. Based on the analysis of the world record, there is a high possibility that the 3-second mark will be broken soon.
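
The basic timing relation behind such an analysis is simple arithmetic; the move count below is an illustrative assumption, not the paper's data.

```python
# Solve time is roughly (move count) / (turns per second).
record = 3.47   # current world record (s), as cited above
moves = 45      # an efficient speedsolve move count (assumed)

print(f"implied turn rate for the record: {moves / record:.1f} tps")
# To break the 3-second mark with the same move count:
print(f"required rate for 3 s: {moves / 3.0:.1f} tps")  # ~15 tps
```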

Keywords: Rubik's Cube, speed, finger trick, optimization

Procedia PDF Downloads 190
5113 Visualization Tool for EEG Signal Segmentation

Authors: Sweeti, Anoop Kant Godiyal, Neha Singh, Sneh Anand, B. K. Panigrahi, Jayasree Santhosh

Abstract:

This work is about developing a tool for visualization and segmentation of electroencephalograph (EEG) signals based on frequency-domain features. Changes in the frequency-domain characteristics are correlated with changes in the mental state of the subject under study. The proposed algorithm provides a way to represent the change in mental states using the different frequency band powers in the form of a segmented EEG signal. Many segmentation algorithms with applications in brain-computer interfaces, epilepsy, and cognition studies have been suggested in the literature and used for data classification, but the proposed method focuses mainly on better presentation of the signal, which makes it a good visualization tool for clinicians. The algorithm performs basic filtering using band-pass and notch filters in the range of 0.1-45 Hz. Advanced filtering is then performed by principal component analysis and a wavelet-transform-based de-noising method. Frequency-domain features are used for segmentation, considering that the spectral power of the different frequency bands describes the mental state of the subject. Two sliding windows are further used for segmentation: one provides the time scale and the other assigns the segmentation rule. The segmented data are displayed second by second, successively, with different color codes, and the segment length can be selected as per the needs of the objective. The proposed algorithm has been tested on an EEG data set obtained from the University of California, San Diego's online data repository. The proposed tool gives a better visualization of the signal in the form of segmented epochs of desired length representing the power spectrum variation in the data. The algorithm is designed in such a way that it takes the data points with respect to the sampling frequency for each time frame, and so it can be improved for real-time visualization with a desired epoch length.
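
A minimal sketch of the band-power segmentation idea, assuming a toy alpha-dominated signal, a basic 0.5-45 Hz band-pass, and 1-second windows; the tool's actual filtering chain and segmentation rules are richer.

```python
import numpy as np
from scipy.signal import welch, butter, filtfilt

fs = 256                                   # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2*np.pi*10*t) + 0.5*np.random.randn(t.size)  # toy alpha signal

b, a = butter(4, [0.5, 45], btype="band", fs=fs)  # basic band-pass filtering
eeg = filtfilt(b, a, eeg)

bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
win = fs                                    # 1-second segmentation window
for start in range(0, eeg.size - win + 1, win):
    f, psd = welch(eeg[start:start + win], fs=fs, nperseg=win)
    powers = {name: psd[(f >= lo) & (f < hi)].sum()
              for name, (lo, hi) in bands.items()}
    label = max(powers, key=powers.get)     # dominant band labels the segment
    print(f"{start/fs:4.1f}-{(start+win)/fs:4.1f} s: {label}")
```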

Keywords: de-noising, multi-channel data, PCA, power spectra, segmentation

Procedia PDF Downloads 380
5112 Mechanical Characterization and CNC Rotary Ultrasonic Grinding of Crystal Glass

Authors: Ricardo Torcato, Helder Morais

Abstract:

The manufacture of crystal glass parts is based on obtaining the rough geometry by blowing and/or injection, generally followed by a set of manual finishing operations using cutting and grinding tools. The forming techniques used do not allow parts with complex shapes to be obtained repeatably, and the finishing operations use intensive specialized labor, resulting in high cycle times and production costs. This work aims to explore the digital manufacture of crystal glass parts by investigating new subtractive techniques for the automated, flexible finishing of these parts. Finishing operations are essential to respond to customer demands in terms of crystal feel and shine. The applicability of different computerized finishing technologies, namely milling and grinding in a CNC machining center with or without ultrasonic assistance, to crystal processing is investigated. Research in the field of grinding hard and brittle materials, despite not being extensive, has increased in recent years, and scientific knowledge about the machinability of crystal glass is still very limited. However, it can be said that the unique properties of glass, such as high hardness and very low toughness, make any glass machining technology a very challenging process. This work measures the performance improvement brought about by the use of ultrasound compared to conventional crystal grinding. This presentation is focused on the mechanical characterization and analysis of the cutting forces in CNC machining of superior crystal glass (Pb ≥ 30%). For the mechanical characterization, the Vickers hardness test provides an estimate of the material hardness (Hv) and of the fracture toughness, based on the cracks that appear in the indentation. The mechanical impulse excitation test estimates the Young's modulus, shear modulus, and Poisson ratio of the material. For the cutting forces, a dynamometer was used to measure the forces in the face grinding process. The tests were designed based on the Taguchi method to correlate the input parameters (feed rate, tool rotation speed, and depth of cut) with the output parameters (surface roughness and cutting forces) and to optimize the process (better roughness using cutting forces that do not compromise the material structure and the tool life) using ANOVA. This study was conducted for conventional grinding and for the ultrasonic grinding process with the same cutting tools. It was possible to determine the optimum cutting parameters for minimum cutting forces and for minimum surface roughness in both grinding processes. Ultrasonic-assisted grinding provides a better surface roughness than conventional grinding.
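
To illustrate the Taguchi-style analysis, a sketch computing smaller-the-better S/N ratios and factor main effects on an invented coded design; these are placeholder numbers, not the paper's measurements.

```python
import numpy as np

design = np.array([  # columns: feed rate, speed, depth of cut (coded 0/1)
    [0, 0, 0],
    [0, 1, 1],
    [1, 0, 1],
    [1, 1, 0],
])
ra = np.array([0.42, 0.35, 0.55, 0.48])  # measured roughness Ra (um), assumed

# Smaller-the-better: S/N = -10 log10(mean(y^2)); one replicate per run here.
sn = -10 * np.log10(ra**2)

for j, name in enumerate(["feed", "speed", "depth"]):
    effect = sn[design[:, j] == 1].mean() - sn[design[:, j] == 0].mean()
    print(f"{name}: main effect on S/N = {effect:+.2f} dB")
# The factor level with the higher mean S/N is preferred for minimum roughness.
```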

Keywords: CNC machining, crystal glass, cutting forces, hardness

Procedia PDF Downloads 139
5111 Characterization of Thin Woven Composites Used in Printed Circuit Boards by Combining Numerical and Experimental Approaches

Authors: Gautier Girard, Marion Martiny, Sebastien Mercier, Mohamad Jrad, Mohamed-Slim Bahi, Laurent Bodin, Francois Lechleiter, David Nevo, Sophie Dareys

Abstract:

Reliability of electronic devices has always been of the highest interest for Aero-MIL and space applications. In any electronic device, the Printed Circuit Board (PCB), providing interconnection between components, is key to reliability. During the last decades, PCB technologies evolved to sustain and/or fulfill increased original equipment manufacturer requirements and specifications: higher densities and better performance, faster time to market and longer lifetime, newer materials and mixed buildups. From the very beginning of the PCB industry until recently, qualification, experiments, and trial and error were the most popular methods to assess system (PCB) reliability. Nowadays, OEMs, PCB manufacturers, and scientists are working together in a close relationship in order to develop predictive models for PCB reliability and lifetime. To achieve that goal, it is fundamental to characterize the base materials precisely (laminates, electrolytic copper, …) in order to understand failure mechanisms and simulate PCB aging under environmental constraints, by means of the finite element method for example. The laminates are woven composites and thus have an orthotropic behaviour. The in-plane properties can be measured by combining classical uniaxial testing and digital image correlation. Nevertheless, the out-of-plane properties cannot be evaluated due to the thickness of the laminate (a few hundred microns). It has to be noted that knowledge of the out-of-plane properties is fundamental to investigating the lifetime of high-density printed circuit boards. A homogenization method combining analytical and numerical approaches has been developed in order to obtain the complete elastic orthotropic behaviour of a woven composite from its precise 3D internal structure and its experimentally measured in-plane elastic properties. Since the mechanical properties of the resin surrounding the fibres are unknown, an inverse method is proposed to estimate them. The methodology has been applied to one laminate used in hyperfrequency spatial applications in order to get its elastic orthotropic behaviour at different temperatures in the range [-55°C; +125°C]. Next, numerical simulations of a plated through hole in a double-sided PCB are performed. Results show the major importance of the out-of-plane properties, and of the temperature dependency of these properties, on the lifetime of a printed circuit board. Acknowledgements—The support of the French ANR agency through the Labcom program ANR-14-LAB7-0003-01, and the support of CNES, Thales Alenia Space, and Cimulec, is acknowledged.

Keywords: homogenization, orthotropic behaviour, printed circuit board, woven composites

Procedia PDF Downloads 187
5110 The Use of Technology in Theatrical Performances as a Tool of Audience's Engagement

Authors: Chrysoula Bousiouta

Abstract:

Throughout the history of theatre, technology has played an important role, both influencing the relationship between performance and audience and offering different kinds of experiences. The use of technology dates back to ancient times, with the introduction of artifacts such as the "deus ex machina" in ancient Greek theatre. Taking into account the key techniques and experiences used throughout history, this paper investigates how technology, through new media, influences contemporary theatre. In the context of this research, technology is defined as projections, audio environments, video projections, sensors, and tele-connections, all alongside the performance, challenging the audience's participation. The theoretical framework of the research covers, besides the history of theatre, the theory of the "experience economy" that took over from the service and goods economy. The research is based on the qualitative and comparative analysis of two case studies, Contact Theatre in Manchester (United Kingdom) and Bios in Athens (Greece). The data collection includes desk research and is complemented with semi-structured interviews. Building on the results of the research, one could claim that the intended experience of modern/contemporary theatre is that of engagement. In this context, technology, as defined above, plays a leading role in creating it. This experience passes through, and exists in the middle of, the realms of entertainment, education, estheticism, and escapism. Furthermore, it is observed that nowadays theatre is not only about acting but also about performing: performances are unfinished without the participation of the audience. Both case studies try to achieve the experience of engagement through practices that promote the attraction of attention, the increase of imagination, interaction, intimacy, and true activity. These practices are achieved through the script, the scenery, the language, and the environment of a performance. Contact and Bios consider technology an intimate tool for accomplishing the above, and they make extended use of it. The research compiles a notable record of the technological techniques that modern theatres use. The use of technology, inside or outside the limits of film techniques, helps to rivet the attention of the audience, to make performances enjoyable, to give the sense of the "unfinished", or to stage things that take place around the spectators and force them to take action, becoming spect-actors. The advantage of technology is that it can be used as a hook for interaction at all stages of a performance. Further research in the field could involve exploring alternative ways of binding technology and theatre or analyzing how the performance is perceived through the use of technological artifacts.

Keywords: experience of engagement, interactive theatre, modern theatre, performance, technology

Procedia PDF Downloads 236
5109 Data Stream Association Rule Mining with Cloud Computing

Authors: B. Suraj Aravind, M. H. M. Krishna Prasad

Abstract:

There exist emerging applications of data streams that require association rule mining, such as network traffic monitoring, web clickstream analysis, sensor data, data from satellites, etc. Data streams typically arrive continuously, at high speed, in huge amounts, and with changing data distributions. This raises new issues that need to be considered when developing association rule mining techniques for stream data. This paper proposes an improved data stream association rule mining algorithm that eliminates resource limitations by using the concept of cloud computing. Its inclusion may lead to additional, as yet unknown problems, which need further research.
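
A single-node sketch of the core step such a miner must perform, frequent-itemset counting over a sliding window; the transactions are toy data and the cloud/parallel layer is out of scope here.

```python
from collections import Counter
from itertools import combinations

stream = [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}, {"b", "c"},
          {"a", "b"}, {"a", "b", "d"}]   # toy transaction stream
WINDOW, MINSUP = 4, 2                    # window size, minimum support count

window, counts = [], Counter()

def itemsets(txn):
    """All 1- and 2-itemsets of a transaction, as sorted tuples."""
    for k in (1, 2):
        yield from combinations(sorted(txn), k)

for txn in stream:
    window.append(txn)
    counts.update(itemsets(txn))
    if len(window) > WINDOW:             # expire the oldest transaction
        counts.subtract(itemsets(window.pop(0)))
    frequent = {s: c for s, c in counts.items() if c >= MINSUP}
    print(f"after {txn}: {frequent}")
```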

Keywords: data stream, association rule mining, cloud computing, frequent itemsets

Procedia PDF Downloads 486
5108 The Revised Completion of Student Internship Report by Goal Mapping

Authors: Faizah Herman

Abstract:

This study aims to explore the attitudes and behavior of goal mapping performed by students who completed their internship report revisions on time. The approach is phenomenological research with qualitative methods. Data sources include observation, interviews, questionnaires, and focus group discussions. The research subjects were five students who completed their internship report revisions in a timely manner. The analysis technique is the Miles & Huberman interactive model of data analysis. The results showed that the students have a goal map that includes the ultimate goal and that they formulate goals by identifying what needs to be done, what actions to take, and what kind of support is needed from the environment.

Keywords: goal mapping, revision internship report, students, Brawijaya

Procedia PDF Downloads 380
5107 Preparation of Fe, Cr Codoped TiO2 Nanostructure for Phenol Removal from Wastewaters

Authors: N. Nowzari-Dalini, S. Sabbaghi

Abstract:

Phenol is a hazardous material found in many industrial wastewaters. Photocatalytic degradation, and catalyst doping in particular, are promising techniques for effective phenol removal and have been studied comprehensively in this decade. In this study, Fe, Cr codoped TiO₂ was prepared by the sol-gel method, and its photocatalytic activity was investigated through the degradation of phenol under visible light. The catalyst was characterized by XRD, SEM, FT-IR, BET, and EDX. The results showed that the nanoparticles possess the anatase phase and that the average particle size was about 21 nm. The photocatalyst also has a significant surface area. The effects of experimental parameters such as pH, irradiation time, pollutant concentration, and catalyst concentration were investigated using Design-Expert® software. 98% phenol degradation was achieved after 6 h of irradiation.

Keywords: doping, metals, sol-gel, titanium dioxide, wastewater

Procedia PDF Downloads 314
5106 Purchasing Decision-Making in Supply Chain Management: A Bibliometric Analysis

Authors: Ahlem Dhahri, Waleed Omri, Audrey Becuwe, Abdelwahed Omri

Abstract:

In industrial processes, decision-making ranges across different scales, from process control to supply chain management. The purchasing decision-making process in the supply chain is presently gaining more attention as a critical contributor to a company's strategic success. Given the scarcity of thorough summaries in prior studies, this bibliometric analysis aims to adopt a meticulous approach to achieve quantitative knowledge on the constantly evolving subject of purchasing decision-making in supply chain management. Through bibliometric analysis, we examine a sample of 358 peer-reviewed articles from the Scopus database. VOSviewer and Gephi software were employed to analyze, combine, and visualize the data. Data analytic techniques, including citation networks, PageRank analysis, co-citation, and publication trends, have been used to identify influential works and outline the discipline's intellectual structure. The outcomes of this descriptive analysis highlight the most prominent articles, authors, journals, and countries based on their citations and publications. The findings illustrate an increase in the number of publications, exhibiting a slightly growing trend in this field. Co-citation analysis, coupled with content analysis of the most cited articles, identified five research themes: integrating sustainability into the supplier selection process; supplier selection under disruption risks, with assessment and mitigation strategies; fuzzy MCDM approaches for supplier evaluation and selection; purchasing decisions in vendor problems; and decision-making techniques in supplier selection and order lot sizing problems. With the help of a graphic timeline, this exhaustive map of the field gives a visual representation of the evolution of publications, demonstrating a gradual shift from research interest in vendor selection problems to integrating sustainability into the supplier selection process. These clusters offer insights into a wide variety of purchasing methods and conceptual frameworks that have emerged; however, they have not been validated empirically. The findings suggest that future research should offer greater depth of practical and empirical analysis to enrich the theories. These outcomes provide a powerful road map for further study in this area.
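
As a small illustration of the network side of such an analysis, here is PageRank over a toy directed citation graph, the role that VOSviewer and Gephi play on the real 358-record Scopus sample; all paper ids are invented.

```python
import networkx as nx

citations = [  # (citing paper, cited paper) -- invented ids
    ("P1", "P3"), ("P2", "P3"), ("P4", "P3"), ("P4", "P2"),
    ("P5", "P1"), ("P5", "P3"), ("P6", "P2"),
]
G = nx.DiGraph(citations)

# PageRank surfaces influential works in the citation network.
ranks = nx.pagerank(G)
for paper, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{paper}: {score:.3f}")

# Co-citation: two papers are co-cited when some third paper cites both.
cocited = [(u, v) for u in G for v in G if u < v
           and any(G.has_edge(w, u) and G.has_edge(w, v) for w in G)]
print("co-cited pairs:", cocited)
```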

Keywords: bibliometric analysis, citation analysis, co-citation, Gephi, network analysis, purchasing, SCM, VOSviewer

Procedia PDF Downloads 69
5105 Parallel Fuzzy Rough Support Vector Machine for Data Classification in Cloud Environment

Authors: Arindam Chaudhuri

Abstract:

Classification of data has been actively used as one of the most effective and efficient means of conveying knowledge and information to users. The primary focus has always been on techniques for extracting useful knowledge from data such that returns are maximized. With the emergence of huge datasets, existing classification techniques often fail to produce desirable results; the challenge lies in analyzing and understanding the characteristics of massive data sets by retrieving useful geometric and statistical patterns. We propose a supervised parallel fuzzy rough support vector machine (PFRSVM) for data classification in a cloud environment. The classification is performed by PFRSVM using a hyperbolic tangent kernel. The fuzzy rough set model takes care of sensitivity to noisy samples and handles impreciseness in training samples, bringing robustness to the results. The membership function is a function of the center and radius of each class in feature space and is represented with a kernel; it plays an important role in sampling the decision surface. The success of PFRSVM is governed by choosing appropriate parameter values. The training samples are either linearly or nonlinearly separable, and different input points make unique contributions to the decision surface. The algorithm is parallelized with a view to reducing training times. The system is built on a support vector machine library using the Hadoop implementation of MapReduce. The algorithm is tested on large data sets to check its feasibility and convergence, and the performance of the classifier is also assessed in terms of the number of support vectors. The challenges encountered in implementing big data classification in machine learning frameworks are also discussed. The experiments were done in the cloud environment available at the University of Technology and Management, India, and the results are illustrated for Gaussian RBF and Bayesian kernels. The effect of variability in prediction and generalization of PFRSVM is examined with respect to values of the parameter C. It effectively resolves outlier effects and imbalance and overlapping class problems, generalizes to unseen data, and relaxes the dependency between features and labels. The average classification accuracy of PFRSVM is better than that of other classifiers for both Gaussian RBF and Bayesian kernels. The experimental results on both synthetic and real data sets clearly demonstrate the superiority of the proposed technique.
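
A hedged single-node sketch of the fuzzy-membership idea using scikit-learn: each training sample gets a membership in [0,1] from its distance to its class center, and that membership downweights noisy points in the SVM fit. The parallel Hadoop/MapReduce layer of PFRSVM, and its rough-set component, are outside this sketch.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X0 = rng.normal(-2, 1, (50, 2)); X1 = rng.normal(2, 1, (50, 2))
X = np.vstack([X0, X1]); y = np.array([0]*50 + [1]*50)

# Classic fuzzy-SVM memberships: farther from the class center -> lower weight.
weights = np.empty(len(y))
for cls in (0, 1):
    pts = X[y == cls]
    d = np.linalg.norm(pts - pts.mean(axis=0), axis=1)  # distance to center
    r = d.max() + 1e-6                                   # class radius
    weights[y == cls] = 1.0 - d / r                      # fuzzy membership

# 'sigmoid' is scikit-learn's hyperbolic tangent kernel, matching the tanh
# kernel mentioned above; C controls the margin/error trade-off.
clf = SVC(kernel="sigmoid", C=1.0, gamma="scale")
clf.fit(X, y, sample_weight=weights)
print("support vectors:", clf.n_support_, "train acc:", clf.score(X, y))
```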

Keywords: FRSVM, Hadoop, MapReduce, PFRSVM

Procedia PDF Downloads 479
5104 Development of Gamma Configuration Stirling Engine Using Polymeric and Metallic Additive Manufacturing for Education

Authors: J. Otegui, M. Agirre, M. A. Cestau, H. Erauskin

Abstract:

The increasing accessibility of mid-priced additive manufacturing (AM) systems offers a chance to incorporate this technology into engineering instruction. Furthermore, AM facilitates the creation of manufacturing designs, enhancing the efficiency of various machines. One example of these machines is the Stirling cycle engine: it encompasses complex thermodynamic machinery, revealing various aspects of mechanical engineering expertise upon closer inspection. In this publication, the application of Stirling engines fabricated via additive manufacturing techniques is showcased for the purpose of instructive design and product enhancement. The performance of a Stirling engine's conventional displacer and piston is contrasted with that of additively manufactured counterparts. The outcomes of utilizing this instructional tool in teaching are demonstrated.

Keywords: 3D printing, additive manufacturing, mechanical design, Stirling engine

Procedia PDF Downloads 32
5103 Comparative Study of Estimators of Population Means in Two Phase Sampling in the Presence of Non-Response

Authors: Syed Ali Taqi, Muhammad Ismail

Abstract:

A comparative study is made of estimators of population means in two-phase sampling in the presence of non-response, when the population means of the auxiliary variable(s) are unknown and information on the study variable y, as well as on the auxiliary variable(s), is incomplete. Three real data sets (university students, hospital, and unemployment) are used to compare all the available two-phase sampling techniques under non-response with the newly generalized ratio estimators.
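
For concreteness, here is a sketch of the classical ratio estimator in two-phase sampling on invented data; the paper's generalized estimators extend this basic form.

```python
import numpy as np

# Two-phase setup: a large first-phase sample measures only the auxiliary
# variable x; a second-phase subsample measures the study variable y too.
rng = np.random.default_rng(2)
x1 = rng.normal(50, 10, 500)                 # first-phase sample (x only)
idx = rng.choice(500, size=80, replace=False)
x2 = x1[idx]                                 # second-phase x
y2 = 2.0 * x2 + rng.normal(0, 5, 80)         # second-phase y, correlated with x

xbar1, xbar2, ybar2 = x1.mean(), x2.mean(), y2.mean()

# Ratio estimator: y_hat = ybar2 * (xbar1 / xbar2); it borrows strength
# from the cheap first-phase information on x.
y_ratio = ybar2 * (xbar1 / xbar2)
print(f"plain mean = {ybar2:.2f}, ratio estimate = {y_ratio:.2f}")
```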

Keywords: two-phase sampling, ratio estimator, product estimator, generalized estimators

Procedia PDF Downloads 226
5102 Investigating the Influence of Activation Functions on Image Classification Accuracy via Deep Convolutional Neural Network

Authors: Gulfam Haider, Sana Danish

Abstract:

Convolutional Neural Networks (CNNs) have emerged as powerful tools for image classification, and the choice of optimizers profoundly affects their performance. The study of optimizers and their adaptations remains a topic of significant importance in machine learning research. While numerous studies have explored and advocated for various optimizers, the efficacy of these optimization techniques is still subject to scrutiny. This work aims to address the challenges surrounding the effectiveness of optimizers by conducting a comprehensive analysis and evaluation. The primary focus of this investigation lies in examining the performance of different optimizers when employed in conjunction with the popular activation function, Rectified Linear Unit (ReLU). By incorporating ReLU, known for its favorable properties in prior research, the aim is to bolster the effectiveness of the optimizers under scrutiny. Specifically, we evaluate the adjustment of these optimizers with both the original Softmax activation function and the modified ReLU activation function, carefully assessing their impact on overall performance. To achieve this, a series of experiments are conducted using a well-established benchmark dataset for image classification tasks, namely the Canadian Institute for Advanced Research dataset (CIFAR-10). The selected optimizers for investigation encompass a range of prominent algorithms, including Adam, Root Mean Squared Propagation (RMSprop), Adaptive Learning Rate Method (Adadelta), Adaptive Gradient Algorithm (Adagrad), and Stochastic Gradient Descent (SGD). The performance analysis encompasses a comprehensive evaluation of the classification accuracy, convergence speed, and robustness of the CNN models trained with each optimizer. Through rigorous experimentation and meticulous assessment, we discern the strengths and weaknesses of the different optimization techniques, providing valuable insights into their suitability for image classification tasks. By conducting this in-depth study, we contribute to the existing body of knowledge surrounding optimizers in CNNs, shedding light on their performance characteristics for image classification. The findings gleaned from this research serve to guide researchers and practitioners in making informed decisions when selecting optimizers and activation functions, thus advancing the state-of-the-art in the field of image classification with convolutional neural networks.
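
A compressed sketch of the experimental loop described, the same small ReLU CNN on CIFAR-10 trained under each optimizer; epochs and architecture are trimmed far below what the study would use, and results from this sketch are not the paper's.

```python
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

def make_cnn():
    # Small ReLU CNN with a softmax output over the 10 CIFAR classes.
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(32, 32, 3)),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

for opt in ["adam", "rmsprop", "adadelta", "adagrad", "sgd"]:
    model = make_cnn()
    model.compile(optimizer=opt, loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=1, batch_size=128, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    print(f"{opt:>8}: test accuracy = {acc:.3f}")
```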

Keywords: deep neural network, optimizers, RMSprop, ReLU, stochastic gradient descent

Procedia PDF Downloads 103