Search results for: cloud theory
4949 The Probability Foundation of Fundamental Theoretical Physics
Authors: Quznetsov Gunn
Abstract:
In the study of the logical foundations of probability theory, it was found that the terms and equations of fundamental theoretical physics represent terms and theorems of classical probability theory, more precisely, of the part of this theory that considers the probability of dot events in 3 + 1 space-time. In particular, the masses, moments, energies, spins, etc., turn out to be parameters of the probability distributions of such events. The terms and equations of the electroweak and quark-gluon theories turn out to be theoretical-probabilistic terms and theorems. Here the relation of a neutrino to its lepton becomes clear, the W and Z boson masses turn out to be dynamic ones, and the cause of the asymmetry between particles and antiparticles is the impossibility of the birth of single antiparticles. In addition, phenomena such as confinement and asymptotic freedom receive a probabilistic explanation. And here we have the logical foundations of the theory of gravity with the phenomena of dark energy and dark matter.
Keywords: classical theory of probability, logical foundation of fundamental theoretical physics, masses, moments, energies, spins
Procedia PDF Downloads 295
4948 Learning to Teach on the Cloud: Preservice EFL Teachers’ Online Project-Based Practicum Experience
Authors: Mei-Hui Liu
Abstract:
This paper reports 20 preservice EFL teachers’ learning-to-teach experience when they were engaged in an online project-based practicum implemented on a cloud platform. This 10-month study fills a gap in the literature by documenting the impact of online project-based instruction on preservice EFL teachers’ professional development. Data analysis showed that the online practicum was regarded as a flexible mechanism offering opportunities for teaching practice without geographical barriers. Additionally, this project-based practice helped the participants integrate the theories they had learned and further showed them how to create a self-directed online learning environment. Furthermore, these preservice teachers, having experienced a technology-enabled practicum, showed motivation to apply technology and online platforms in future instructional practices. Yet, this study uncovered several concerns encountered by the participants during this online field experience. The findings of this study offer meaning and lessons for teacher educators intending to integrate an online practicum into preservice training courses.
Keywords: online teaching practicum, project-based learning, teacher preparation, English language education
Procedia PDF Downloads 371
4947 A Cloud-Based Federated Identity Management in Europe
Authors: Jesus Carretero, Mario Vasile, Guillermo Izquierdo, Javier Garcia-Blas
Abstract:
Currently, there is a so-called ‘identity crisis’ in cybersecurity caused by the substantial security, privacy, and usability shortcomings encountered in existing systems for identity management. Federated Identity Management (FIM) could be a solution to this crisis, as it is a method that facilitates the management of identity processes and policies among collaborating entities without enforcing a global consistency, which is difficult to achieve when there are ID legacy systems. To cope with this problem, the Connecting Europe Facility (CEF) initiative proposed in 2014 a federated solution in anticipation of the adoption of Regulation (EU) N°910/2014, the so-called eIDAS Regulation. At present, a network of eIDAS Nodes is being deployed at the European level so that every citizen recognized by a member state is recognized within the trust network at the European level, enabling the consumption of services in other member states that until now were not allowed, or whose provision was tedious. This is a very ambitious approach, since it aims to enable cross-border authentication of Member State citizens without the need to unify the authentication method (eID Scheme) of the member state in question. However, this federation is currently managed by member states and is initially applied only to citizens and public organizations. The goal of this paper is to present the results of a European project, named eID@Cloud, that focuses on the integration of eID in 5 cloud platforms belonging to authentication service providers of different EU Member States acting as Service Providers (SP) for private entities. We propose an initiative based on a private eID Scheme for both natural and legal persons. The methodology followed in the eID@Cloud project is that each Identity Provider (IdP) is subscribed to an eIDAS Node Connector, requesting authentication, which is in turn subscribed to an eIDAS Node Proxy Service, issuing authentication assertions. To cope with high loads, load balancing is supported in the eIDAS Node. The eID@Cloud project is still ongoing, but we already have some important outcomes. First, we have deployed the federated identity nodes and tested them from the security and performance points of view. The pilot prototype has shown the feasibility of deploying this kind of system, ensuring good performance due to the replication of the eIDAS nodes and the load-balancing mechanism. Second, our solution avoids the propagation of identity data out of the native domain of the user or entity being identified, which avoids problems well known in cybersecurity such as network interception and man-in-the-middle attacks. Last, but not least, this system allows any country or collectivity to be connected easily, providing incremental development of the network and avoiding difficult political negotiations to agree on a single authentication format (which would be a major obstacle).
Keywords: cybersecurity, identity federation, trust, user authentication
Procedia PDF Downloads 166
4946 Terrestrial Laser Scans to Assess Aerial LiDAR Data
Authors: J. F. Reinoso-Gordo, F. J. Ariza-López, A. Mozas-Calvache, J. L. García-Balboa, S. Eddargani
Abstract:
The quality of DEMs may depend on several factors, such as the data source, the capture method, the type of processing used to derive them, or the cell size of the DEM. The two most important capture methods for producing regional-sized DEMs are photogrammetry and LiDAR; DEMs covering entire countries have been obtained with these methods. The quality of these DEMs has traditionally been evaluated by national cartographic agencies through punctual sampling focused on the vertical component. For this type of evaluation there are standards such as NMAS and the ASPRS Positional Accuracy Standards for Digital Geospatial Data. However, it seems more appropriate to carry out this evaluation by means of a method that takes into account the superficial nature of the DEM, so that its sampling is superficial rather than punctual. This work is part of the research project "Functional Quality of Digital Elevation Models in Engineering", where it is necessary to control the quality of a DEM whose data source is an experimental LiDAR flight with a density of 14 points per square meter, which we call the Point Cloud Product (PCpro). The present work describes the data capture on the ground and the postprocessing tasks carried out to obtain the point cloud used as reference (PCref) to evaluate the quality of the PCpro. Each PCref consists of a 50 x 50 m patch coming from a registration of 4 different scan stations. The area studied was the Spanish region of Navarra, which covers an area of 10,391 km2; 30 homogeneously distributed patches were necessary to sample the entire surface. The patches were captured using a Leica BLK360 terrestrial laser scanner mounted on a pole that reached heights of up to 7 meters; the position of the scanner was inverted so that the characteristic shadow circle that appears when the scanner is in the direct position does not exist. To ensure that the accuracy of the PCref is greater than that of the PCpro, the georeferencing of the PCref was carried out with real-time GNSS, and its positioning accuracy was better than 4 cm; this is much better than the altimetric mean square error estimated for the PCpro (<15 cm). The DEM of interest corresponds to the bare earth, so it was necessary to apply a filter to eliminate vegetation and auxiliary elements such as poles, tripods, etc. After the postprocessing tasks, the PCref is ready to be compared with the PCpro using different techniques: cloud to cloud, or DEM to DEM after a resampling process.
Keywords: data quality, DEM, LiDAR, terrestrial laser scanner, accuracy
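As an illustration of the cloud-to-cloud comparison mentioned above, the minimal sketch below computes nearest-neighbor distances between a reference patch and the product cloud with SciPy; the file names and the vertical-error summary are illustrative assumptions, not the project's actual pipeline.

```python
# Minimal cloud-to-cloud comparison sketch (illustrative; not the project's actual pipeline).
# Assumes both clouds are already georeferenced in the same CRS and filtered to bare earth.
import numpy as np
from scipy.spatial import cKDTree

pc_ref = np.loadtxt("pc_ref_patch.xyz")   # hypothetical file: N x 3 array (X, Y, Z)
pc_pro = np.loadtxt("pc_pro_patch.xyz")   # hypothetical file: M x 3 array (X, Y, Z)

# For every product point, find its nearest reference point.
tree = cKDTree(pc_ref)
dist3d, idx = tree.query(pc_pro, k=1)

# Vertical discrepancy against the nearest reference point (a crude DEM-like check).
dz = pc_pro[:, 2] - pc_ref[idx, 2]

print(f"mean 3D distance: {dist3d.mean():.3f} m")
print(f"vertical RMSE:    {np.sqrt(np.mean(dz**2)):.3f} m")
```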
Procedia PDF Downloads 100
4945 Research on Urban Design Method of Ancient City Guided by Catalyst Theory
Authors: Wang Zhiwei, Wang Weiwu
Abstract:
The process of urbanization in China has entered a critical period of transformation from urban expansion and construction to delicate urban design, thus forming a new direction in the field of urban design. Catalyst theory has become a prominent guiding strategy in urban planning and design. In this paper, against the background of urban renewal, catalyst theory is taken as the guiding ideology to explore an urban design method for Shouxian County. Firstly, this study briefly introduces and analyzes catalyst theory. Through field investigation, it was found that the city has a large number of idle spaces, such as abandoned factories and schools. In the design, the idle spaces in the county town are utilized and interlinked, and functional interaction is carried out based on the pattern of the county town. On the one hand, the results show that catalyst theory can enhance the vitality of the linear street space with a small amount of single-building construction. On the other hand, the city can also gain cultural and economic sites without damaging the historical relics or the character of the ancient city, so as to improve the quality of life of its citizens. The urban micro-transformation represented by catalyst theory can help ancient cities like Shouxian to revitalize the old city and achieve gradual development.
Keywords: catalytic theory, urban design, China's ancient city, Renaissance
Procedia PDF Downloads 124
4944 The Instruction of Imagination: A Theory of Language as a Social Communication Technology
Authors: Daniel Dor
Abstract:
The research presents a new general theory of language as a socially constructed communication technology, designed by cultural evolution for a very specific function: the instruction of imagination. As opposed to all other systems of intentional communication, which provide materials for the interlocutors to experience, language allows speakers to instruct their interlocutors in the process of imagining the intended meaning, instead of experiencing it. It is thus the only system that bridges the experiential gaps between speakers. This is the key to its enormous success.
Keywords: experience, general theory of language, imagination, language as technology, social essence of language
Procedia PDF Downloads 586
4943 An Exploratory Study Applied to Search Relationship between Humans and Universe
Authors: Mohamed Hashelaf, Ahmed Al-Osdody
Abstract:
In this paper, we focused our efforts on one of the vaguest subjects in astrophysics, namely the formation and evolution of the universe up to the arrival of humans. Through an in-depth exploration of the origins of the universe, understanding what has happened from the Big Bang until now, and checking the history of creation, we can answer questions about the future of life and the possibility of its existence elsewhere in the universe, and come to understand how we came to be, what our role in the circle of life is, and what the future of our development will be. We used systematic steps that allowed us, first and foremost, to identify the reason behind the Big Bang itself, which formed a large cloud of cosmic dust. Then, after a period of expansion and cooling of the universe, the initial gas molecules from the cosmic cloud began to condense, forming very dense fields of gravity that after millions of years led to the formation of stars, galaxies, and eventually the Earth and the other planets. Finally, it became clear to us that after the Earth formed, the existence of liquid water made it possible for life to arise, starting from bacteria all the way to the appearance of the humans we know today. But it does not stop here. If we look at and contemplate ourselves as humans, we will understand that the universe is inside us, and that is what makes us exceptional. All of this means that just as life was created on Earth, it could have been created on other planets as well. It also means that we are the universe’s key to understanding itself.
Keywords: Big Bang, cosmic dust, primary elements, universe
Procedia PDF Downloads 134
4942 A Web and Cloud-Based Measurement System Analysis Tool for the Automotive Industry
Authors: C. A. Barros, Ana P. Barroso
Abstract:
Any industrial company needs to determine the amount of variation that exists within its measurement process and guarantee the reliability of its data by studying the performance of its measurement system in terms of linearity, bias, repeatability, reproducibility, and stability. This issue is critical for automotive industry suppliers, who are required to be certified to the IATF 16949:2016 standard (which replaces ISO/TS 16949) of the International Automotive Task Force, defining the requirements of a quality management system for companies in the automotive industry. Measurement System Analysis (MSA) is one of the mandatory tools. Frequently, the measurement system in companies is not connected to the equipment and does not incorporate the methods proposed by the Automotive Industry Action Group (AIAG). To address these constraints, an R&D project is in progress whose objective is to develop a web and cloud-based MSA tool. This MSA tool incorporates Industry 4.0 concepts, such as Internet of Things (IoT) protocols to ensure the connection with the measuring equipment, cloud computing, artificial intelligence, statistical tools, and advanced mathematical algorithms. This paper presents the preliminary findings of the project. The web and cloud-based MSA tool is innovative because it implements all the statistical tests proposed in the MSA-4 reference manual from AIAG, as well as other emerging methods and techniques. As it is integrated with the measuring devices, it reduces the manual input of data and therefore the errors. The tool ensures traceability of all performed tests and can be used in quality laboratories and on production lines. Besides, it monitors MSAs over time, allowing both the analysis of deviations in the variation of the measurements performed and the management of measurement equipment and calibrations. To develop the MSA tool, a ten-step approach was implemented. Firstly, a benchmarking analysis of the current competitors and commercial solutions linked to MSA was performed, concerning the Industry 4.0 paradigm. Next, an analysis of the size of the target market for the MSA tool was done. Afterwards, data flow and traceability requirements were analysed in order to implement an IoT data network that interconnects with the equipment, preferably via wireless. The MSA web solution was designed under UI/UX principles, and an API in Python was developed to run the algorithms and the statistical analysis. Continuous validation of the tool by companies is being performed to assure real-time management of the ‘big data’. The main results of this R&D project are: the MSA tool, web and cloud-based; the Python API; new algorithms for the market; and the UI/UX style guide of the tool. The proposed MSA tool adds value to the state of the art as it ensures an effective response to the new challenges of measurement systems, which are increasingly critical in production processes. Although the automotive industry has triggered the development of this innovative MSA tool, other industries would also benefit from it. Currently, companies from the molds and plastics, chemical, and food industries are already validating it.
Keywords: automotive industry, Industry 4.0, Internet of Things, IATF 16949:2016, measurement system analysis
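One of the classic MSA tests referenced in the MSA-4 manual, the crossed Gauge R&R study, can be expressed as a two-way ANOVA with interaction. The sketch below, using pandas and statsmodels, is only an illustration of such a test under hypothetical column names and data; it is not the tool's actual Python API.

```python
# Hedged sketch of an ANOVA-based crossed Gauge R&R study (one of the MSA tests);
# the data file and column names ('part', 'operator', 'value') are hypothetical.
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("grr_study.csv")  # columns: part, operator, value

model = ols("value ~ C(part) * C(operator)", data=df).fit()
table = anova_lm(model, typ=2)

n_parts = df["part"].nunique()
n_ops = df["operator"].nunique()
n_reps = len(df) / (n_parts * n_ops)

ms_op = table.loc["C(operator)", "sum_sq"] / table.loc["C(operator)", "df"]
ms_int = table.loc["C(part):C(operator)", "sum_sq"] / table.loc["C(part):C(operator)", "df"]
ms_err = table.loc["Residual", "sum_sq"] / table.loc["Residual", "df"]

repeatability = ms_err                                          # equipment variation
reproducibility = max((ms_op - ms_int) / (n_parts * n_reps), 0.0)  # operator variation
print("Gauge R&R variance:", repeatability + reproducibility)
```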
Procedia PDF Downloads 214
4941 A Study on Application of Elastic Theory for Computing Flexural Stresses in Preflex Beam
Authors: Nasiri Ahmadullah, Shimozato Tetsuhiro, Masayuki Tai
Abstract:
This paper presents the step-by-step procedure for using elastic theory to calculate the internal stresses in composite bridge girders prestressed by the preflexing technology, called Prebeam in Japan and preflex beam worldwide. Elastic theory approaches preflex beams the same way as it does conventional composite girders. Since a preflex beam undergoes different stages of construction, calculations are made using different sectional and material properties. Stresses are calculated in every stage using the properties of the specific section, and stress accumulation gives the available stress in the section of interest. The presence of concrete in the section implies prestress loss due to creep and shrinkage; however, more work remains to be done in this field. In addition to the graphical presentation of this application, this paper further discusses a graphical comparison between the results of a purely experimental study carried out on a preflex beam and the author's simulation results for an identical beam, based on the elastic theory approach and Finite Element Modeling (FEM).
Keywords: composite girder, Elastic Theory, preflex beam, prestressing
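The stage-by-stage accumulation described above can be illustrated with elementary flexure theory, summing sigma = M * y / I computed with each stage's own section properties. The stage list, moments, and section values in the sketch below are purely illustrative assumptions, not the paper's data.

```python
# Illustrative sketch of stage-wise flexural stress accumulation (sigma = M * y / I);
# moments, second moments of area, and fibre distances are made-up example values.
stages = [
    # (description,                 M [kN*m],  I [m^4],  y [m])
    ("preflexion of steel girder",     900.0,   0.0065,  0.45),
    ("release after casting",         -400.0,   0.0120,  0.50),
    ("superimposed dead load",         250.0,   0.0120,  0.50),
    ("live load on composite section", 600.0,   0.0180,  0.55),
]

sigma_total = 0.0
for name, M, I, y in stages:
    sigma = M * y / I / 1000.0  # kN*m * m / m^4 = kPa, converted to MPa
    sigma_total += sigma
    print(f"{name:32s} sigma = {sigma:8.1f} MPa, accumulated = {sigma_total:8.1f} MPa")
```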
Procedia PDF Downloads 279
4940 Using Multiple Intelligences Theory to Develop Thai Language Skill
Authors: Bualak Naksongkaew
Abstract:
The purpose of this study was to compare pre- and post-test achievement of Thai language skills. The sample consisted of 40 tenth graders of the Secondary Demonstration School of Suan Sunandha Rajabhat University in the first semester of the academic year 2010. The researcher prepared the Thai lesson plans and the pre- and post-achievement tests administered at the end of the program. Data analyses were carried out using means, standard deviations, descriptive statistics, and independent samples t-test analysis for the comparison of pre- and post-test scores. The study showed that there was a statistically significant difference at α = 0.05; therefore, the use of multiple intelligences theory can develop Thai language skills. The results after using multiple intelligences theory in Thai lessons were at a higher level than the standard.
Keywords: multiple intelligences theory, Thai language skills, development, pre- and post-test achievement
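For readers unfamiliar with the statistical step, a minimal SciPy sketch of the reported pre-/post-test comparison at α = 0.05 is shown below; the score lists are hypothetical, not the study's data, and the independent samples test is used only because that is what the abstract reports.

```python
# Minimal sketch of a pre-/post-test comparison at alpha = 0.05;
# the score lists are hypothetical, not the study's data.
from scipy import stats

pre_scores = [12, 15, 14, 10, 18, 13, 16, 11, 14, 15]
post_scores = [18, 20, 19, 16, 22, 18, 21, 17, 19, 20]

t_stat, p_value = stats.ttest_ind(pre_scores, post_scores)  # independent samples, as reported
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("significant at alpha = 0.05" if p_value < 0.05 else "not significant")
```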
Procedia PDF Downloads 425
4939 A Framework for Investigating Reverse Logistics Capability of E-Tailers
Authors: Wen-Shan Lin, Shu-Lu Hsu
Abstract:
Environmental concerns and consumer rights have compelled e-tailers to adopt better strategies to facilitate product returns from customers. As the demand for reverse logistics (RL) continues to grow, little is known about what motivates e-tailers to enhance their RL capabilities and about the role RL capability plays in enabling e-tailers to achieve better customer satisfaction and economic performance. Based on resource-based theory and institutional theory, this article proposes that the following factors play a critical role in influencing the RL capability of e-tailers: (a) financial resource commitment to RL, (b) managerial resource commitment to RL, and (c) institutional pressure to implement RL. Based on the role of these factors, the study provides a framework and propositions that serve to guide future research addressing the link among resources, institutional pressure, and RL capability.
Keywords: reverse logistics, e-tailing, resource-based theory, institutional theory
Procedia PDF Downloads 449
4938 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Orlin Davchev
Abstract:
The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in time and resources required for BIM model generation while maintaining high levels of accuracy and detail. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction
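A typical first step of the feature-extraction stage described above is segmenting planar regions (candidate walls and slabs) from the scan. The sketch below uses Open3D's RANSAC plane fitting as one possible illustration of that step; the input file and thresholds are assumptions, and this is not the authors' own algorithm.

```python
# Hedged illustration of planar-element extraction from a scan (candidate walls/slabs);
# the input file and thresholds are assumptions, not the paper's own algorithm.
import open3d as o3d

pcd = o3d.io.read_point_cloud("building_scan.ply")  # hypothetical scan file

planes = []
rest = pcd
for _ in range(5):  # extract up to five dominant planes
    plane_model, inliers = rest.segment_plane(distance_threshold=0.02,
                                              ransac_n=3,
                                              num_iterations=1000)
    planes.append((plane_model, rest.select_by_index(inliers)))
    rest = rest.select_by_index(inliers, invert=True)

for i, (model, patch) in enumerate(planes):
    a, b, c, d = model
    print(f"plane {i}: {a:.2f}x + {b:.2f}y + {c:.2f}z + {d:.2f} = 0, "
          f"{len(patch.points)} points")
```

Planes recovered this way would still need to be classified (wall, floor, roof) and converted into parametric BIM objects, which is where the parametric modeling techniques mentioned in the abstract come in.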
Procedia PDF Downloads 63
4937 Large Eddy Simulation of Particle Clouds Using Open-Source CFD
Authors: Ruo-Qian Wang
Abstract:
Open-source CFD has become increasingly popular and promising. Recent progress in multiphase flow enables new CFD applications, providing an economical and flexible research tool for complex flow problems. We introduce our numerical study, which uses four-way-coupled Euler-Lagrangian Large-Eddy Simulations to resolve particle cloud dynamics with OpenFOAM and CFDEM: the fractioned Navier-Stokes equations are numerically solved for the fluid-phase motion, solid-phase motion is addressed by Lagrangian tracking of every single particle, and total momentum is conserved by fluid-solid inter-phase coupling. A grid convergence test was performed, which shows that the current mesh resolution is appropriate. Then, we validated the code by comparing numerical results with experiments in terms of particle cloud settlement and growth. Good agreement was obtained, showing the reliability of the present numerical schemes. The time and height at phase separation were defined and analyzed for a variety of initial release conditions, and empirical formulas were derived to fit the results.
Keywords: four-way coupling, dredging, land reclamation, multiphase flows, oil spill
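To illustrate the Lagrangian side of the Euler-Lagrangian coupling, a reduced toy sketch of tracking one settling particle with Stokes drag is given below. It is a one-way-coupled example under assumed fluid and particle properties, far simpler than the four-way-coupled OpenFOAM/CFDEM solver used in the study.

```python
# Toy one-way-coupled Lagrangian tracking of a settling particle (Stokes drag);
# fluid and particle properties are illustrative, not the study's setup.
import numpy as np

rho_p, rho_f = 2650.0, 1000.0      # particle / fluid density [kg/m^3]
d_p, mu = 100e-6, 1.0e-3           # particle diameter [m], dynamic viscosity [Pa*s]
g = 9.81
m_p = rho_p * np.pi * d_p**3 / 6.0

v = 0.0                            # particle vertical velocity [m/s], downward positive
dt, t_end = 1e-4, 0.05
for _ in range(int(t_end / dt)):
    drag = 3.0 * np.pi * mu * d_p * (0.0 - v)                    # Stokes drag in quiescent fluid
    buoyant_weight = (rho_p - rho_f) * np.pi * d_p**3 / 6.0 * g  # gravity minus buoyancy
    v += dt * (buoyant_weight + drag) / m_p

v_stokes = (rho_p - rho_f) * g * d_p**2 / (18.0 * mu)  # analytical terminal velocity
print(f"simulated terminal velocity: {v:.5f} m/s, Stokes formula: {v_stokes:.5f} m/s")
```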
Procedia PDF Downloads 429
4936 Working Capital Management and Profitability of Uk Firms: A Contingency Theory Approach
Authors: Ishmael Tingbani
Abstract:
This paper adopts a contingency theory approach to investigate the relationship between working capital management and profitability, using data on 225 British firms listed on the London Stock Exchange for the period 2001-2011. The paper employs panel data analysis on a series of interactive models to estimate this relationship. The findings of the study confirm the relevance of contingency theory. Evidence from the study suggests that the impact of working capital management on profitability varies and is constrained by the organizational contingencies (environment, resources, and management factors) of the firm. These findings have implications for a more balanced and nuanced view of working capital management policy among policy-makers.
Keywords: working capital management, profitability, contingency theory approach, interactive models
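The interactive models mentioned can be illustrated with a regression in which a working capital measure interacts with a contingency factor. The sketch below, with hypothetical variable names and data, shows one such moderated specification in statsmodels; it is not the paper's exact model.

```python
# Hedged sketch of one interactive (moderated) regression specification;
# variable names (roa, ccc, env_dynamism) and the data file are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("uk_firms_2001_2011.csv")  # hypothetical firm-year panel

# Profitability regressed on the cash conversion cycle, a contingency factor,
# and their interaction; year and firm dummies proxy for panel effects.
model = smf.ols("roa ~ ccc * env_dynamism + C(year) + C(firm_id)", data=panel).fit()
print(model.params.filter(like="ccc"))  # main effect and interaction coefficients
```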
Procedia PDF Downloads 346
4935 Particle Observation in Secondary School Using a Student-Built Instrument: Design-Based Research on a STEM Sequence about Particle Physics
Authors: J.Pozuelo-Muñoz, E. Cascarosa-Salillas, C. Rodríguez-Casals, A. de Echave, E. Terrado-Sieso
Abstract:
This study focuses on the development, implementation, and evaluation of an instructional sequence aimed at 16–17-year-old students, involving the design and use of a cloud chamber, a device that allows observation of subatomic particles. The research addresses the limited presence of particle physics in Spanish secondary and high school curricula, a gap that restricts students' learning of advanced physics concepts and diminishes engagement with complex scientific topics. The primary goal of this project is to introduce particle physics in the classroom through a practical, interdisciplinary methodology that promotes autonomous learning and critical thinking. The methodology is framed within Design-Based Research (DBR), an approach that enables iterative and pragmatic development of educational resources. The research proceeded in several phases, beginning with the design of an experimental teaching sequence, followed by its implementation in high school classrooms. This sequence was evaluated, redesigned, and reimplemented with the aim of enhancing students’ understanding and skills related to designing and using particle detection instruments. The instructional sequence was divided into four stages: introduction to the activity, research and design of cloud chamber prototypes, observation of particle tracks, and analysis of collected data. In the initial stage, students were introduced to the fundamentals of the activity and provided with bibliographic resources to conduct autonomous research on cloud chamber functioning principles. During the design stage, students sourced materials and constructed their own prototypes, stimulating creativity and understanding of physics concepts like thermodynamics and material properties. The third stage focused on observing subatomic particles, where students recorded and analyzed the tracks generated in their chambers. Finally, critical reflection was encouraged regarding the instrument's operation and the nature of the particles observed. The results show that designing the cloud chamber motivates students and actively engages them in the learning process. Additionally, the use of this device introduces advanced scientific topics beyond particle physics, promoting a broader understanding of science. The study’s conclusions emphasize the need to provide students with ample time and space to thoroughly understand the role of materials and physical conditions in the functioning of their prototypes and to encourage critical analysis of the obtained data. This project not only highlights the importance of interdisciplinarity in science education but also provides a practical framework for teachers to adapt complex concepts for educational contexts where these topics are often absent.
Keywords: cloud chamber, particle physics, secondary education, instructional design, design-based research, STEM
Procedia PDF Downloads 13
4934 3D Classification Optimization of Low-Density Airborne Light Detection and Ranging Point Cloud by Parameters Selection
Authors: Baha Eddine Aissou, Aichouche Belhadj Aissa
Abstract:
Light detection and ranging (LiDAR) is an active remote sensing technology used for several applications. Airborne LiDAR is becoming an important technology for the acquisition of highly accurate, dense point clouds. The classification of an airborne laser scanning (ALS) point cloud is a very important task that still remains a real challenge for many scientists. The support vector machine (SVM) is one of the most used statistical learning algorithms based on kernels. SVM is a non-parametric method, and it is recommended in cases where the data distribution cannot be well modeled by a standard parametric probability density function. Using a kernel, it performs a robust non-linear classification of samples. The data are rarely linearly separable; SVMs are able to map the data into a higher-dimensional space where they become linearly separable, while allowing all the computations to be performed in the original space. This is one of the main reasons that SVMs are well suited to high-dimensional classification problems. Only a few training samples, called support vectors, are required. SVM has also shown its potential to cope with uncertainty in data caused by noise and fluctuation, and it is computationally efficient compared to several other methods. Such properties are particularly suited to remote sensing classification problems and explain their recent adoption. In this poster, the SVM classification of ALS LiDAR data is proposed. Firstly, connected component analysis is applied to cluster the point cloud. Secondly, the resulting clusters are fed into the SVM classifier. The radial basis function (RBF) kernel is used due to the small number of parameters (C and γ) that need to be chosen, which decreases the computation time. In order to optimize the classification rates, parameter selection is explored. It consists of finding the parameters (C and γ) leading to the best overall accuracy using grid search and 5-fold cross-validation. The exploited LiDAR point cloud is provided by the German Society for Photogrammetry, Remote Sensing, and Geoinformation. The ALS data used are characterized by a low density (4-6 points/m²) and cover an urban area located in residential parts of the city of Vaihingen in southern Germany. The ground class and three other classes belonging to roof superstructures are considered, i.e., a total of 4 classes. The training and test sets were selected randomly several times. The obtained results demonstrate that parameter selection can orient the search toward a restricted interval of (C, γ) that can be further explored, but does not systematically lead to the optimal rates. The SVM classifier with hyper-parameter selection is compared with the classifiers most used in the literature for LiDAR data: random forest, AdaBoost, and decision tree. The comparison shows the superiority of the SVM classifier using parameter selection for LiDAR data over the other classifiers.
Keywords: classification, airborne LiDAR, parameters selection, support vector machine
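The parameter selection described (RBF kernel, grid search over C and γ with 5-fold cross-validation) corresponds to a standard scikit-learn workflow. The sketch below is an illustration with assumed feature and label arrays, not the authors' code, and the grid values are arbitrary.

```python
# Illustrative grid search over (C, gamma) for an RBF-kernel SVM with 5-fold CV;
# X (per-cluster features) and y (class labels) are assumed to be prepared beforehand.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

X = np.load("cluster_features.npy")   # hypothetical: one feature vector per cluster
y = np.load("cluster_labels.npy")     # hypothetical: ground / roof-superstructure classes

param_grid = {
    "svc__C": [0.1, 1, 10, 100, 1000],
    "svc__gamma": [1e-3, 1e-2, 1e-1, 1, 10],
}
pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
search = GridSearchCV(pipe, param_grid, cv=5, scoring="accuracy", n_jobs=-1)
search.fit(X, y)

print("best (C, gamma):", search.best_params_)
print("best cross-validated overall accuracy:", search.best_score_)
```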
Procedia PDF Downloads 147
4933 Optimizing Production Yield Through Process Parameter Tuning Using Deep Learning Models: A Case Study in Precision Manufacturing
Authors: Tolulope Aremu
Abstract:
This paper is based on the idea of using a deep learning methodology to optimize production yield by tuning a few key process parameters in a manufacturing environment. The study focuses explicitly on how to maximize production yield and minimize operational costs by utilizing advanced neural network models, specifically Long Short-Term Memory (LSTM) and Convolutional Neural Network (CNN) models. These models were implemented using the Python-based frameworks TensorFlow and Keras. The target of the research is precision molding processes in which the temperature ranges between 150°C and 220°C, the pressure ranges between 5 and 15 bar, and the material flow rate ranges between 10 and 50 kg/h; these are critical parameters that have a great effect on yield. A dataset of 1 million production cycles spanning five continuous years was considered, with detailed logs recording the exact parameter settings and yield output. The LSTM model captures time-dependent trends in the production data, while the CNN analyzes spatial correlations between parameters. The models are designed in a supervised learning manner; an MSE loss function is used, optimized with the Adam optimizer. After running a total of 100 training epochs, 95% accuracy was achieved by the models recommending optimal parameter configurations. The results indicated an increase in production yield of 12% compared with the traditional RSM and DOE methods. Besides, the error margin was reduced by 8%, hence more consistent product quality from the deep learning models. The monetary value was around $2.5 million annually, the cost saved from material waste, energy consumption, and equipment wear resulting from the implementation of optimized process parameters. This system was deployed in an industrial production environment with the help of a hybrid cloud system: Microsoft Azure for data storage, while the training and deployment of the models were performed on Google Cloud AI. Real-time monitoring of the process and automatic tuning of parameters depend on this cloud infrastructure. To put it into perspective, deep learning models, especially those employing LSTM and CNN, optimize production yield by fine-tuning process parameters. Future research will consider reinforcement learning with a view to further enhancing system autonomy and scalability across various manufacturing sectors.
Keywords: production yield optimization, deep learning, tuning of process parameters, LSTM, CNN, precision manufacturing, TensorFlow, Keras, cloud infrastructure, cost saving
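A reduced sketch of the kind of Keras model described (an LSTM regressor trained with MSE loss and the Adam optimizer for 100 epochs) is given below; the window length, layer sizes, and data arrays are assumptions, not the paper's exact architecture.

```python
# Reduced sketch of an LSTM yield-prediction model (MSE loss, Adam, 100 epochs);
# window length, layer sizes, and the data arrays are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# X: (samples, timesteps, features) sequences of [temperature, pressure, flow_rate]
# y: (samples,) resulting yield for each sequence
X = np.load("process_sequences.npy")   # hypothetical pre-built windows
y = np.load("process_yield.npy")

model = keras.Sequential([
    layers.Input(shape=(X.shape[1], X.shape[2])),
    layers.LSTM(64),
    layers.Dense(32, activation="relu"),
    layers.Dense(1),                    # predicted yield
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3), loss="mse")
model.fit(X, y, epochs=100, batch_size=128, validation_split=0.2)

# The trained model can then score candidate (temperature, pressure, flow) settings
# to recommend configurations with the highest predicted yield.
```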
Procedia PDF Downloads 29
4932 Eclectic Therapy in Approach to Clients’ Problems and Application of Multiple Intelligence Theory
Authors: Mohamed Sharof Mostafa, Atefeh Ahmadi
Abstract:
Most traditional single-modality psychotherapy and counselling approaches to clients’ problems are based on the application of one therapy in all sessions. Modern developments in these sciences focus on eclectic and integrative interventions that consider all dimensions of an issue and all characteristics of the clients. This paper presents an overview of eclectic therapy and its pros and cons. In addition, multiple intelligence theory and its application in eclectic therapy approaches are discussed.
Keywords: eclectic therapy, client, multiple intelligence theory, dimensions
Procedia PDF Downloads 710
4931 The Truth about Good and Evil: A Mixed-Methods Approach to Color Theory
Authors: Raniya Alsharif
Abstract:
The color theory of good and evil is the association of colors with the omnipresent concept of good and evil, whereby human behavior and perception can be highly influenced by seeing black and white, making these connotations almost dangerously distinctive, to the point where they can be very hard to distinguish. This theory is a human construct that dates back to ancient Egypt and has been used since then in almost all forms of communication and expression, such as art, fashion, literature, and religious manuscripts, helping implant preconceived ideas that influence behavior and society. This is mixed-methods research that uses surveys to collect quantitative data related to the theory and a vignette to collect qualitative data through a scenario in which participants aged between 18 and 25 style two characters, one with good and one with bad characteristics, in color-contrasting clothes. Both instruments yield results about the nature of the preconceived perceptions associated with ‘black and white’ and ‘good and evil’, illustrating the important role of media and communications in human behavior and the subconscious, and also uncovering how far this theory goes in the age of social media enlightenment.
Keywords: color perception, interpretivism, thematic analysis, vignettes
Procedia PDF Downloads 125
4930 Solving Dimensionality Problem and Finding Statistical Constructs on Latent Regression Models: A Novel Methodology with Real Data Application
Authors: Sergio Paez Moncaleano, Alvaro Mauricio Montenegro
Abstract:
This paper presents a novel statistical methodology for measuring and finding constructs in latent regression analysis. The approach uses the qualities of factor analysis for binary data with interpretations based on Item Response Theory (IRT). In addition, based on the fundamentals of submodel theory and on a convergence of many ideas from IRT, we propose an algorithm not just to solve the dimensionality problem (nowadays an open discussion) but to open a new research field that promises fairer and more realistic qualifications for examiners and a revolution in IRT and educational research. Finally, the methodology is applied to a real data set, presenting impressive results in terms of coherence, speed, and precision. Acknowledgments: This research was financed by Colciencias through the project 'Multidimensional Item Response Theory Models for Practical Application in Large Test Designed to Measure Multiple Constructs', and both authors belong to the SICS Research Group of the Universidad Nacional de Colombia.
Keywords: item response theory, dimensionality, submodel theory, factorial analysis
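For readers unfamiliar with the IRT side, a minimal sketch of the two-parameter logistic (2PL) item model that underlies such binary-data factor interpretations is shown below; it only evaluates item response probabilities and a log-likelihood on made-up values, and is not the proposed algorithm.

```python
# Minimal 2PL item response model sketch (probability and log-likelihood only);
# abilities, discriminations, difficulties, and responses are made-up example values.
import numpy as np

def p_correct(theta, a, b):
    """2PL probability of a correct response: 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.array([-1.0, 0.0, 1.5])               # examinee abilities
a = np.array([1.2, 0.8])                         # item discriminations
b = np.array([-0.5, 1.0])                        # item difficulties
responses = np.array([[1, 0], [1, 0], [1, 1]])   # persons x items binary matrix

P = p_correct(theta[:, None], a[None, :], b[None, :])
log_lik = np.sum(responses * np.log(P) + (1 - responses) * np.log(1 - P))
print("response probabilities:\n", P)
print("log-likelihood:", log_lik)
```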
Procedia PDF Downloads 372
4929 Conformal Invariance and F(R,T) Gravity
Authors: P. Y. Tsyba, O. V. Razina, E. Güdekli, R. Myrzakulov
Abstract:
In this paper, we consider the equations of motion of F(R,T) gravity with regard to their property of conformal invariance. It is shown that in the general case such a theory is not conformally invariant. Special cases of the functions v and u, in which this property of the theory can appear, were studied.
Keywords: conformal invariance, gravity, space-time, metric
Procedia PDF Downloads 661
4928 A Psychoanalytical Approach to Edgar A. Poe’s Short Story ‘The Tell-Tale Heart’
Authors: José Antonio Núñez
Abstract:
Sigmund Freud’s theory of psychoanalysis was a groundbreaking contribution to the field of the human psyche and behavior. Nowadays, psychoanalytic theory is applied to numerous fields, one of which is literature. Literary criticism has put Freud’s ideas into practice to analyze literary works. This essay analyzes Edgar A. Poe’s short story ‘The Tell-Tale Heart’ under the lens of Freud’s psychoanalytical perspective. In 1919, Freud published ‘Das Unheimliche’ (The Uncanny). In this article, the famous Austrian psychoanalyst explained what he called ‘the uncanny’ and its relation to the human unconscious. In this paper, Freud’s famous article is used to analyze Poe’s short story ‘The Tell-Tale Heart’ and to find the analogies that exist between Poe’s macabre short story and Freud’s theory of ‘the uncanny.’
Keywords: psychoanalysis, theory of the unconscious, the uncanny, unheimlich
Procedia PDF Downloads 644
4927 Language Learning, Drives and Context: A Grounded Theory of Learning Behavior
Authors: Julian Pigott
Abstract:
This paper introduces the Language Learning as a Means of Drive Engagement (LLMDE) theory, derived from a grounded theory analysis of interviews with Japanese university students. According to LLMDE theory, language learning can be understood as a means of engaging one or more of four self-fulfillment drives: the drive to expand one’s horizons (perspective drive); the drive to make a success of oneself (status drive); the drive to engage in interaction with others (communication drive); and the drive to obtain intellectual and affective stimulation (entertainment drive). While many theories of learner psychology focus on conscious agency, LLMDE theory addresses the role of the unconscious. In addition, supplementary thematic analysis of the data revealed the role of context in mediating drive engagement. Unexpected memorable events, for example, play a key role in instigating and, indirectly, in regulating learning, as do institutional and cultural contexts. Given the apparent importance of such factors beyond the immediate control of the learner, and given the pervasive role of habit and drives, it is argued that the concept of motivation merits theoretical reappraisal. Rather than an underlying force determining language learning success or failure, it can be understood to emerge sporadically in consciousness to promote behavioral change, or to protect habitual behavior from disruption.
Keywords: drives, grounded theory, motivation, significant events
Procedia PDF Downloads 148
4926 The Fifth Political Theory and Countering Terrorism in the Post 9/11 Era
Authors: Rana Eijaz Ahmad
Abstract:
This paper explains the Fifth Political Theory, which challenges all three-plus-one existing theories (Capitalism, Marxism, and Fascism, plus the Fourth Political Theory). It says, ‘it is human ambiance evolve any political system to survive instead of borrowing other imported thoughts to live in a specific environment, in which Legitimacy leads to authority and promotes humanism.’ According to this theory, no other state is allowed to dictate or install any political system upon other states. It is the born right of individuals to choose a political system or a set of values that will make their structures and functions efficient enough to support the system’s harmony and counter negative forces successfully. In the post-9/11 era, it is observed that all existing theories, Capitalism, Marxism, Fascism, and the Fourth Political Theory, have remained unsuccessful in resolving the global crisis. The so-called war against terrorism has proved to be a war for terrorism and has created a vacuum on the global stage, worsening the crisis. The Fifth Political Theory is an answer for countering terrorism in the twenty-first century. It calls for the accountability of the United Nations for its failure to sustain peace at the global level. Therefore, the UN Charter is supposed to be implemented in its true letter and spirit. All independent sovereign states have the right to evolve a political system of their own that suits them best for sustaining harmony at home. This is the only way to counter terrorism. This paper uses a mixed-methods approach: qualitative, quantitative, and comparative methods are used along with secondary sources. The objective of this paper is to create knowledge for the benefit of human beings with a logical and rational argument. It will help political scientists and scholars in conflict management and countering terrorism on pragmatic grounds.
Keywords: capitalism, fourth political theory, fifth political theory, Marxism, fascism
Procedia PDF Downloads 380
4925 Living the Religious of the Virgin Mary (RVM) Educational Mission: A Grounded Theory Approach
Authors: Violeta Juanico
Abstract:
Although the RVM Education Ministry Commission has stated that its strength is its Ignacian identity, shaped by the Ignacian spirituality that permeates the school community and leads to a more defined RVM school culture, to the best of the author's knowledge there has been no empirical study offering a clear and convincing conceptual framework for how the RVM educational mission is lived in Religious of the Virgin Mary (RVM) learning institutions. This dissertation is an attempt to come up with a substantive theory that supports and explains the stakeholders’ experiences with the RVM educational mission in the Philippines. Participants representing the different stakeholders, ranging from students to administrators, were interviewed. The expressions and thoughts of the participants were initially coded and analyzed using Barney Glaser’s original grounded theory methodology to find out how the RVM mission is lived in the field of education.
Keywords: catholic education, grounded theory, lived experience, RVM educational mission
Procedia PDF Downloads 468
4924 Exploring the Quest for Centralized Identity in Mohsin Hamid's "The Last White Man": Post-Apocalyptic Transformations and Societal Reconfigurations
Authors: Kashifa Khalid, Eesham Fatima
Abstract:
This study aims to analyze the loss of identity and its impact on one’s life in ‘The Last White Man’ by Mohsin Hamid. The theory of the Alienation Effect by Bertolt Brecht is applied to the text, as Hamid offers the readers a unique perspective, alluding to significant themes like identity, race, and death. The aspects of defamiliarization align impeccably with the plot, as existence and the corresponding concept of identity seem to have dissolved into utter chaos. This extends from the unexplained transformation to the way the entire world unravels from its general norm into dystopian mayhem. The characters, starting with the protagonist Anders, have lost their center. One’s own self transforms into the ‘other,’ and the struggle is to become refamiliarized with one’s own self. Alienation and isolation only rise as the construct of race and identity is taken apart brick by brick, ironically at its own pace, as many new realities are blown to bits. The inseparable relationship between identity and grief under the ever-looming cloud of ‘death’ is studied in detail. The theoretical framework and thematic aspects harmonize with the writing style put forth by Hamid, tying all the loose ends together.
Keywords: alienation, chaos, identity, transformation
Procedia PDF Downloads 44
4923 Blended Cloud Based Learning Approach in Information Technology Skills Training and Paperless Assessment: Case Study of University of Cape Coast
Authors: David Ofosu-Hamilton, John K. E. Edumadze
Abstract:
Universities have come to recognize the role that Information and Communication Technology (ICT) skills play in the daily activities of tertiary students. The ability to use ICT (essentially, computers and their diverse applications) is an important resource that influences an individual’s economic and social participation and human capital development. Our society now increasingly relies on the Internet and the Cloud as a means to communicate and disseminate information. The educated individual should, therefore, be able to use ICT to create and share knowledge that will improve society. It is, therefore, important that universities require incoming students to demonstrate a level of computer proficiency or train them to do so at minimal cost by deploying advanced educational technologies. The training and standardized assessment of all incoming first-year students of the University of Cape Coast in Information Technology Skills (ITS) have become a necessity, as students more often than not highly overestimate their digital skills, and digital ignorance is costly to any economy. The one-semester course is targeted at first-year students and aimed at enhancing their productivity and software skills. In this respect, emphasis is placed on skills that will enable students to be proficient in using Microsoft Office and Google Apps for Education for their academic and future professional work, while using emerging digital multimedia technologies in a safe, ethical, responsible, and legal manner. The course is delivered in blended mode, online and self-paced (student-centered), using Alison’s free cloud-based tutorial (Moodle) of Microsoft Office videos. Online support is provided via discussion forums on the University’s Moodle platform, and tutor-directed and assisted support is provided at the ICT Centre and the Google e-learning laboratory. All students are required to register for the ITS course during either the first or second semester of the first year and must participate in and complete it within a semester. Assessment focuses on the Alison online assessment on Microsoft Office, the Alison online assessment on ALISON ABC IT, peer assessment of an e-portfolio created using Google Apps/Office 365, and an end-of-semester online assessment at the ICT Centre whenever the student is ready in the course of the semester. This paper, therefore, focuses on this digital-culture approach of hybrid teaching and learning and paperless examinations, and its possible adoption by other courses or programs at the University of Cape Coast.
Keywords: assessment, blended, cloud, paperless
Procedia PDF Downloads 248
4922 Chern-Simons Equation in Financial Theory and Time-Series Analysis
Authors: Ognjen Vukovic
Abstract:
The Chern-Simons equation represents the cornerstone of quantum physics. The question that is often asked is whether the aforementioned equation can be successfully applied to interactions in international financial markets. By analysing time series in financial theory, it is proved that the Chern-Simons equation can be successfully applied to financial time series. The aforementioned statement is based on one important premise: that financial time series follow fractional Brownian motion. All variants of the Chern-Simons equation and theory are applied and analysed. Financial time series movement is, firstly, topologically analysed. The main idea is that an exchange rate represents a two-dimensional projection of three-dimensional Brownian motion. The main principles of knot theory and topology are applied to financial time series, and a setting is created so that the Chern-Simons equation can be applied. As the Chern-Simons equation is based on small particles, it is multiplied by a magnifying factor to mimic real-world movement. Afterwards, the resulting equation is optimised using Solver. The equation is applied to n financial time series in order to see whether it can capture the interaction between financial time series and consequently explain it. The aforementioned equation represents a novel approach to financial time series analysis, and hopefully it will direct further research.
Keywords: Brownian motion, Chern-Simons theory, financial time series, econophysics
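The premise that financial time series follow fractional Brownian motion is usually checked by estimating the Hurst exponent. A minimal variance-scaling estimator is sketched below on simulated data, as an illustration only and not the paper's method; a Hurst exponent different from 0.5 would indicate fractional rather than ordinary Brownian motion.

```python
# Minimal Hurst-exponent check for the fractional-Brownian-motion premise;
# the random-walk input is simulated, not real exchange-rate data.
import numpy as np

rng = np.random.default_rng(0)
series = np.cumsum(rng.standard_normal(10_000))  # stand-in for a log-price series

lags = np.arange(2, 100)
# For fBm, Var[X(t + lag) - X(t)] ~ lag^(2H): regress log-variance on log-lag.
variances = [np.var(series[lag:] - series[:-lag]) for lag in lags]
slope, _ = np.polyfit(np.log(lags), np.log(variances), 1)
hurst = slope / 2.0
print(f"estimated Hurst exponent: {hurst:.3f} (about 0.5 for ordinary Brownian motion)")
```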
Procedia PDF Downloads 473
4921 Modelling of Reactive Methodologies in Auto-Scaling Time-Sensitive Services With a MAPE-K Architecture
Authors: Óscar Muñoz Garrigós, José Manuel Bernabeu Aubán
Abstract:
Time-sensitive services are the base of the cloud services industry. Keeping service saturation low is essential for controlling response time. All auto-scalable services make use of reactive auto-scaling; however, there are few in-depth studies of reactive auto-scaling. This presentation shows a model for reactive auto-scaling methodologies with a MAPE-K architecture. Queuing theory can compute different properties of static services but lacks some parameters related to the transition between models; our model uses queuing theory parameters to relate the transitions between models. It associates the MAPE-K-related times, the sampling frequency, the cooldown period, the number of requests that an instance can handle per unit of time, the number of incoming requests at a time instant, and a function that describes the acceleration in the service's ability to handle more requests. This model is later used as a solution to horizontally auto-scale time-sensitive services composed of microservices, re-evaluating the model’s parameters periodically to allocate resources. The solution requires limiting the acceleration of the growth in the number of incoming requests in order to keep the response time constrained; business benefits determine such limits. The solution can add a dynamic number of instances and remains valid for different system sizes. The study includes performance recommendations to improve results according to the shape of the incoming load and the business benefits. The proposed methodology is tested in a simulation. The simulator contains a load generator and a service composed of two microservices, where the frontend microservice depends on a backend microservice with a 1:1 request relation ratio. A common request takes 2.3 seconds to be computed by the service and is discarded if it takes more than 7 seconds. Both microservices contain a load balancer that assigns requests to the least loaded instance and preemptively discards requests if they cannot be finished in time, to prevent resource saturation. When the load decreases, instances with lower load are kept in a backlog where no more requests are assigned to them. If the load grows and an instance in the backlog is required, it returns to the running state; if it finishes the computation of all its requests and is no longer required, it is permanently deallocated. A few load patterns are required to represent the worst-case scenario for reactive systems; the following scenarios test response times, resource consumption, and business costs. The first scenario is a burst-load scenario: all methodologies will discard requests if the burst is rapid enough. This scenario focuses on the number of discarded requests and the variance of the response time. The second scenario contains sudden load drops followed by bursts, to observe how the methodology behaves when releasing resources that are later required. The third scenario contains diverse growth accelerations in the number of incoming requests, to observe how approaches that add different numbers of instances can handle the load at a lower business cost. The proposed methodology is compared against a multiple-threshold CPU methodology allocating/deallocating 10 or 20 instances, outperforming the competitor in all studied metrics.
Keywords: reactive auto-scaling, auto-scaling, microservices, cloud computing
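The core of such a reactive loop is the Plan step that converts the monitored request rate into a target instance count, constrained by per-instance capacity and a cooldown period. The sketch below is a simplified illustration under assumed parameter values, not the presented model.

```python
# Simplified reactive scaling decision (Plan step of a MAPE-K loop);
# capacity, headroom, sampling period, and cooldown values are illustrative assumptions.
import math

CAPACITY_PER_INSTANCE = 10.0   # requests an instance can handle per second
HEADROOM = 0.7                 # keep saturation below 70% to protect response time
SAMPLING_PERIOD = 5.0          # seconds between monitoring samples
COOLDOWN = 30.0                # seconds to wait after a scaling action

def plan(incoming_rate, running, last_action_t, now):
    """Return the new instance count given the monitored incoming request rate."""
    target = max(math.ceil(incoming_rate / (CAPACITY_PER_INSTANCE * HEADROOM)), 1)
    if now - last_action_t < COOLDOWN and target != running:
        return running, last_action_t        # still cooling down: keep current size
    if target != running:
        return target, now                   # scale out or in
    return running, last_action_t

running, last_t = 2, -COOLDOWN
for step, rate in enumerate([10, 40, 90, 160, 160, 60, 20]):
    now = step * SAMPLING_PERIOD
    running, last_t = plan(rate, running, last_t, now)
    print(f"t={now:5.1f}s rate={rate:4d} req/s -> {running} instances")
```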
Procedia PDF Downloads 93
4920 Factors Drive Consumers to Purchase Digital Music: An Empirical Study
Authors: Chechen Liao, Yi-Jen Huang, Yu-Ting Lu
Abstract:
This study explores and complements the digital aspects of music purchasing. We construct a research model based on the theory of reasoned action and extend it with the advantages and disadvantages of intangibility (convenience, perceived risk), some characteristics of digital products (price, variety, trialability), and factors related to entertainment (perceived playfulness) to predict what consumers really consider when they buy digital music. Eight hypotheses were tested and supported. Finally, we prove that the theory of reasoned action is still valid in the field of digital products.
Keywords: digital music, digital product, theory of reasoned action
Procedia PDF Downloads 441