Keyword Search Results

[Keyword] internet (292 hits)

Showing 1-20 of 292 hits

  • Real-Time Monitoring Systems That Provide M2M Communication between Machines Open Access

    Ya ZHONG  

     
    PAPER-Language, Thought, Knowledge and Intelligence

    Publicized: 2023/10/17
    Vol: E107-A No:7
    Page(s): 1019-1026

    Artificial intelligence and the introduction of Internet of Things technologies have benefited from technological advances and new automated computer system technologies. As a result, it is now possible to integrate them into a single offline industrial system. This is accomplished through machine-to-machine communication, which eliminates the human factor. The purpose of this article is to examine security systems for machine-to-machine communication systems that rely on identification and authentication algorithms for real-time monitoring. The article investigates security methods for quickly resolving data processing issues by using the Security Operations Center's main machine to identify and authenticate devices from 19 different machines. The results indicate that when machines are running offline and performing various tasks, both the individual machines and the system as a whole can be exposed to data leaks and malware attacks. The study looks at the operation of 19 computers, 7 of which were subjected to data leakage and malware attacks. AnyLogic software is used to create visual representations of the results using wireless networks and algorithms based on previously processed methods. The W76S is used as a protective element within intelligent sensors due to its built-in memory protection. For 4 machines, the data leakage time with malware attacks was 70 s. For 10 machines, the duration was 150 s with 3 attacks. Machine 15 had the longest attack duration, lasting 190 s and involving 6 malware attacks, while machine 19 had the shortest attack duration, lasting 200 s and involving 7 malware attacks. The highest numbers indicated that attempting to hack a system increases the risk of damaging a device, potentially resulting in the failure of the entire system of connected devices. Thus, malware attacks can be identified over time, and their effects on data processing can be prevented by intelligent control. The results reveal that applying identification and authentication methods using a protocol increases cyber-physical system security while also allowing real-time monitoring of offline system security.

  • Thermoelectric Effect of Ga-Sn-O Thin Films for Internet-of-Things Application

    Yuhei YAMAMOTO  Naoki SHIBATA  Tokiyoshi MATSUDA  Hidenori KAWANISHI  Mutsumi KIMURA  

     
    BRIEF PAPER-Electronic Materials

    Publicized: 2023/07/10
    Vol: E107-C No:1
    Page(s): 18-21

    The thermoelectric effect of Ga-Sn-O (GTO) thin films has been investigated for Internet-of-Things applications. It is found that amorphous GTO thin films provide higher power factors (PF) than polycrystalline ones because grain boundaries block electron conduction in the polycrystalline films. It is also found that GTO thin films annealed in vacuum provide a higher PF than those annealed in air because oxygen vacancies are terminated in the air-annealed films. The PF and dimensionless figure of merit (ZT) are not outstanding, but the cost effectiveness is excellent, which is the most important factor for some Internet-of-Things applications.
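
    For reference, the two figures of merit named above are commonly defined by the standard thermoelectric relations below (general definitions, not formulas taken from this paper), where S is the Seebeck coefficient, σ the electrical conductivity, κ the thermal conductivity, and T the absolute temperature:

      \mathrm{PF} = S^{2}\sigma, \qquad ZT = \frac{S^{2}\sigma}{\kappa}\,T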

  • Content Search Method Utilizing the Metadata Matching Characteristics of Both Spatio-Temporal Content and User Request in the IoT Era

    Shota AKIYOSHI  Yuzo TAENAKA  Kazuya TSUKAMOTO  Myung LEE  

     
    PAPER-Network System

    Publicized: 2023/10/06
    Vol: E107-B No:1
    Page(s): 163-172

    Cross-domain data fusion is becoming a key driver in the growth of numerous and diverse applications in the Internet of Things (IoT) era. We have proposed the concept of a new information platform, the Geo-Centric Information Platform (GCIP), which enables IoT data fusion based on geolocation, i.e., produces spatio-temporal content (STC), and then provides the STC to users. In this environment, users cannot know in advance “when,” “where,” or “what type” of STC is being generated because the type and timing of STC generation vary dynamically with the diversity of IoT data generated in each geographical area. This makes it difficult to directly search for a specific STC requested by the user via a content identifier (the domain name of a URI or a content name). To solve this problem, a new content discovery method is needed that does not directly specify content identifiers while taking into account (1) spatial and (2) temporal constraints. In our previous study, we proposed a content discovery method that considered only spatial constraints and ignored temporal constraints. This paper proposes a new content discovery method that matches user requests with content metadata (topic) characteristics while taking into account spatial and temporal constraints. Simulation results show that the proposed method successfully discovers appropriate STC in response to a user request.
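
    As an illustration only (not the authors' implementation), the sketch below shows one way a user request could be matched against STC metadata under spatial and temporal constraints; the STC fields and the matching rule are assumptions.

      from dataclasses import dataclass

      @dataclass
      class STC:
          topic: str         # metadata topic, e.g. "traffic"
          area: str          # identifier of the geographical area that produced the content
          created_at: float  # generation time (epoch seconds)

      def discover(topic, area, freshness, contents, now):
          """Return STC whose topic matches the request and which satisfies the
          spatial (same area) and temporal (generated within `freshness` seconds) constraints."""
          return [c for c in contents
                  if c.topic == topic and c.area == area
                  and now - freshness <= c.created_at <= now]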

  • Nonvolatile Storage Cells Using FiCC for IoT Processors with Intermittent Operations

    Yuki ABE  Kazutoshi KOBAYASHI  Jun SHIOMI  Hiroyuki OCHI  

     
    PAPER

    Publicized: 2023/04/13
    Vol: E106-C No:10
    Page(s): 546-555

    Energy harvesting has been widely investigated as a potential solution to supply power for Internet of Things (IoT) devices. Computing devices must operate intermittently rather than continuously, because harvested energy is unstable and some IoT applications can be periodic. Therefore, processors for IoT devices with intermittent operation must feature a zero-standby-power hibernation mode in addition to an energy-efficient normal mode. In this paper, we describe the layout design and measurement results of a nonvolatile standard cell memory (NV-SCM) and nonvolatile flip-flops (NV-FF) with a nonvolatile memory using a Fishbone-in-Cage Capacitor (FiCC), suitable for IoT processors with intermittent operations. They can be fabricated in any conventional CMOS process without any additional mask. The NV-SCM and NV-FF are fabricated in a 180nm CMOS process technology. The area overhead for nonvolatility of a bit cell is 74% in the NV-SCM and 29% in the NV-FF, respectively. We confirmed full functionality of the NV-SCM and NV-FF. The nonvolatile system using the proposed NV-SCM and NV-FF can reduce energy consumption by 24.3% compared to the volatile system when the hibernation/normal operation time ratio is 500, as shown in simulation.

  • Few-Shot Learning-Based Malicious IoT Traffic Detection with Prototypical Graph Neural Networks

    Thin Tharaphe THEIN  Yoshiaki SHIRAISHI  Masakatu MORII  

     
    PAPER

    Publicized: 2023/06/22
    Vol: E106-D No:9
    Page(s): 1480-1489

    With a rapidly escalating number of sophisticated cyber-attacks, protecting Internet of Things (IoT) networks against unauthorized activity is a major concern. The detection of malicious attack traffic is thus crucial for IoT security to prevent unwanted traffic. However, existing traditional malicious traffic detection systems, which rely on a supervised machine learning approach, need a considerable number of benign and malware traffic samples to train the machine learning models. Moreover, in the case of zero-day attacks, only a few labeled traffic samples are accessible for analysis. To deal with this, we propose a few-shot malicious IoT traffic detection system with a prototypical graph neural network. The proposed approach does not require prior knowledge of network payload binaries or network traffic signatures. The model is trained on labeled traffic data and tested to evaluate its ability to detect new types of attacks when only a few labeled traffic samples are available. The proposed detection system first categorizes the network traffic as bidirectional flows and visualizes each binary traffic flow as a color image. A neural network is then applied to the visualized traffic to extract important features. After that, using the proposed few-shot graph neural network approach, the model is trained on different few-shot tasks to generalize it to new unseen attacks. The proposed model is evaluated on a network traffic dataset consisting of benign traffic and traffic corresponding to six types of attacks. The results revealed that our proposed model achieved an F1 score of 0.91 and 0.94 in 5-shot and 10-shot classification, respectively, and outperformed the baseline models.
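
    A minimal sketch of the prototypical-network step named above, assuming feature embeddings have already been extracted by the CNN; tensor shapes and the distance metric are illustrative choices.

      import torch

      def prototypical_classify(support, support_labels, query, n_classes):
          """support: (N, D) embeddings with integer labels; query: (M, D) embeddings.
          Each class prototype is the mean of its support embeddings; queries are
          assigned to the nearest prototype in Euclidean distance."""
          prototypes = torch.stack([support[support_labels == c].mean(dim=0)
                                    for c in range(n_classes)])   # (C, D)
          return torch.cdist(query, prototypes).argmin(dim=1)     # predicted class per query

      support = torch.randn(10, 32)                    # 5-shot, 2 classes, 32-D embeddings
      labels = torch.tensor([0] * 5 + [1] * 5)
      print(prototypical_classify(support, labels, torch.randn(3, 32), n_classes=2))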

  • Intrusion Detection Model of Internet of Things Based on LightGBM Open Access

    Guosheng ZHAO  Yang WANG  Jian WANG  

     
    PAPER-Fundamental Theories for Communications

    Publicized: 2023/02/20
    Vol: E106-B No:8
    Page(s): 622-634

    Internet of Things (IoT) devices are widely used in various fields. However, their limited computing resources make them extremely vulnerable and difficult to protect effectively. Traditional intrusion detection systems (IDS) focus on high accuracy and a low false alarm rate (FAR), so their spatiotemporal complexity is often too high for deployment on IoT devices. In response to these problems, this paper proposes an intrusion detection model for IoT based on the light gradient boosting machine (LightGBM). Firstly, a one-dimensional convolutional neural network (CNN) is used to extract features from network traffic and reduce the feature dimensions. Then, LightGBM is used for classification to detect the type to which the network traffic belongs. LightGBM inherits the advantages of the gradient boosting tree while being more lightweight and offering a faster decision tree construction process. Experiments on the TON-IoT and BoT-IoT datasets show that the proposed model is stronger in performance and more lightweight than the comparison models. The proposed model can shorten the prediction time by 90.66% and outperforms the comparison models in accuracy and other performance metrics. The proposed model has strong detection capability for denial of service (DoS) and distributed denial of service (DDoS) attacks. Experimental results on a testbed built with IoT devices such as Raspberry Pi show that the proposed model can perform effective and real-time intrusion detection on IoT devices.
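
    A hedged sketch of the two-stage pipeline described above (1-D CNN feature extraction followed by LightGBM classification); the layer sizes, feature dimension, and data are placeholders, not the paper's configuration.

      import numpy as np
      import torch
      import torch.nn as nn
      import lightgbm as lgb

      class FeatureExtractor(nn.Module):
          """Toy 1-D CNN that maps a flow feature vector to a lower-dimensional embedding."""
          def __init__(self):
              super().__init__()
              self.net = nn.Sequential(
                  nn.Conv1d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
                  nn.AdaptiveAvgPool1d(16), nn.Flatten())     # -> 8 * 16 = 128 features
          def forward(self, x):                               # x: (batch, n_features)
              return self.net(x.unsqueeze(1))

      X = np.random.rand(256, 64).astype("float32")           # placeholder: 256 flows, 64 features
      y = np.random.randint(0, 3, size=256)                   # placeholder: 3 traffic classes
      feats = FeatureExtractor()(torch.from_numpy(X)).detach().numpy()
      clf = lgb.LGBMClassifier(n_estimators=50).fit(feats, y) # LightGBM does the final classification
      print(clf.predict(feats[:5]))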

  • Edge Computing Resource Allocation Algorithm for NB-IoT Based on Deep Reinforcement Learning

    Jiawen CHU  Chunyun PAN  Yafei WANG  Xiang YUN  Xuehua LI  

     
    PAPER-Network

    Publicized: 2022/11/04
    Vol: E106-B No:5
    Page(s): 439-447

    Mobile edge computing (MEC) technology guarantees the privacy and security of large-scale data in the Narrowband-IoT (NB-IoT) by deploying MEC servers near base stations to provide sufficient computing, storage, and data processing capacity to meet the delay and energy consumption requirements of NB-IoT terminal equipment. For the NB-IoT MEC system, this paper proposes a resource allocation algorithm based on deep reinforcement learning to optimize the total cost of task offloading and execution. Since the formulated problem is a mixed-integer non-linear programming (MINLP) problem, we cast it as a multi-agent distributed deep reinforcement learning (DRL) problem and address it using a dueling Q-learning network algorithm. Simulation results show that, compared with the deep Q-learning network and the all-local cost and all-offload cost algorithms, the proposed algorithm can effectively guarantee the success rates of task offloading and execution. In addition, when the execution task volume is 200KBit, the total system cost of the proposed algorithm can be reduced by at least 1.3%, and when the execution task volume is 600KBit, the total cost of system execution tasks can be reduced by 16.7% at most.
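
    A minimal sketch of a dueling Q-network head of the kind referenced above; the state and action dimensions are placeholders, and the offloading-specific state, action, and cost design is not shown.

      import torch
      import torch.nn as nn

      class DuelingQNet(nn.Module):
          """Q(s,a) = V(s) + A(s,a) - mean_a A(s,a): separate value and advantage streams."""
          def __init__(self, state_dim=8, n_actions=4, hidden=64):
              super().__init__()
              self.trunk = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
              self.value = nn.Linear(hidden, 1)           # state value V(s)
              self.adv = nn.Linear(hidden, n_actions)     # advantages A(s,a)
          def forward(self, s):
              h = self.trunk(s)
              a = self.adv(h)
              return self.value(h) + a - a.mean(dim=-1, keepdim=True)

      q = DuelingQNet()
      print(q(torch.zeros(1, 8)))   # Q-values over the offloading actions for a dummy state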

  • How Many Tweets Describe the Topics on TV Programs: An Investigation on the Relation between Twitter and Mass Media

    Jun IIO  

     
    PAPER

    Publicized: 2022/11/11
    Vol: E106-D No:4
    Page(s): 443-449

    As the Internet has become prevalent, the popularity of net media has been growing, to the point that it has overtaken conventional mass media. However, TWtrends, the Twitter trends visualization system operated by our research team since 2019, indicates that many topics from TV programs frequently appear in Twitter trends. This study investigates the relationship between Twitter and TV programs by collecting information on Twitter trends and TV programs simultaneously. Although this study provides only a rough estimation of the volume of tweets that mention TV programs, the results show that tweets mentioning TV programs appear at a constant rate, which tends to increase on weekends. This tendency of TV-related tweets is consistent with the audience rating survey results. The study outcome, together with the fact that many TV programs introduce topics that are popular on social media, implies a codependency between Internet media (social media) and mass media.

  • APVAS: Reducing the Memory Requirement of AS_PATH Validation by Introducing Aggregate Signatures into BGPsec

    Ouyang JUNJIE  Naoto YANAI  Tatsuya TAKEMURA  Masayuki OKADA  Shingo OKAMURA  Jason Paul CRUZ  

     
    PAPER

    Publicized: 2023/01/11
    Vol: E106-A No:3
    Page(s): 170-184

    The BGPsec protocol, an extension of the Border Gateway Protocol (BGP) for Internet routing, uses digital signatures to guarantee the validity of routing information. However, the use of digital signatures for routing information in BGPsec causes a lack of memory in BGP routers, creating a gaping security hole in today's Internet. This problem hinders the practical realization and implementation of BGPsec. In this paper, we present APVAS (AS path validation based on aggregate signatures), a new protocol that reduces the memory consumption of routers running BGPsec when validating paths in routing information. APVAS relies on a novel aggregate signature scheme that compresses individually generated signatures into a single signature. Furthermore, we implement a prototype of APVAS on the BIRD Internet Routing Daemon and demonstrate its efficiency on actual BGP connections. Our results show that the routing tables of routers running BGPsec with APVAS have 20% lower memory consumption than those running conventional BGPsec. We also confirm the effectiveness of APVAS in the real world by using 800,000 routes, which are equivalent to the full route information on a global scale.
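
    The memory saving comes from storing a single aggregate signature per route instead of one signature per AS hop. The back-of-the-envelope sketch below only illustrates that effect; every size and path length in it is an assumption, not a figure from the paper.

      ROUTES = 800_000        # full global routing table, as in the evaluation above
      AVG_PATH_LEN = 4        # assumed average AS_PATH length
      SIG_BYTES = 72          # assumed size of one per-hop BGPsec signature
      AGG_SIG_BYTES = 96      # assumed size of one aggregate signature (e.g., BLS)

      per_hop = ROUTES * AVG_PATH_LEN * SIG_BYTES       # one signature per AS on every path
      aggregated = ROUTES * AGG_SIG_BYTES               # one compressed signature per route
      print(f"per-hop: {per_hop / 2**20:.0f} MiB, aggregated: {aggregated / 2**20:.0f} MiB")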

  • Performance Analysis of Mobile Cellular Networks Accommodating Cellular-IoT Communications with Immediate Release of Radio Resources

    Shuya ABE  Go HASEGAWA  Masayuki MURATA  

     
    PAPER-Network

    Publicized: 2022/06/20
    Vol: E105-B No:12
    Page(s): 1477-1486

    It is now becoming important for mobile cellular networks to accommodate all kinds of Internet of Things (IoT) communications. However, the contention-based random access and radio resource allocation used in traditional cellular networks, which are optimized mainly for human communications, cannot efficiently handle large-scale IoT communications. For this reason, standardization activities such as Cellular-IoT (C-IoT) have emerged to serve IoT devices. However, few studies have evaluated the performance of C-IoT communications with periodic data transmissions, despite this being a common characteristic of many IoT communications. In this paper, we present performance analysis results for mobile cellular networks supporting periodic C-IoT communications, focusing on the performance differences between LTE and Narrowband-IoT (NB-IoT) networks. To achieve this, we first construct an analysis model for the end-to-end performance of both the control plane and the data plane, including random access procedures, radio resource allocation, bearer establishment in the Evolved Packet Core network, and user-data transmissions. In addition, we include the impact of the immediate release of radio resources proposed in 3GPP. Numerical evaluations show that NB-IoT can support up to 8.7 times more IoT devices than LTE, but imposes a significant delay on data transmissions. We also confirm that the immediate release of radio resources increases the network capacity by up to 17.7 times.

  • SDNRCFII: An SDN-Based Reliable Communication Framework for Industrial Internet

    Hequn LI  Die LIU  Jiaxi LU  Hai ZHAO  Jiuqiang XU  

     
    PAPER-Network

    Publicized: 2022/05/26
    Vol: E105-B No:12
    Page(s): 1508-1518

    Industrial networks need to provide reliable communication services, usually in a redundant transmission (RT) manner. In the past few years, several device-redundancy-based, layer 2 solutions have been proposed. However, with the evolution of industrial networks toward the Industrial Internet, these methods can no longer work properly in non-redundant, layer 3 environments. In this paper, an SDN-based reliable communication framework is proposed for the Industrial Internet. It can provide reliable communication guarantees for mission-critical applications while serving non-critical applications in a best-effort manner. Specifically, it first implements an RT-based reliable communication method using the Industrial Internet's link-redundancy feature. Next, it presents a redundant synchronization mechanism to prevent end systems from receiving duplicate data. Finally, to maximize the number of critical flows it accommodates (an NP-hard problem), two ILP-based routing & scheduling algorithms are also put forward: an optimal one (Scheduling with Unconstrained Routing, SUR) and a suboptimal one (Scheduling with Minimum length Routing, SMR). Numerous simulations are conducted to evaluate the framework's effectiveness. The results show that it can provide reliable, duplicate-free services to end systems. Its reliable communication method performs better than the conventional best-effort transmission method in terms of packet delivery success ratio in layer 3 networks. In addition, its scheduling algorithm, SMR, performs well on the experimental topologies (with an average quality of 93% compared to SUR), and the time overhead is acceptable.
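
    As a generic illustration of the duplicate-discarding idea behind redundant transmission (not the paper's synchronization mechanism), a receiver can remember which (flow, sequence number) pairs it has already delivered and silently drop the second copy:

      class DuplicateFilter:
          """Deliver each (flow, sequence number) exactly once, even when every
          packet arrives over two disjoint paths."""
          def __init__(self):
              self.delivered = {}                  # flow id -> set of sequence numbers seen
          def accept(self, flow_id, seq):
              seen = self.delivered.setdefault(flow_id, set())
              if seq in seen:
                  return False                     # redundant copy: discard
              seen.add(seq)
              return True                          # first copy: deliver to the end system

      f = DuplicateFilter()
      print([f.accept("flow1", s) for s in (1, 1, 2, 2)])   # [True, False, True, False]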

  • 4-Cycle-Start-Up Reference-Clock-Less Digital CDR Utilizing TDC-Based Initial Frequency Error Detection with Frequency Tracking Loop Open Access

    Tetsuya IIZUKA  Meikan CHIN  Toru NAKURA  Kunihiro ASADA  

     
    PAPER

    Publicized: 2022/04/11
    Vol: E105-C No:10
    Page(s): 544-551

    This paper proposes a reference-clock-less quick-start-up CDR that resumes from a stand-by state only with a 4-bit preamble utilizing a phase generator with an embedded Time-to-Digital Converter (TDC). The phase generator detects 1-UI time interval by using its internal TDC and works as a self-tunable digitally-controlled delay line. Once the phase generator coarsely tunes the recovered clock period, then the residual time difference is finely tuned by a fine Digital-to-Time Converter (DTC). Since the tuning resolution of the fine DTC is matched by design with the time resolution of the TDC that is used as a phase detector, the fine tuning completes instantaneously. After the initial coarse and fine delay tuning, the feedback loop for frequency tracking is activated in order to improve Consecutive Identical Digits (CID) tolerance of the CDR. By applying the frequency tracking architecture, the proposed CDR achieves more than 100bits of CID tolerance. A prototype implemented in a 65nm bulk CMOS process operates at a 0.9-2.15Gbps continuous rate. It consumes 5.1-8.4mA in its active state and 42μA leakage current in its stand-by state from a 1.0V supply.

  • Sensitivity Enhanced Edge-Cloud Collaborative Trust Evaluation in Social Internet of Things

    Peng YANG  Yu YANG  Puning ZHANG  Dapeng WU  Ruyan WANG  

     
    PAPER-Network Management/Operation

    Publicized: 2022/03/22
    Vol: E105-B No:9
    Page(s): 1053-1062

    The integration of social networking concepts into the Internet of Things has led to the Social Internet of Things (SIoT) paradigm, and trust evaluation is essential for secure interaction in SIoT. In SIoT, when resource-constrained nodes respond to unexpected malicious services and malicious recommendations, the trust assessment is prone to be inaccurate, and the existing architecture carries a risk of privacy leakage. An edge-cloud collaborative trust evaluation architecture for SIoT is proposed in this paper. It utilizes the resource advantages of the cloud and the edge to complete the trust assessment task collaboratively. An algorithm for evaluating the relationship closeness between nodes is designed to assess the reliability of neighbor nodes in SIoT. A trust computing algorithm with enhanced sensitivity is proposed, considering the fluctuation of trust values and the conflict between trust indicators, to enhance the sensitivity of identifying malicious behaviors. Simulation results show that, compared with traditional methods, the proposed trust evaluation method can effectively improve the success rate of interactions and reduce the false detection rate when dealing with malicious services and malicious recommendations.

  • PRIGM: Partial-Regression-Integrated Generic Model for Synthetic Benchmarks Robust to Sensor Characteristics

    Kyungmin KIM  Jiung SONG  Jong Wook KWAK  

     
    LETTER-Data Engineering, Web Information Systems

    Publicized: 2022/04/04
    Vol: E105-D No:7
    Page(s): 1330-1334

    We propose a novel synthetic-benchmark generation model using partial time-series regression, called the Partial-Regression-Integrated Generic Model (PRIGM). PRIGM abstracts the unique characteristics of the input sensor data into generic time-series data, confirming the generation similarity and evaluating the correctness of the synthetic benchmarks. The experimental results obtained with the proposed model and its formula verify that PRIGM preserves the time-series characteristics of empirical data, even for complex time-series data, within an average difference of 10.4% in terms of descriptive statistics accuracy.
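
    A rough sketch of segment-wise ("partial") time-series regression in the spirit described above, assuming the goal is to abstract sensor readings into a compact generic form; the segment length and the linear model are assumptions.

      import numpy as np

      def partial_regression(series, segment_len=32):
          """Fit a linear trend to each fixed-length segment and return the
          (slope, intercept) pairs as a generic abstraction of the series."""
          params = []
          for start in range(0, len(series) - segment_len + 1, segment_len):
              y = series[start:start + segment_len]
              x = np.arange(segment_len)
              slope, intercept = np.polyfit(x, y, deg=1)
              params.append((slope, intercept))
          return params

      print(partial_regression(np.sin(np.linspace(0, 6, 128))))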

  • Deep Coalitional Q-Learning for Dynamic Coalition Formation in Edge Computing

    Shiyao DING  Donghui LIN  

     
    PAPER

    Publicized: 2021/12/14
    Vol: E105-D No:5
    Page(s): 864-872

    With the growth of computation requirements in the Internet of Things, resource-limited edge servers usually need to cooperate to perform tasks. Most related studies assume a static cooperation approach, which might not suit the dynamic environment of edge computing. In this paper, we consider a dynamic cooperation approach that guides edge servers to form coalitions dynamically. This raises two issues: 1) how to guide them to form coalitions optimally and 2) how to cope with the dynamic feature where server statuses change as tasks are performed. The coalitional Markov decision process (CMDP) model proposed in our previous work can handle these issues well. However, its basic solution, coalitional Q-learning, cannot handle the large-scale problems that arise when the number of tasks in edge computing is large. Our response is to propose a novel algorithm called deep coalitional Q-learning (DCQL) to solve them. To sum up, we first formulate the dynamic cooperation problem of edge servers as a CMDP: each edge server is regarded as an agent, and the dynamic process is modeled as an MDP in which the agents observe the current state to form several coalitions. Each coalition takes an action that impacts the environment, which correspondingly transitions to the next state, and the above process repeats. Then, we propose DCQL, which includes a deep neural network and can therefore cope well with large-scale problems. DCQL can guide the edge servers to form coalitions dynamically with the target of optimizing some goal. Furthermore, we run experiments to verify our proposed algorithm's effectiveness in different settings.
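
    A minimal sketch of the deep Q-learning ingredient, with a neural network approximating Q-values over candidate coalition actions; the state encoding, action set, reward, and dimensions are placeholders rather than the paper's CMDP formulation.

      import torch
      import torch.nn as nn

      STATE_DIM, N_ACTIONS, GAMMA = 6, 4, 0.9   # placeholder server-status vector and coalition choices
      qnet = nn.Sequential(nn.Linear(STATE_DIM, 32), nn.ReLU(), nn.Linear(32, N_ACTIONS))
      opt = torch.optim.Adam(qnet.parameters(), lr=1e-3)

      def td_step(s, a, r, s_next):
          """One temporal-difference update toward r + gamma * max_a' Q(s', a')."""
          with torch.no_grad():
              target = r + GAMMA * qnet(s_next).max()
          loss = (qnet(s)[a] - target) ** 2
          opt.zero_grad(); loss.backward(); opt.step()
          return loss.item()

      print(td_step(torch.rand(STATE_DIM), 2, 1.0, torch.rand(STATE_DIM)))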

  • Multi-Agent Reinforcement Learning for Cooperative Task Offloading in Distributed Edge Cloud Computing

    Shiyao DING  Donghui LIN  

     
    PAPER

    Publicized: 2021/12/28
    Vol: E105-D No:5
    Page(s): 936-945

    Distributed edge cloud computing is an important computation infrastructure for the Internet of Things (IoT), and its task offloading problem has attracted much attention recently. Most existing work on task offloading in distributed edge cloud computing assumes that each self-interested user owns one edge server and chooses whether to execute its tasks locally or to offload them to cloud servers. The goal of each edge server is to maximize its own interest, such as a low delay cost, which corresponds to a non-cooperative setting. However, with the strong development of smart IoT communities such as smart hospitals and smart factories, all edge and cloud servers can belong to one organization, such as a technology company. This corresponds to a cooperative setting in which the goal of the organization is to maximize the team interest in the overall edge cloud computing system. In this paper, we consider a new problem called cooperative task offloading, in which all edge servers cooperate to make the entire edge cloud computing system achieve good performance, such as low delay cost and low energy cost. However, this problem is hard to solve due to two issues: 1) each edge server's status changes dynamically and task arrival is uncertain; 2) each edge server can observe only its own status, which makes it hard to optimize the team interest because global information is unavailable. To solve these issues, we formulate the problem as a decentralized partially observable Markov decision process (Dec-POMDP), which can handle the dynamic features under partial observations. Then, we apply a multi-agent reinforcement learning algorithm called value decomposition network (VDN) and propose a VDN-based task offloading algorithm (VDN-TO) to solve the problem. Specifically, the motivation is that we use a team value function to evaluate the team interest, which is then divided into individual value functions for each edge server. Then, each edge server updates its individual value function in the direction that maximizes the team interest. Finally, we use part of a real dataset to evaluate our algorithm, and the results show its effectiveness in comparison with some other existing methods.
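
    A minimal sketch of the value decomposition idea used by VDN: the team Q-value is the sum of per-agent Q-values, each computed from that agent's local observation. Dimensions and networks are placeholders, not the paper's setup.

      import torch
      import torch.nn as nn

      class VDN(nn.Module):
          """Q_team(o_1..o_n, a_1..a_n) = sum_i Q_i(o_i, a_i); each agent sees only its own o_i."""
          def __init__(self, n_agents=3, obs_dim=5, n_actions=4):
              super().__init__()
              self.agents = nn.ModuleList(
                  nn.Sequential(nn.Linear(obs_dim, 32), nn.ReLU(), nn.Linear(32, n_actions))
                  for _ in range(n_agents))
          def forward(self, obs, actions):         # obs: (n_agents, obs_dim), actions: list of ints
              per_agent = [net(o)[a] for net, o, a in zip(self.agents, obs, actions)]
              return torch.stack(per_agent).sum()  # team value that provides the joint training signal

      print(VDN()(torch.rand(3, 5), [0, 2, 1]))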

  • SDM4IIoT: An SDN-Based Multicast Algorithm for Industrial Internet of Things

    Hequn LI  Jiaxi LU  Jinfa WANG  Hai ZHAO  Jiuqiang XU  Xingchi CHEN  

     
    PAPER-Network

    Publicized: 2021/11/11
    Vol: E105-B No:5
    Page(s): 545-556

    Real-time and scalable multicast services are of paramount importance to Industrial Internet of Things (IIoT) applications. To realize these services, the multicast algorithm should, on the one hand, ensure that the maximum delay of a multicast session does not exceed its upper delay bound. On the other hand, the algorithm should minimize session costs. As an emerging networking paradigm, Software-Defined Networking (SDN) can provide a global view of the network to multicast algorithms, thereby bringing new opportunities for realizing the desired multicast services in IIoT environments. Unfortunately, existing SDN-based multicast (SDM) algorithms cannot meet the real-time and scalability requirements simultaneously. Therefore, in this paper, we focus on SDM algorithm design for IIoT environments. To be specific, the paper first converts the multicast tree construction problem for SDM in IIoT environments into a delay-bounded least-cost shared tree problem and proves that it is NP-complete. Then, the paper puts forward a shared tree (ST) algorithm called SDM4IIoT to compute suboptimal solutions to the problem. The algorithm consists of five steps: 1) construct a delay-optimal shared tree; 2) divide the tree into a set of subpaths and a subtree; 3) optimize the cost of each subpath by relaxing the delay constraint; 4) optimize the subtree cost in the same manner; 5) recombine them into a shared tree. Simulation results show that the algorithm can provide real-time support that other ST algorithms cannot. In addition, it achieves good scalability: its cost is only 20.56% higher than that of the cost-optimal ST algorithm. Furthermore, its computation time is also acceptable. The algorithm can help realize real-time and scalable multicast services for IIoT applications.
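
    As a sketch of step 1 only (the delay-optimal shared tree), assuming link delays as edge weights on a toy topology; steps 2-5, which trade delay slack for lower cost, are omitted.

      import networkx as nx

      g = nx.Graph()
      g.add_weighted_edges_from([("s", "a", 1), ("a", "b", 2), ("s", "b", 5), ("b", "c", 1)],
                                weight="delay")    # toy topology with per-link delays

      def delay_optimal_shared_tree(graph, source, members):
          """Union of minimum-delay paths from the source to every group member
          (step 1 above); later steps would relax delay slack to reduce cost."""
          tree = nx.Graph()
          for m in members:
              nx.add_path(tree, nx.dijkstra_path(graph, source, m, weight="delay"))
          return tree

      print(delay_optimal_shared_tree(g, "s", ["b", "c"]).edges())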

  • Performance Evaluation of Classification and Verification with Quadrant IQ Transition Image

    Hiro TAMURA  Kiyoshi YANAGISAWA  Atsushi SHIRANE  Kenichi OKADA  

     
    PAPER-Network Management/Operation

    Publicized: 2021/12/01
    Vol: E105-B No:5
    Page(s): 580-587

    This paper presents a physical-layer wireless device identification method that uses a convolutional neural network (CNN) operating on a quadrant IQ transition image. This work combines classification and detection tasks in one process. The proposed method can identify IoT wireless devices by exploiting their RF fingerprints, a technology for identifying wireless devices from unique variations in their analog signals. We propose a quadrant IQ image technique to reduce the size of the CNN while maintaining accuracy. The CNN utilizes the IQ transition image, which is cut into four parts by image processing. An over-the-air experiment is performed on six Zigbee wireless devices to confirm the validity of the proposed identification method. The measurement results demonstrate that the proposed method can achieve 99% accuracy with a lightweight CNN model with 36,500 weight parameters in serial use and 146,000 in parallel use. Furthermore, the proposed threshold algorithm can verify authenticity using one classifier and achieves 80% accuracy for further secured wireless communication. This work also examines the identification of expanded signals with SNR between 10 and 30dB. As a result, at SNR values above 20dB, the proposals achieve classification and detection accuracies of 87% and 80%, respectively.
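
    A rough sketch of turning an IQ sample stream into a 2-D image and slicing it into the four IQ-plane quadrants, which is the general flavor of the preprocessing described above; the resolution and the exact definition of the transition image are assumptions, and the histogram here is only a crude stand-in for it.

      import numpy as np

      def quadrant_iq_image(iq, bins=64):
          """2-D histogram of the IQ samples, split into four quadrant sub-images."""
          img, _, _ = np.histogram2d(iq.real, iq.imag, bins=bins, range=[[-1, 1], [-1, 1]])
          h = bins // 2
          return [img[:h, :h], img[:h, h:], img[h:, :h], img[h:, h:]]

      iq = 0.8 * np.exp(1j * 2 * np.pi * np.random.rand(1000))   # placeholder constellation samples
      print([int(q.sum()) for q in quadrant_iq_image(iq)])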

  • RF Signal Frequency Identification in a Direct RF Undersampling Multi-Band Real-Time Spectrum Monitor for Wireless IoT Usage

    Tomoyuki FURUICHI  Mizuki MOTOYOSHI  Suguru KAMEDA  Takashi SHIBA  Noriharu SUEMATSU  

     
    PAPER-Software Defined Radio

    Publicized: 2021/10/12
    Vol: E105-B No:4
    Page(s): 461-471

    To reduce the complexity of direct radio frequency (RF) undersampling real-time spectrum monitoring in wireless Internet of Things (IoT) bands (the 920MHz, 2.4GHz, and 5GHz bands), a design method for sampling frequencies is proposed in this paper. The direct RF undersampling receiver architecture enables the use of an ADC whose sampling clock frequency is lower than the received RF signal frequency, but it requires signal processing to identify the RF signal from spectra folded with multiple sampling clock frequencies. The proposed design method allows fewer sampling frequencies to be used than the conventional design method for a continuous frequency range (DC to the 5GHz band). In the wireless IoT bands case, the proposed method requires two fewer sampling frequencies than in the continuous-range case. The design result obtained with the proposed method is verified by measurement.
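
    The identification relies on how an input frequency folds (aliases) under a given sampling clock: observing the same signal with several clocks yields a distinct pattern of folded frequencies per band. The sketch below computes the standard aliasing relation; the clock values are arbitrary examples, not the paper's design.

      def folded_freq(f_rf, fs):
          """Apparent frequency after undersampling at fs (standard aliasing relation)."""
          r = f_rf % fs
          return min(r, fs - r)

      # Three wireless IoT carriers observed with two example sampling clocks.
      for f in (920e6, 2.44e9, 5.2e9):
          print(f / 1e9, "GHz ->",
                [round(folded_freq(f, fs) / 1e6, 1) for fs in (1.7e9, 1.9e9)], "MHz")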

  • Scaling Law of Energy Efficiency in Intelligent Reflecting Surface Enabled Internet of Things Networks

    Juan ZHAO  Wei-Ping ZHU  

     
    LETTER-Communication Theory and Signals

    Publicized: 2021/09/29
    Vol: E105-A No:4
    Page(s): 739-742

    The energy efficiency of intelligent reflecting surface (IRS) enabled Internet of Things (IoT) networks is studied in this letter. The energy efficiency is expressed mathematically in terms of the number of reflecting elements and the spectral efficiency of the network, respectively, and is shown to scale with the logarithm of the number of reflecting elements in the high transmit power regime of the source node. Furthermore, it is revealed that the energy efficiency scales linearly with the spectral efficiency in the high transmit power regime, in contrast to conventional studies on energy and spectral efficiency trade-offs in non-IRS wireless IoT networks. Numerical simulations are carried out to verify the derived results for IRS enabled IoT networks.
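
    A hedged sketch of the kind of reasoning behind such a scaling law, under the common assumption that an IRS with N elements yields a received SNR growing like N²: at high transmit power the spectral efficiency then grows logarithmically in N, and if the total power consumption is dominated by a fixed term the energy efficiency inherits that behavior. Here γ lumps the channel and array gains, P is the transmit power, B the bandwidth, and P_tot the total power consumption; the symbols are generic, not the letter's notation:

      \mathrm{SE} \approx \log_2\!\bigl(\gamma N^{2} P\bigr) = 2\log_2 N + \log_2(\gamma P),
      \qquad
      \mathrm{EE} = \frac{B\,\mathrm{SE}}{P_{\mathrm{tot}}} \;\propto\; \log_2 N .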
